A Zitterbewegung model of the neutron

As part of my ventures into QCD, I quickly developed a Zitterbewegung model of the neutron, as a complement to my first sketch of a deuteron nucleus. The math of orbitals is interesting. Whatever the field, one can model it using a coupling constant between the proportionality coefficient of the force and the charge it acts on. That ties in nicely with my earlier thoughts on the meaning of the fine-structure constant.

My realist interpretation of quantum physics focuses on explanations involving the electromagnetic force only, but the matter-antimatter dichotomy still puzzles me very much. Also, the idea of virtual particles is no longer anathema to me, but I still want to model them as particle-field interactions and the exchange of real (angular or linear) momentum and energy, with a quantization of momentum and energy obeying the Planck-Einstein law.

The proton model will be key. We cannot explain it with the typical ‘mass without mass’ model of zittering charges: we get a 1/4 factor in the explanation of the proton radius, which is impossible to get rid of unless we assume some ‘strong’ force comes into play. That is why I prioritize a ‘straight’ attack on the electron and the proton-electron bond in a primitive neutron model.

The calculation of forces inside a muon-electron system and inside a proton is an interesting exercise: it is the only thing which explains why an electron annihilates a positron while electrons and protons can live together (the ‘anti-matter’ nature of charged particles only shows in the opposite spin directions of their fields – so it is only when the ‘structure’ of matter-antimatter pairs is different that they will not annihilate each other).

[…]

In short, 2021 will be an interesting year for me. The intent of my last two papers (on the deuteron model and the primitive neutron model) was to think of energy values: the energy value of the bond between electron and proton in the neutron, and the energy value of the bond between proton and neutron in a deuteron nucleus. But, yes, the more fundamental work remains to be done!

Cheers – Jean-Louis

The complementarity of wave- and particle-like viewpoints on EM wave propagation

In 1995, W.E. Lamb Jr. wrote the following on the nature of the photon: “There is no such thing as a photon. Only a comedy of errors and historical accidents led to its popularity among physicists and optical scientists. I admit that the word is short and convenient. Its use is also habit forming. Similarly, one might find it convenient to speak of the “aether” or “vacuum” to stand for empty space, even if no such thing existed. There are very good substitute words for “photon”, (e.g., “radiation” or “light”), and for “photonics” (e.g., “optics” or “quantum optics”). Similar objections are possible to use of the word “phonon”, which dates from 1932. Objects like electrons, neutrinos of finite rest mass, or helium atoms can, under suitable conditions, be considered to be particles, since their theories then have viable non-relativistic and non-quantum limits.”[1]

The opinion of a Nobel Prize laureate carries some weight, of course, but we think the concept of a photon makes sense. As the electron moves from one (potential) energy state to another – from one atomic or molecular orbital to another – it builds an oscillating electromagnetic field which has an integrity of its own and, therefore, is not only wave-like but also particle-like.

We, therefore, dedicated the fifth chapter of our re-write of Feynman’s Lectures to a dual analysis of EM radiation (and, yes, this post is just an announcement of the paper so you are supposed to click the link to read it). It is, basically, an overview of a rather particular expression of Maxwell’s equations which Feynman uses to discuss the laws of radiation. I wonder how to – possibly – ‘transform’ or ‘transpose’ this framework so it might apply to deep electron orbitals and – possibly – proton-neutron oscillations.


[1] W.E. Lamb Jr., Anti-photon, Applied Physics B, vol. 60, pp. 77–84 (1995).

Electron propagation in a lattice

It is done! My latest paper on this topic (available on Phil Gibbs’s site, my ResearchGate page or academia.edu) should conclude my work on the QED sector. It is a thorough exploration of the hitherto mysterious concept of the effective mass and all that.

The result I got is actually very nice: my calculation of the order of magnitude of the k·b factor in the formula for the energy band (the conduction band, as you may know it) – the product of the wavenumber k and the lattice spacing b – shows that the usual small-angle approximation of the formula does not make all that much sense. This shows that some ‘realist’ thinking about what is what in these quantum-mechanical models does constrain the options: we cannot just multiply wave numbers by some random multiple of π or 2π. These things have a physical meaning!
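For what it is worth, here is a minimal numerical sketch of that point. The one-dimensional band formula E(k) = E₀ − 2A·cos(kb) is Feynman’s; the values of E₀, A and b below are just illustrative assumptions:

```python
import numpy as np

# Feynman's one-dimensional result for an electron hopping along a lattice:
#   E(k) = E0 - 2*A*cos(k*b)
# The small-angle approximation replaces cos(k*b) by 1 - (k*b)**2/2,
# which is only reasonable when k*b << 1, i.e. near the bottom of the band.

E0 = 0.0     # reference energy (eV) - illustrative assumption
A = 1.0      # hopping amplitude (eV) - illustrative assumption
b = 3e-10    # lattice spacing (m), a few angstrom

k = np.linspace(-np.pi / b, np.pi / b, 7)    # sample the Brillouin zone
E_exact = E0 - 2 * A * np.cos(k * b)
E_approx = E0 - 2 * A + A * (k * b) ** 2     # parabolic (effective-mass) band

for kb, Ee, Ea in zip(k * b, E_exact, E_approx):
    print(f"k*b = {kb:+.2f}   exact = {Ee:+.3f} eV   small-angle = {Ea:+.3f} eV")
```

Near the edges of the Brillouin zone (k·b → ±π) the parabolic approximation is off by several electronvolts on these numbers, which is exactly the point: k·b is not some small number one can approximate away.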

So no multiverses or many worlds, please! One world is enough, and it is nice we can map it to a unique mathematical description.

I should now move on and think about the fun stuff: what is going on in the nucleus and all that? Let’s see where we go from here. Downloads on ResearchGate have been going through the roof lately (a thousand reads on ResearchGate is better than ten thousand on viXra.org, I guess), so it is all very promising. 🙂

Understanding lasers, semiconductors and other technical stuff

I wrote a lot of papers but most of them – if not all – deal with very basic stuff: the meaning of uncertainty (just statistical indeterminacy, because we have no information on the initial conditions of the system), the Planck-Einstein relation (how Planck’s quantum of action models an elementary cycle or an oscillation), and Schrödinger’s wavefunctions (the solutions to his equation) as the equations of motion for a pointlike charge. If anything, I hope I managed to restore a feeling that quantum electrodynamics is not essentially different from classical physics: it just adds the element of quantization – of energy, momentum, magnetic flux, etcetera.

Importantly, we also talked about what photons and electrons actually are, and that electrons are pointlike but not dimensionless: their magnetic moment results from an internal current and, hence, spin is something real – something we can explain in terms of a two-dimensional perpetual current. In the process, we also explained why electrons take up some space: they have a radius (the Compton radius). So that explains the quantization of space, if you want.
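The arithmetic behind that Compton radius – and behind the ring-current magnetic moment – is easy enough to check. Here is a minimal sketch using CODATA values; the ring-current formula μ = q·c·a/2 is, of course, an assumption of our model, not textbook canon:

```python
# Constants (CODATA, SI units)
hbar = 1.054571817e-34    # reduced Planck constant (J*s)
m_e = 9.1093837015e-31    # electron mass (kg)
c = 2.99792458e8          # speed of light (m/s)
q = 1.602176634e-19       # elementary charge (C)

# Compton radius of the electron: a = hbar / (m_e * c)
a = hbar / (m_e * c)
print(f"Compton radius a = {a:.4e} m")       # ~3.86e-13 m

# Ring current model: a pointlike charge circling at (nearly) the speed of
# light on a loop of radius a gives a current I = q*c/(2*pi*a) and a
# magnetic moment mu = I*pi*a^2 = q*c*a/2.
mu = q * c * a / 2
mu_bohr = q * hbar / (2 * m_e)               # Bohr magneton, for comparison
print(f"ring-current moment = {mu:.4e} J/T")
print(f"Bohr magneton       = {mu_bohr:.4e} J/T")
```

The two numbers coincide (about 9.27×10⁻²⁴ J/T), which is the sense in which spin and the magnetic moment are ‘real’ in this picture: they come out of a current and a radius rather than out of a postulate.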

We also talked fields and told you – because matter-particles do have a structure – we should have a dynamic view of the fields surrounding those. Potential barriers – or their corollary: potential wells – should, therefore, not be thought of as static fields. They result from one or more charges moving around and these fields, therefore, vary in time. Hence, a particle breaking through a ‘potential wall’ or coming out of a potential ‘well’ is just using an opening, so to speak, which corresponds to a classical trajectory.

We, therefore, have the guts to say that some of what you will read in a standard textbook is plain nonsense. Richard Feynman, for example, starts his lecture on a current in a crystal lattice by writing this: “You would think that a low-energy electron would have great difficulty passing through a solid crystal. The atoms are packed together with their centers only a few angstroms apart, and the effective diameter of the atom for electron scattering is roughly an angstrom or so. That is, the atoms are large, relative to their spacing, so that you would expect the mean free path between collisions to be of the order of a few angstroms—which is practically nothing. You would expect the electron to bump into one atom or another almost immediately. Nevertheless, it is a ubiquitous phenomenon of nature that if the lattice is perfect, the electrons are able to travel through the crystal smoothly and easily—almost as if they were in a vacuum. This strange fact is what lets metals conduct electricity so easily; it has also permitted the development of many practical devices. It is, for instance, what makes it possible for a transistor to imitate the radio tube. In a radio tube electrons move freely through a vacuum, while in the transistor they move freely through a crystal lattice.” [The italics are mine.]

It is nonsense because it is not the electron that is traveling smoothly, easily or freely: it is the electrical signal, and – no! – that is not to be equated with the quantum-mechanical amplitude. The quantum-mechanical amplitude is just a mathematical concept: it does not travel through the lattice in any physical sense! In fact, it does not even travel through the lattice in a logical sense: the quantum-mechanical amplitudes are to be associated with the atoms in the crystal lattice, and describe their state – i.e. whether or not they have an extra electron or (if we are analyzing electron holes in the lattice) whether they are lacking one. So the drift velocity of the electron is actually very low, and the way the signal moves through the lattice is just like the game of musical chairs – but with the chairs in a line: all players kindly agree to move to the next chair for the new arrival, so the last person on the last chair can leave the game to get a beer. Here it is the same: one extra electron causes all the other electrons to move. [For more detail, we refer to our paper on matter-waves, amplitudes and signals.]
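Just to put a number on that ‘very low’ drift velocity, here is a back-of-the-envelope sketch, assuming a 1 A current through a 1 mm² copper wire (the carrier density for copper is a standard textbook value):

```python
# Drift velocity v_d = I / (n * q * A): current divided by carrier density,
# charge per carrier, and wire cross-section.
I = 1.0          # current (A) - illustrative assumption
A = 1.0e-6       # cross-section (m^2), i.e. 1 mm^2 - illustrative assumption
n = 8.5e28       # free-electron density in copper (electrons per m^3)
q = 1.602e-19    # elementary charge (C)

v_drift = I / (n * q * A)
print(f"drift velocity ~ {v_drift * 1000:.3f} mm/s")   # ~0.07 mm/s
```

So the individual electrons shuffle along at a fraction of a millimeter per second, while the signal – the musical-chairs re-arrangement – propagates at a substantial fraction of the speed of light.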

But so, yes, we have not said much about semiconductors, lasers and other technical stuff. Why not? Not because it would be difficult: we already cracked the more difficult stuff (think of an explanation of the anomalous magnetic moment, the Lamb shift, or one-photon Mach-Zehnder interference here). No. We are just lacking time! It is, effectively, going to be an awful lot of work to rewrite those basic lectures on semiconductors – or on lasers or other technical matters which attract students in physics – so as to show why and how the mechanics of these things actually work: not approximately, but exactly – and, more importantly, to show why and how these phenomena can be explained in terms of something real: actual electrons moving through the lattice at lower or higher drift speeds within a conduction band (and what that conduction band actually is).

The same goes for lasers: we talk about induced emission and all that, but we need to explain what that might actually represent – while avoiding the usual mumbo-jumbo about bosonic behavior and other useless generalizations of the properties of actual matter- and light-particles, which can be reasonably explained in terms of the structure of these particles rather than by invoking quantum-mechanical theorems or other dogmatic or canonical a priori assumptions.

So, yes, it is going to be hard work – and I am not quite sure if I have sufficient time or energy for it. I will try, and so I will probably be offline for quite some time while doing that. Be sure to have fun in the meanwhile! 🙂

Post scriptum: Perhaps I should also focus on converting some of my papers into journal articles, but I don’t feel it is worth going through all the trouble that takes. Academic publishing is a weird thing. Either the editorial line of the journal is very strong, in which case they do not want to publish non-mainstream theory, and they also insist on introductions and other credentials, or else it is very weak or even absent – and then it is nothing more than vanity or ego, right? So I think I am just fine with the viXra collection and the ‘preprint’ papers on ResearchGate for now. They allow me to write what I want and – equally important – how I want to write it. In any case, I am writing for people like you and me, not so much for dogmatic academics or philosophers. The poor experience with the reviewers of my manuscript has taught me well, I guess. I should probably wait for an invitation to publish now.

Quantum Physics: A Survivor’s Guide

A few days ago, I mentioned I felt like writing a new book: a sort of guidebook for amateur physicists like me. I realized that it is actually fairly easy to do. I have three very basic papers – one on particles (both light and matter), one on fields, and one on the quantum-mechanical toolbox (amplitude math and all of that). But then there is a lot of nitty-gritty to be written about the technical stuff, of course: self-interference, superconductors, the behavior of semiconductors (as used in transistors), lasers, and so many other things – and all of the math that comes with it. However, for that, I can refer you to Feynman’s three volumes of lectures, of course. In fact, I should: it is all there. So… Well… That’s it, then. I am done with the QED sector. Here is my summary of it all (links to the papers on Phil Gibbs’ site):

Paper I: Quantum behavior (the abstract should enrage the dark forces)

Paper II: Probability amplitudes (quantum math)

Paper III: The concept of a field (why you should not bother about QFT)

Paper IV: Survivor’s guide to all of the rest (keep smiling)

Paper V: Uncertainty and the geometry of the wavefunction (the final!)

The last paper is interesting because it shows that statistical indeterminism is the only real indeterminism. We can, therefore, use Bell’s Theorem to prove our theory is complete: there is no need for hidden variables, so why should we bother trying to prove or disprove that they can exist?

Jean Louis Van Belle, 21 October 2020

Note: As for the QCD sector, that is a mess. We might have to wait another hundred years or so to see the smoke clear up there. Or, who knows, perhaps some visiting alien(s) will come and give us a decent alternative for the quark hypothesis and quantum field theories. One of my friends thinks so. Perhaps I should trust him more. 🙂

As for Phil Gibbs, I should really thank him for being one of the smartest people on Earth – and for his site, of course. Brilliant forum. Does what Feynman wanted everyone to do: look at the facts, and think for yourself. 🙂

The concept of a field

I ended my post on particles as spacetime oscillations saying I should probably write something about the concept of a field too, and why and how many academic physicists abuse it so often. So I did that, but it became a rather lengthy paper, and so I will refer you to Phil Gibbs’ site, where I post such stuff. Here is the link. Let me know what you think of it.

As for how it fits in with the rest of my writing, I already jokingly rewrote two of Feynman’s introductory Lectures on quantum mechanics (see: Quantum Behavior and Probability Amplitudes). I consider this paper to be the third. 🙂

Post scriptum: Now that I am talking about Richard Feynman – again! – I should add that I really think of him as a weird character. I think he himself got caught up in that image of the ‘Great Teacher’ while, at the same time (and, surely, as a Nobel laureate), he also had to be seen as a ‘Great Guru.’ Read: a Great Promoter of the ‘Grand Mystery of Quantum Mechanics’ – while he probably knew classical electromagnetism combined with the Planck-Einstein relation can explain it all… Indeed, his lecture on superconductivity starts off as an incoherent ensemble of ‘rocket science’ pieces, to then – in the very last paragraphs – manipulate Schrödinger’s equation (and a few others) to show that superconducting currents are just what you would expect in a superconducting fluid. Let me quote him:

“Schrödinger’s equation for the electron pairs in a superconductor gives us the equations of motion of an electrically charged ideal fluid. Superconductivity is the same as the problem of the hydrodynamics of a charged liquid. If you want to solve any problem about superconductors you take these equations for the fluid [or the equivalent pair, Eqs. (21.32) and (21.33)], and combine them with Maxwell’s equations to get the fields.”

So… Well… It looks like he, too, is all about impressing people with ‘rocket science models’ first, and then he simplifies it all to… Well… Something simple. 😊

Having said that, I still like Feynman more than modern science gurus, because the latter usually don’t get to the simplifying part. :-/

A new book?

I don’t know where I would start a new story on physics. I am also not quite sure for whom I would be writing it – although it would be for people like me, obviously: most of what we do, we do for ourselves, right? So I should probably describe myself in order to describe the audience: amateur physicists who are interested in the epistemology of modern physics – or its ontology, or its metaphysics. I also talk about the genealogy or archaeology of ideas on my ResearchGate site. All these words have (slightly) different meanings but the distinctions do not matter all that much. The point is this: I write for people who want to understand physics in pretty much the same way as the great classical physicist Hendrik Antoon Lorentz who, just a few months before his demise, at the occasion of the (in)famous 1927 Solvay Conference, wanted to understand the ‘new theories’:

“We are representing phenomena. We try to form an image of them in our mind. Till now, we have always tried to do so using the ordinary notions of space and time. These notions may be innate; they result, in any case, from our personal experience, from our daily observations. To me, these notions are clear, and I admit I am not able to have any idea about physics without those notions. The image I want to have when thinking about physical phenomena has to be clear and well defined, and it seems to me that this cannot be done without these notions of a system defined in space and in time.”

Note that H.A. Lorentz understood electromagnetism and relativity theory as few others did. In fact, judging from some of the crap out there, I can safely say he understood this stuff better than many do even today. Hence, he should surely not be thought of as a classical physicist who, somehow, was stuck. On the contrary: he understood the ‘new theories’ better than many of the new theorists themselves. In fact, as far as I am concerned, I think his comments or conclusions on the epistemological status of the Uncertainty Principle – which he made in the same intervention – still stand. Let me quote the original French:

“Je pense que cette notion de probabilité [in the new theories] serait à mettre à la fin, et comme conclusion, des considérations théoriques, et non pas comme axiome a priori, quoique je veuille bien admettre que cette indétermination correspond aux possibilités expérimentales. Je pourrais toujours garder ma foi déterministe pour les phénomènes fondamentaux, dont je n’ai pas parlé. Est-ce qu’un esprit plus profond ne pourrait pas se rendre compte des mouvements de ces électrons? Ne pourrait-on pas garder le déterminisme en en faisant l’objet d’une croyance? Faut-il nécessairement ériger l’indéterminisme en principe?”

In English: “I think this notion of probability [in the new theories] should come at the end, as a conclusion of the theoretical considerations, and not as an a priori axiom, though I am quite willing to admit that this indeterminacy corresponds to the experimental possibilities. I could always keep my deterministic faith for the fundamental phenomena, of which I have not spoken. Could a deeper mind not become aware of the motions of these electrons? Could one not keep determinism by making it the object of a belief? Must one necessarily elevate indeterminism to a principle?”

What a beautiful statement, isn’t it? Why should we elevate indeterminism to a philosophical principle? Indeed, now that I’ve inserted some French, I may as well inject some German. The idea of a particle includes the idea of a more or less well-known position. Let us be specific and think of uncertainty in the context of position. We may not fully know the position of a particle for one or more of the following reasons:

  1. The precision of our measurements may be limited: this is what Heisenberg referred to as an Ungenauigkeit.
  2. Our measurement might disturb the position and, as such, cause the information to get lost and, as a result, introduce an uncertainty: this is what we may translate as an Unbestimmtheit.
  3. The uncertainty may be inherent to Nature, in which case we should probably refer to it as an Ungewissheit.

So what is the case? Lorentz claims it is either the first or the second – or a combination of both – and that the third proposition is a philosophical statement which we can neither prove nor disprove. I cannot see anything logical (theory) or practical (experiment) that would invalidate this point. I, therefore, intend to write a basic book on quantum physics from what I hope would be Lorentz’ or Einstein’s point of view.

My detractors will immediately cry wolf: Einstein lost the discussions with Bohr, didn’t he? I do not think so: he just got tired of them. I want to try to pick up the story where he left it. Let’s see where I get. 🙂

Bell’s No-Go Theorem

I’ve been asked a couple of times: “What about Bell’s No-Go Theorem, which tells us there are no hidden variables that can explain quantum-mechanical interference in some kind of classical way?” My answer to that question is quite arrogant, because it is the answer Albert Einstein would give when younger physicists pointed out that his objections to quantum mechanics (which he usually expressed as some new thought experiment) violated this or that axiom or theorem of quantum mechanics: “Das ist mir wur(sch)t.”

In English: I don’t care. Einstein never lost the discussions with Heisenberg or Bohr: he just got tired of them. Like Einstein, I don’t care either – because Bell’s Theorem is what it is: a mathematical theorem. Hence, it respects the GIGO principle: garbage in, garbage out. In fact, John Stewart Bell himself – one of the third-generation physicists, we may say – had always hoped that some “radical conceptual renewal”[1] might disprove his conclusions. We should also remember Bell kept exploring alternative theories – including Bohm’s pilot wave theory, which is a hidden variables theory – until his death at a relatively young age. [J.S. Bell died from a cerebral hemorrhage in 1990 – the year he was nominated for the Nobel Prize in Physics. He was just 62 years old then.]

So I never really explored Bell’s Theorem. I was, therefore, very happy to get an email from Gerard van der Ham, who seems to have the necessary courage and perseverance to research this question in much more depth and, yes, relate it to a (local) realist interpretation of quantum mechanics. I actually still need to study his papers, and analyze the YouTube video he made (which looks much more professional than my videos), but this is promising.

To be frank, I got tired of all of these discussions – just like Einstein, I guess. The difference between realist interpretations of quantum mechanics and the Copenhagen dogmas is just a factor of 2 or π in the formulas, and Richard Feynman famously said we should not care about such factors (Feynman’s Lectures, III-2-4). Modern physicists fudge them away consistently. They have done much worse than that, actually. :-/ They are not interested in truth. Convention, dogma, indoctrination – non-scientific historical stuff – seems to prevent them from pursuing it. And modern science gurus – the likes of Sean Carroll or Sabine Hossenfelder – play the age-old game of being interesting: they pretend to know something you do not know or – if they don’t – that they are close to getting the answers. They are not: the answers are there already. They just don’t want to tell you that because, yes, it is the end of physics.


[1] See: John Stewart Bell, Speakable and unspeakable in quantum mechanics, pp. 169–172, Cambridge University Press, 1987.

The End of Physics

There is an army of physicists out there – still – trying to convince you there is still some mystery that needs explaining. They are wrong: quantum-mechanical weirdness is weird, but it is not some mystery. We have a decent interpretation of what quantum-mechanical equations – such as Schrödinger’s equation, for example – actually mean. We can also understand what photons, electrons, or protons – light and matter – actually are, and such understanding can be expressed in terms of 3D space, time, force, and charge: elementary concepts that feel familiar to us. There is no mystery left.

Unfortunately, physicists have completely lost it: they have multiplied concepts and produced a confusing but utterly unconvincing picture of the essence of the Universe. They promoted weird mathematical concepts – the quark hypothesis is just one example among others – and gave them some kind of reality status. The Nobel Prize Committee then played the role of the Vatican by canonizing the newfound religion.

It is a sad state of affairs, because we are surrounded by too many lies already: the ads and political slogans that shout in our face as soon as we log on to Facebook to see what our friends are up to, or to YouTube to watch something or – what I often do – listen to the healing sounds of music.

The language and vocabulary of physics are complete. Does it make us happier beings? It should, shouldn’t it? I am happy I understand. I find consciousness fascinating – self-consciousness even more – but not because I think it is rooted in mystery. No. Consciousness arises from the self-organization of matter: order arising from chaos. It is a most remarkable thing – and it happens at all levels: atoms in molecules, molecules forming cellular systems, cellular systems forming biological systems. We are a biological system which, in turn, is part of much larger systems: biological, ecological – material systems. There is no God talking to us. We are on our own, and we must make the best out of it. We have everything, and we know everything.

Sadly, most people do not realize this.

Post scriptum: With the end of physics comes the end of technology as well, doesn’t it? All of the advanced technologies in use today are effectively already described in Feynman’s Lectures on Physics, which were written and published in the first half of the 1960s.

I thought about possible counterexamples, like optical-fiber cables, or the equipment that is used in superconducting quantum computing, such as Josephson junctions. But Feynman already describes Josephson junctions in the last chapter of his Lectures on Quantum Mechanics, which is a seminar on superconductivity. And fiber-optic cable is, essentially, a waveguide for light, which Feynman describes in great detail in Chapter 24 of his Lectures on Electromagnetism and Matter. Needless to say, computers were also already there, and Feynman’s lecture on semiconductors has all you need to know about modern-day computing equipment. [In case you briefly thought about lasers: the first laser was built in 1960, and Feynman’s lecture on masers describes lasers too.]

So it is all there. I was born in 1969, when Man first walked on the Moon. CERN and other spectacular research projects have since been established, but, when one is brutally honest, one has to admit these experiments have not added anything significant – neither to the knowledge base nor to the technology base of humankind (and, yes, I know your first instinct is to disagree with that, but that is because your studies or the media have indoctrinated you that way). It is a rather strange thought, but I think it is essentially correct. Most scientists, experts and commentators are trying to uphold a totally fake illusion of progress.

Explaining the proton mass and radius

Our alternative realist interpretation of quantum physics is pretty complete, but one thing that has been puzzling us is the mass density of a proton: why is it so massive compared to an electron? We simplified things by adding a factor in the Planck-Einstein relation. To be precise, we wrote it as E = 4·h·f. This allowed us to derive the proton radius from the ring current model:

rₚ = 4·ħ/(mₚ·c) ≈ 0.84 fm

This felt a bit artificial. Writing the Planck-Einstein relation using an integer multiple of h or ħ (E = n·h·f = n·ħ·ω) is not uncommon. You should have encountered this relation when studying the black-body problem, for example, and it is also commonly used in the context of Bohr orbitals of electrons. But why is n equal to 4 here? Why not 2, or 3, or 5, or some other integer? We do not know: all we know is that the proton is very different. A proton is, effectively, not the antimatter counterpart of an electron—a positron. While the proton is much smaller – 459 times smaller, to be precise – its mass is 1,836 times that of the electron. Note that we have the same 1/4 factor here because the mass and Compton radius are inversely proportional:

rₑ/rₚ = [ħ/(mₑ·c)] / [4·ħ/(mₚ·c)] = mₚ/(4·mₑ) ≈ 1,836/4 ≈ 459
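A quick numerical check of these two relations (a minimal sketch using CODATA values – the factor 4 is, of course, our own assumption):

```python
# Check r_p = 4*hbar/(m_p*c) against the measured proton charge radius,
# and the radius ratio r_e/r_p = m_p/(4*m_e).
hbar = 1.054571817e-34     # reduced Planck constant (J*s)
c = 2.99792458e8           # speed of light (m/s)
m_e = 9.1093837015e-31     # electron mass (kg)
m_p = 1.67262192369e-27    # proton mass (kg)

r_e = hbar / (m_e * c)         # electron Compton radius
r_p = 4 * hbar / (m_p * c)     # proton radius in the E = 4*h*f model

print(f"r_p = {r_p * 1e15:.3f} fm   (measured charge radius ~ 0.84 fm)")
print(f"r_e / r_p = {r_e / r_p:.0f}   (m_p / (4*m_e) = {m_p / (4 * m_e):.0f})")
```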

This doesn’t look all that bad, but it feels artificial. In addition, our reasoning involved an unexplained difference – a mysterious but exact SQRT(2) factor, to be precise – between the theoretical and the experimentally measured magnetic moment of a proton. In short, we assumed some form factor must explain both the extraordinary mass density as well as this SQRT(2) factor, but we were not quite able to pin it down exactly. A remark on a video on our YouTube channel inspired us to think some more – thank you for that, Andy! – and we think we may have the answer now.

We now think the mass – or energy – of a proton combines two oscillations: one is the Zitterbewegung oscillation of the pointlike charge (which is a circular oscillation in a plane), while the other is the oscillation of the plane itself. The illustration below is a bit horrendous (I am not so good at drawings) but might help you to get the point. The plane of the Zitterbewegung (the plane of the proton ring current, in other words) may itself oscillate between +90 and −90 degrees. If so, the effective magnetic moment will differ from the theoretical magnetic moment we calculated, and it will differ by that SQRT(2) factor.

[Illustration: the plane of the proton ring current oscillating between +90 and −90 degrees]
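One possible way to see where a SQRT(2) could come from – this is just our reading of the equipartition argument, under the added assumption that the tilt angle θ of the plane is effectively uniformly distributed between −90 and +90 degrees – is to look at the time-averaged projection of the magnetic moment onto the reference axis:

$$\mu_{\text{eff}} \;=\; \sqrt{\big\langle \mu^2 \cos^2\theta \big\rangle} \;=\; \frac{\mu}{\sqrt{2}}, \qquad \text{since } \langle \cos^2\theta \rangle = \tfrac{1}{2} \text{ for } \theta \text{ uniform over } [-90°, +90°].$$

Whether this is exactly the right way to get at the factor remains to be worked out in the rewrite of the paper.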

Hence, we should rewrite our paper, but the logic remains the same: we just have a much better explanation now of why we should apply the energy equipartition theorem.

Mystery solved! 🙂

Post scriptum (9 August 2020): The solution is not as simple as you may imagine. When combining the idea of some other motion with the ring current, we must remember that the speed of light – the presumed tangential speed of our pointlike charge – cannot change. Hence, the radius must become smaller. We also need to think about distinguishing two different frequencies, and things quickly become quite complicated.