Discreteness

Demystifying quantum mechanics V

Perhaps the most iconic “mystery” of quantum mechanics is the particle-wave duality. It comes down to the fact that the interference effects one can observe imply that quantum entities behave like waves, while at the same time these entities are observed as discrete lumps, which are interpreted as particles. Previously, I explained that one can relax the idea of localized lumps a bit and require only the interactions involved in observations to be localized. So instead of particles, we can think of these entities as partites, which share all the properties of particles except that they are not localized lumps. They can therefore behave like waves and give rise to all the wave phenomena that are observed. In this way, the mystery of the particle-wave duality is removed.

Now, it is important to understand that, just like particles, partites are discrete entities. The discreteness of these entities is an important aspect that plays a significant role in the phenomena that we observe in quantum physics. Richard Feynman even considered the idea that “all things are made of atoms” to be the single most important bit of scientific knowledge that we have.

Model of the atom

How then does it happen that some physicists would claim that quantum mechanics is not about discreteness? In her blog post, Hossenfelder makes a number of statements that contradict much of our understanding of fundamental physics. For instance, she claims that “quantizing a theory does not mean you make it discrete.”

Let’s just clarify. What does it mean to quantize a theory? It depends on whether we are talking about quantum mechanics or quantum field theory. In quantum mechanics, the process of quantizing a theory implies that we replace observable quantities with operators for these quantities. These operators do not always commute with each other, which leads to the Heisenberg uncertainty relation. So the discreteness is not immediately apparent. In quantum field theory, on the other hand, the quantization process implies that fields are replaced by field operators. These field operators are expressed in terms of so-called ladder operators: creation and annihilation operators. What a ladder operator does is change the excitation of a field in discrete lumps. Therefore, discreteness is clearly apparent in quantum field theory.
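To make the ladder-operator picture concrete, here is a minimal numerical sketch (my addition, not from the post) using finite matrices for the creation and annihilation operators in a truncated Fock space. The truncation size `N` is an arbitrary choice for illustration.

```python
import numpy as np

# Truncated Fock-space matrices for the annihilation operator a and the
# creation operator a-dagger, illustrating how ladder operators change a
# field's excitation in discrete steps.
N = 5  # truncation of the infinite-dimensional Fock space (illustrative)
a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation: a|n> = sqrt(n)|n-1>
adag = a.conj().T                            # creation: a†|n> = sqrt(n+1)|n+1>

# The number operator a†a has integer eigenvalues 0, 1, 2, ...:
# this is the discreteness of the field excitations.
n_op = adag @ a
print(np.diag(n_op).real)     # [0. 1. 2. 3. 4.]

# Acting with a† on the vacuum |0> adds exactly one quantum, giving |1>.
vac = np.zeros(N)
vac[0] = 1.0
one = adag @ vac
print(one)                    # [0. 1. 0. 0. 0.]
```

The integer spectrum of the number operator is the precise sense in which quantizing a field introduces discrete lumps of excitation.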

What Hossenfelder says is that the Heisenberg uncertainty relationship is the key foundation of quantum mechanics. In one of her comments, she states: “The uncertainty principle is a quantum phenomenon. It is not a property of classical waves. If there’s no hbar in it, it’s not the uncertainty principle. People get confused by the fact that waves obey a property that looks similar to the uncertainty principle, but in this case it’s for the position and wave-number, not momentum. That’s not a quantum phenomenon. That’s just a mathematical identity.”

It seems that she forgot about Louis de Broglie’s equation, which relates the wave number to the momentum. In a previous post, I explained that the Heisenberg uncertainty relationship is an inevitable consequence of the Planck and de Broglie equations, which relate the conjugate variables of phase space to Fourier variables. It has nothing to do with classical physics. It is founded in the underlying mathematics of Fourier analysis. Let’s not allow ourselves to be misled by people who are more interested in sensationalism than in knowledge and understanding.
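The Fourier origin of the uncertainty relation can be checked numerically. The sketch below (my addition; the grid and widths are arbitrary choices) computes the spreads in position x and wave number k for Gaussian wave packets and finds the product pinned at 1/2. Multiplying k by ħ via de Broglie’s p = ħk turns this directly into Δx·Δp ≥ ħ/2.

```python
import numpy as np

# Numerically verify that the product of the position spread and the
# wave-number spread of a wave packet is 1/2 for a Gaussian: a pure
# property of Fourier analysis, before any physics enters.
x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]

def spreads(psi):
    """Return (delta_x, delta_k) for a wave function sampled on the grid x."""
    p = np.abs(psi)**2
    p /= p.sum() * dx                      # normalize |psi(x)|^2
    var_x = np.sum(p * x**2) * dx - (np.sum(p * x) * dx)**2
    phi = np.fft.fft(psi)                  # Fourier transform to k-space
    k = np.fft.fftfreq(x.size, d=dx) * 2 * np.pi
    q = np.abs(phi)**2
    q /= q.sum()                           # relative spectral weights
    var_k = np.sum(q * k**2) - np.sum(q * k)**2
    return np.sqrt(var_x), np.sqrt(var_k)

for sigma in (0.5, 1.0, 2.0):
    sx, sk = spreads(np.exp(-x**2 / (4 * sigma**2)))
    print(f"sigma = {sigma}: dx * dk = {sx * sk:.4f}")   # approx 0.5 each time
```

Whatever the width of the packet, narrowing it in x broadens it in k so that the product never drops below the Gaussian value.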

The discreteness of partites allows the creation of superpositions of arbitrary combinations of such partites. The consequences include quantum interference, as observed for instance in the Hong-Ou-Mandel effect. It can also lead to quantum entanglement, which is an important property used in quantum information systems. The discreteness in quantum physics therefore allows it to go beyond what one can find in classical physics.
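The Hong-Ou-Mandel effect is a nice place to see this discreteness at work. A minimal amplitude-level sketch (my addition, using the standard symmetric beam-splitter convention) shows why two indistinguishable photons entering a 50:50 beam splitter, one in each port, never produce a coincidence at the two outputs:

```python
import numpy as np

# Two-photon interference at a lossless 50:50 beam splitter.
# A coincidence (one photon in each output) can happen two ways:
# both photons transmitted, or both reflected. For indistinguishable
# photons these two amplitudes are added before squaring, and they cancel.
t = 1 / np.sqrt(2)     # transmission amplitude
r = 1j / np.sqrt(2)    # reflection amplitude (the i phase keeps the BS unitary)

amp_coincidence = t * t + r * r       # both transmitted + both reflected
p_coincidence = abs(amp_coincidence)**2
print(p_coincidence)                  # 0.0: the photons always exit together
```

The cancellation only happens because photon number comes in discrete units that can be superposed; a pair of classical pulses would show no such suppression.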


Partiteness

Demystifying quantum mechanics IV

Yes I know, it is not a word, at least not yet. We tend to do that in physics sometimes: when one wants to introduce a new concept, one needs to give it a name, and often that name is a word that does not yet exist.

What does it mean? The word “partiteness” indicates the property of nature that it can be represented in terms of parties or partites. It is the intrinsic capability of a system to incorporate an arbitrary number of partites. In my previous post, I mentioned partites as a replacement for the notion of particles. The idea of partites is not new. People often consider quantum systems consisting of multiple partites.

What are these partites then? They represent an abstraction of the concept of a particle. Usually the concept is used rather vaguely, since it is not intended to carry more significance than what is necessary to describe the quantum system. I don’t think anybody has ever considered it to be a defining property that nature possesses at the fundamental level. However, I feel that we may need to consider the idea of partiteness more seriously.

Classical optics diffraction pattern

Let’s see if we can make the concept of a partite a little more precise. It is, after all, the key property that allows nature to transcend its classical nature. A partite is an abstraction of the concept of a particle, retaining only those aspects of particles that we can confirm experimentally. Essentially, partites can carry a full complement of all the degrees of freedom associated with a certain type of particle. But, unlike particles, they are not dimensionless points traveling on world lines. In that sense, they are not localized. Usually, one can think of a single partite in the same way one would think of a single particle such as a photon, provided one does not think of it as a single point moving around in space. A single photon can have a wave function described by any complex function that satisfies the equations of motion. (See for instance the diffraction pattern in the figure above.) The same is true for a partite. As a result, a single partite behaves in the same way as a classical field. So we can turn it around and say that a classical field represents just one partite.
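A diffraction pattern like the one in the figure illustrates the point: the far-field wave function of a single photon behind a slit is the Fourier transform of the aperture, exactly the classical field pattern. A short sketch (my addition; the wavelength and slit width are invented illustrative values):

```python
import numpy as np

# Fraunhofer diffraction from a single slit: one partite (one photon)
# carries the same sinc^2 intensity pattern as the classical field.
wavelength = 633e-9    # illustrative wavelength [m]
a = 50e-6              # illustrative slit width [m]
theta = np.linspace(-0.05, 0.05, 1001)   # observation angles [rad]

beta = np.pi * a * np.sin(theta) / wavelength
intensity = np.sinc(beta / np.pi)**2     # np.sinc(x) = sin(pi*x)/(pi*x)
print(intensity[500])                    # 1.0 at the central maximum
```

The same curve describes both the classical irradiance and the single-photon detection probability, which is the sense in which one partite and one classical field coincide.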

The situation becomes more complicated with multiple partites. The wave function for such a system can become rather complex. It allows the possibility for quantum entanglement. We’ll postpone a better discussion of quantum entanglement for another time.

Multiple photons can behave in a coherent fashion so that they all essentially share the same state in terms of the degrees of freedom. All these photons can then be viewed collectively as just one partite. This situation is what a coherent classical optical field would represent. Once again we see that such a classical field behaves as just one partite.

The important difference between a particle and a partite is that the latter is not localized in the way a particle is localized. A partite is delocalized in a way that is described by its wave function. This wave function describes all the properties of the partite in terms of all the degrees of freedom associated with it, including the spatiotemporal degrees of freedom and the internal degrees of freedom such as spin.

The wave function must satisfy all the constraints imposed by the dynamics associated with the type of field. These include interactions, either with itself (such as gluons in quantum chromodynamics) or with other types of fields (such as photons with charged particles).

All observations involve interactions of the field with whatever device is used for the observation. The notion of particles comes from the fact that these observations tend to be localized. However, on careful consideration, such a localization of an observation only tells us that the interactions are localized, not that the observed field must consist of localized particles. So, we will relax the idea that fields must consist of localized particles and only say that, for some reason we perhaps don’t understand yet, the interactions among fields are localized. That leaves us free to consider the field as consisting of nonlocal partites (thus avoiding all sorts of conceptual pitfalls such as the particle-wave duality).

Hopefully I have succeeded in conveying the idea I have in mind of the concept of a partite. If not, please let me know. I would love to discuss it.


What particle?

Demystifying quantum mechanics III

The notion of a particle has played an important role in our understanding of fundamental physics. It also lies at the core of understanding quantum mechanics. However, there are some issues with the notion of a particle that can complicate things. Before addressing the role that particles play in the understanding of quantum mechanics, we first need to look at these issues.

Particle trajectories detected in a high energy experiment

So what is this issue about particles? The problem is that we don’t really know whether there really are particles. What?!!! Perhaps you may think that what I’m referring to has something to do with the wave-particle duality. No, this issue about the actual existence of particles goes a little deeper than that.

It may seem like a nonsense issue when one considers all the experimental observations of particles. The problem is that, while the idea of a particle provides a convenient explanation for what we see in those experiments, none of them actually confirms that what we see must be particles. Even when one obtains a trajectory, as in a cloud chamber or in the more sophisticated particle detectors used in high-energy experiments such as the Large Hadron Collider, that trajectory can be explained as a sequence of localized observations, each of which projects the state onto a localized pointer state, thus forcing the state to remain localized through a kind of Zeno effect. If all this sounds a little too esoteric, don’t worry. The only point I’m trying to make is that the case for the existence of actual particles is far from closed.

Just so we are on the same page, let’s first agree on what we mean when we talk about a particle. I think it was Eugene Wigner who defined a particle as a dimensionless point traveling on a world line. Such a particle would explain those observed trajectories, provided one allows for a limited resolution in the observation. However, this definition runs into problems with quantum mechanics.

Consider for example Young’s double-slit experiment. Here the notion of a particle on a world line runs into a problem, because somehow the particle needs to pass through both slits to produce the interference pattern that is observed. This leads to the particle-wave duality. To solve this problem, one can introduce the idea of a superposition of trajectories. By itself this idea does not solve the problem, because these trajectories must produce an interference pattern. So one adds the notion (thanks to Richard Feynman) of a little clock that accompanies each trajectory, representing the evolution of the phase along that trajectory. When the particle arrives at the screen along these different trajectories, the superposition, together with the different phase values, determines the interference at that point.
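Feynman’s little clocks are easy to simulate. The sketch below (my addition; the wavelength, slit separation, and screen distance are invented illustrative numbers) assigns each of the two trajectories a unit amplitude with a phase proportional to its path length, then adds the clocks before squaring:

```python
import numpy as np

# Two-slit interference from summing one "clock" per trajectory.
wavelength = 500e-9    # illustrative wavelength [m]
d = 20e-6              # illustrative slit separation [m]
L = 1.0                # slit-to-screen distance [m]
y = np.linspace(-0.1, 0.1, 2001)   # position on the screen [m]

# Path lengths from each slit to the screen point y
r1 = np.hypot(L, y - d / 2)
r2 = np.hypot(L, y + d / 2)

# Each trajectory contributes exp(i * 2*pi * path/wavelength);
# the intensity is the squared magnitude of the summed amplitudes.
amp = np.exp(2j * np.pi * r1 / wavelength) + np.exp(2j * np.pi * r2 / wavelength)
intensity = np.abs(amp)**2
print(intensity[1000])   # 4.0: fully constructive at the center of the screen
```

Where the two clocks point the same way the intensity is four times that of a single slit; where they point opposite ways it vanishes, reproducing the observed fringes.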

Although the construction thus obtained can explain what is being seen, it remains a hypothesis. We run into the frustrating situation that nature does not allow us any means to determine whether this picture is correct. Every observation we make just gives us the same localized interaction, and there is no way to probe deeper to see what happens beyond that localized interaction.

So, we arrive at the situation where our scientific knowledge of the micro-world will always remain incomplete. We can build strange convoluted constructs to provide potential explanations, but we can never establish their veracity.

This situation may seem like a very depressing conclusion, but if we can accept that there are things we can never know, then we may develop a different approach to our understanding. It helps to realize that our ignorance exactly coincides with the irrelevance of the issue. In other words, that which we cannot know is precisely that which would never be useful. This conclusion follows from the fact that, if it could have been useful, we would have had the means to study it and uncover a true understanding of it.

So, let’s introduce a more pragmatic approach to our understanding of the micro-world. Instead of trying to describe the exact nature of the physical entities (such as particles) that we encounter, let’s rather focus on the properties of these entities that produce the phenomena we can observe. Instead of particles, we focus on the properties that make things look like particles. This brings us to the notion of a party or a partite.

But now the discussion is becoming too long. More about that next time.


Particle physics impasse

Physics is the study of the physical universe. As a science, it involves a process consisting of two components. The theoretical component strives to construct theoretical models for the physical phenomena that we observe. The experimental component tests these theoretical models and explores the physical world for more information about phenomena.

Progress in physics is enhanced when many physicists using different approaches tackle the same problem. The diversity in the nature of problems needs to be confronted by a diversity of perspectives. This diversity is reflected in the literature. The same physical phenomenon is often studied through different approaches, using different mathematical formulations. Some of them may turn out to produce the same results, but some may differ in their predictions. The experimental work can then be used to make a selection among them.

That is all fine and dandy for physics in general, but the situation is a bit more complicated for particle physics. Perhaps, one can see the reason for all these complications as the fact that particle physics is running out of observable energy space.

What do I mean by that? Progress in particle physics is (to some extent at least) indicated by understanding the fundamental mechanisms of nature at progressively higher energy scales. Today, we understand these fundamental mechanisms to a fairly good degree up to the electroweak scale (at about 200 GeV). This understanding is encapsulated in the Standard Model, which was established during the 1970s. So, for the past four decades, particle physicists have tried to extend this understanding beyond that scale. Various theoretical ideas were proposed, prominent among them the idea of supersymmetry. Then a big experiment, the Large Hadron Collider (LHC), was constructed to test these ideas above the electroweak scale. It discovered the Higgs boson, the last outstanding particle predicted by the Standard Model. But no supersymmetry. In fact, none of the other ideas panned out at all. So there is a serious back-to-the-drawing-board situation going on in particle physics.

The problem is that the LHC did not discover anything else that could give a hint of what is going on up there. Or did it? There will be another run to accumulate more data, and the data still needs to be analyzed. Perhaps something can still emerge. Who knows? However, even if some new particle is lurking in the data, it becomes difficult to see. Such particles tend to be more unstable at those higher energies, leading to very broad peaks. To make things worse, there is much more background noise. This makes it difficult, even unlikely, for such particles to be identified at these higher energies. At some point, no experiment would be able to observe such particles anymore.
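The broad-peak problem can be illustrated with a toy resonance line shape. The sketch below (my addition; the masses, widths, and energy range are invented numbers, and a simple non-relativistic Breit-Wigner stands in for the full parameterization) compares a narrow, long-lived resonance with a broad, short-lived one:

```python
import numpy as np

# A resonance shows up in a cross-section as a Breit-Wigner peak of
# width Gamma. The shorter the particle's lifetime, the larger Gamma,
# and the less the peak stands out above its own tails and the background.
def breit_wigner(E, M, Gamma):
    """Non-relativistic Breit-Wigner line shape, normalized to 1 at E = M."""
    return (Gamma / 2)**2 / ((E - M)**2 + (Gamma / 2)**2)

E = np.linspace(500.0, 1500.0, 1001)    # illustrative energy scan [GeV]
narrow = breit_wigner(E, 1000.0, 10.0)  # long-lived state: sharp spike
broad = breit_wigner(E, 1000.0, 400.0)  # short-lived state: washed-out bump

# Peak-to-tail contrast at the edge of the scan window:
print(narrow.max() / narrow[0])   # thousands: easy to spot
print(broad.max() / broad[0])     # single digits: hard to separate from background
```

A peak that rises only a few times above its own tails is easily swallowed by background fluctuations, which is the practical obstacle the paragraph above describes.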

The interesting thing about the situation is the backlash one reads about in the media. Particle physicists are arguing among themselves about the reason for the current situation and what the way forward should be. There are those who say that the proposed models were all a bunch of harebrained ideas that were then hyped, and that we should not build any new colliders until we have done some proper theoretical work first.

See, the problem with building new colliders is the cost involved. It is not like other fields of physics, where a local funding organization can support several experimental groups. These colliders require several countries to pitch in to cover the cost. (OK, particle physics is not the only field with such big-ticket experiments.)

The combined effect of the unlikelihood of observing new particles at higher energies and the cost involved in building new colliders at higher energies creates an impasse in particle physics. Although theorists may come up with marvelous new theories for the mechanisms above the electroweak scale, it may be impossible to see whether these theories are correct. Perhaps the electroweak scale will turn out to be the last energy scale below which we are able to understand the fundamental mechanisms in a scientific manner.

Glad I did not stay in particle physics.

Neutrino dust

It is the current understanding that the universe came into being in a hot big bang event. All matter initially existed as a very hot “soup” (or plasma) of charged particles – protons and electrons. Neutral atoms (mostly hydrogen) only appeared after the soup cooled off a bit. At that point, the light produced by the thermal radiation of the hot matter had a chance to escape without being directly re-absorbed.

Much of that light is still around today. We call it the microwave background radiation, because today that light has become microwave radiation as a result of being extremely red-shifted toward low frequencies. This extreme redshift is caused by the expansion of the universe since the origin of the microwave background radiation.
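A quick worked example (my addition, using round approximate numbers) shows how the stretching works: wavelengths grow by the factor (1 + z), so the blackbody temperature of the relic radiation falls by the same factor.

```python
# The expansion stretches wavelengths by (1 + z), so the blackbody
# temperature of the background radiation drops by the same factor.
z = 1100                 # approximate redshift of the light's release
T_emission = 3000.0      # approximate temperature of the plasma then [K]

T_today = T_emission / (1 + z)
print(f"{T_today:.2f} K")   # about 2.7 K, i.e. microwave wavelengths
```

A 3000 K glow at visible and infrared wavelengths, stretched a thousandfold, lands squarely in the microwave band, which is why we observe it as a microwave background today.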

It is reasonable to assume that the very energetic conditions that existed during the big bang would have caused some of the hydrogen nuclei (protons) to combine in a fusion process to form helium nuclei. In the process, some of the protons were converted to neutrons. The weak interaction mediates this conversion, and it produces a neutrino, the lightest matter particle (fermion) that we know of.

So what happened to all these neutrinos? They were emitted at the same time as, or even before, the light that became the microwave background radiation. Since neutrinos are so light, their velocities were close to the speed of light. While the expansion of the universe causes light to be red-shifted, it causes the neutrinos, which have a small mass, to slow down. (Light never slows down; it always propagates at the speed of light.) Eventually these neutrinos become so slow that they are effectively stationary with respect to their local region of space. At that point they become dust, drifting along aimlessly in space.

Still, since they do have mass, the neutrinos will be attracted by massive objects such as galaxies. So, the moment their velocities fall below the escape velocity of a nearby galaxy, they become gravitationally bound to that galaxy. However, since they do not interact very strongly with matter, they keep on orbiting these galaxies. So the neutrino dust will form clouds in the vicinity of galaxies.
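For a sense of scale, here is a rough order-of-magnitude estimate (my addition; the galaxy mass and radius are invented round numbers) of the escape velocity below which a neutrino becomes bound:

```python
import numpy as np

# A neutrino is captured once its speed drops below the local escape
# velocity v_esc = sqrt(2 G M / r) of a nearby galaxy.
G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
M_sun = 1.989e30     # solar mass [kg]
kpc = 3.086e19       # kiloparsec [m]

M = 1e12 * M_sun     # rough mass of a large galaxy including its halo
r = 50 * kpc         # rough distance from the galactic center

v_esc = np.sqrt(2 * G * M / r)
print(f"v_esc is roughly {v_esc / 1e3:.0f} km/s")   # a few hundred km/s
```

A few hundred kilometers per second is tiny compared to the speed of light, so capture only happens very late, once the expansion has drained almost all of the neutrino’s momentum.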

Hubble Space Telescope observes diffuse starlight in Galaxy Cluster Abell S1063. Credit: NASA, ESA, and M. Montes (University of New South Wales)

Could neutrino dust be the dark matter that we are looking for? Due to their small mass and the ratio of protons to neutrons in the universe, it is unlikely that there would be enough neutrinos to account for the missing mass attributed to dark matter. Ordinary neutrino dust would contribute to the effect of dark matter, but it may not solve the whole problem.

There is some speculation that the three known neutrinos may not be the only neutrinos that exist. Some theories consider the possibility that additional sterile neutrinos exist. These sterile neutrinos could have large masses, and for this reason they have been considered as candidates for dark matter. How these heavy neutrinos would have been produced is not clear, but if they were produced during the big bang, they would also have undergone the same slow-down and eventually turned into dust. So it could be that there are a lot of them drifting aimlessly through space.

Interesting, don’t you think?

Particle physics blues

The Large Hadron Collider (LHC) recently completed its second run. While the existence of the Higgs boson was confirmed during the first run, the outcome from the second run was … well, shall we say somewhat less than spectacular. In view of the fact that the LHC carries a pretty hefty price tag, this rather disappointing state of affairs is producing a certain degree of soul searching within the particle physics community. One can see that from the discussions here and here.

CMS detector at LHC (from wikipedia)

So what went wrong? Judging from the discussions, one may guess it could be a combination of things. Perhaps it is all the hype that accompanies some of the outlandish particle physics predictions. Or perhaps it is the overly esoteric theoretical nature of some of the physics theories. String theory seems to be singled out as an example of a mathematical theory without any practical predictions.

Perhaps the reason for the current state of affairs in particle physics is none of the above. Reading the above-mentioned discussions, one gets the picture from those who are close to the fire. Sometimes it helps to step away and look at the situation from a little distance. Could it be that, while these particle physicists vehemently analyze all the wrong ideas and failed approaches that emerged over the past few decades (even starting to question one of the foundations of the scientific method: falsifiability), they are missing the elephant in the room?

The field of particle physics has been around for a while. It has a long history of advances: from uncovering the structure of the atom, to revealing the constituents of protons and neutrons. The culmination is the Standard Model of Particle Physics – a truly remarkable edifice of our current understanding.

So what now? What’s next? Well, the Standard Model does not include gravity, so there is still a strong effort to come up with a theory that combines gravity with the other forces currently included in the Standard Model. That is the main motivation behind string theory. There is another issue: the Standard Model lacks something called naturalness. The main motivation for the LHC was to address this problem. Unfortunately, the LHC has not been able to solve the issue, and it seems unlikely that it, or any other collider, ever will. Perhaps that alludes to the real issue.

Could it be that particle physics has reached the stage where the questions that need answers cannot be answered through experiments anymore? The energy scales where the answers to these questions would be observable are just too high. If this is indeed the case, it would mark the end of particle physics as we know it. It would enter a stage of unverifiable philosophy. One may be able to construct beautiful mathematical theories to address the remaining questions. But one would never know whether these theories are correct.

What then?