Just delete “vacuum fluctuations”

How do you build a tower? One layer of bricks at a time. But before you lay down the next layer of bricks, you need to make sure the current layer has been laid down properly. Otherwise, the whole thing may come tumbling down.

The same is true in physics. Before you base your ideas on previous ideas, you need to check that those previous ideas are correct. Otherwise, you would be misleading yourself and others, and the new theories may not be able to make successful predictions.

Physics is a science, which means that we should only trust previous ideas after they have been tested through comparison with physical observations. Unfortunately, there are some ideas that cannot be checked so easily. Obviously, one should then be very careful when basing new ideas on such unchecked ideas. Some people blame the current lack of progress in fundamental physics on this problem. They say we need to go back and check whether we have made a mistake somewhere. I think I know where this problem is.

Over the centuries of physics research, many tools have been developed to aid the formulation of theories. These tools include things like differential calculus in terms of which equations of motion can be formulated, and Hamiltonians and Lagrangians, to name a few.

Now, I see that some people claim that most of these tools won’t work for the formulation of a fundamental theory that includes gravity with quantum theory. It is stated that a minimum measurement uncertainty, imposed by the Planck scale, would render the formulation of equations of motion and Lagrangians at this scale impossible. Why is that? Well, it is claimed that the uncertainty at such small distance scales is large enough to allow tiny black holes to pop in and out of existence, creating havoc with spacetime at such small scales. This argument is the reason why people consider the Planck scale as a fundamental scale beneath which our traditional notions of physics and spacetime break down.

But why does uncertainty lead to black holes popping in and out of existence? It comes from an unchecked interpretation of the Heisenberg uncertainty principle, which is claimed to allow particles to pop in and out of existence, with larger energies permitted when the time of their existence is short enough. This hypothetical process is generally referred to as “vacuum fluctuations.” However, there does not exist any conclusive experimental confirmation of the process of vacuum fluctuations. Therefore, any idea based on vacuum fluctuations is an idea based on an unchecked idea.

Previously, I have explained that the Heisenberg uncertainty principle is not a fundamental principle of quantum physics, but instead comes from Fourier theory. As such, the uncertainty principle represents a prohibition and not a license. It imposes restrictions on what can exist. Yet people somehow decided that it allows things to exist in violation of other principles such as energy conservation. This is an erroneous notion with no experimental confirmation.

Hence, the vacuum does not fluctuate! There are no particles popping in and out of existence in the vacuum. There is nothing in our understanding of the physical world that has been experimentally confirmed which needs the concept of vacuum fluctuations.

Now, if we get rid of this notion of vacuum fluctuations, several issues in fundamental physics will simply disappear. For example, the black hole information paradox. A key ingredient of this paradox is the idea that black holes will evaporate due to Hawking radiation. The notion of Hawking radiation is another unchecked idea, which is based on …? You guessed it: vacuum fluctuations! So if we just get rid of this silly notion of vacuum fluctuations, the black hole information paradox will evaporate, instead of the black holes.

Guiding principles I: substructure

Usually the principles of physics are derived from successful scientific theories. For instance, Lorentz invariance, which can be seen as the underlying principle on which special relativity is based, was originally derived from Maxwell’s equations. As we learn more about the universe and how it works, we discover more principles. These principles serve to constrain any new theories that we try to formulate to describe that which we don’t understand yet.

It turns out that the physics principles we have uncovered so far don’t seem to constrain theories enough. There are still vastly different ways to formulate new theories. So we need to do something that is very dangerous. We need to guess some additional physics principles that would guide us in the formulation of such new theories. Chances are that any random guess would send us down a random path in theory space with very little chance of being the right thing. An example is string theory, where the random guess was that the fundamental objects are strings. It has kept a vast number of researchers busy for decades without success.

Instead of making a random guess, we can try to see if our existing theories don’t perhaps already give us some additional hints at what such a guiding principle should be. So, I’ll share my thoughts on this for what it is worth. I’ll start with what our current theories tell us about substructure.

The notion of a substructure can already be identified in the work of Huygens, Fresnel, and others on interference. It revealed that light is a wave. The physical quantity that is observed is the intensity, which is always positive. However, we need to break the intensity apart into amplitudes that can have negative values to allow destructive interference. In this very simple sense, the amplitude (which is often modeled as a complex-valued function) serves as a substructure for that which is observed.
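A minimal numerical sketch makes this concrete: the observed intensity comes from coherently adding amplitudes, so two waves of equal strength but opposite phase cancel, something the positive intensities alone could never produce.

```python
import numpy as np

# Two waves with equal amplitude but opposite phase:
# the observed intensity is |A1 + A2|^2, not |A1|^2 + |A2|^2.
A1 = 1.0 * np.exp(1j * 0.0)      # amplitude of the first wave
A2 = 1.0 * np.exp(1j * np.pi)    # same amplitude, phase shifted by pi

intensity = abs(A1 + A2) ** 2            # coherent sum of amplitudes
naive_sum = abs(A1) ** 2 + abs(A2) ** 2  # sum of individual intensities

print(intensity)  # destructive interference: 0 (up to rounding)
print(naive_sum)  # 2
```

The negative (or complex) substructure is what carries the interference; it never appears directly in the observation.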


It is not a big leap from interference in classical light to interference in quantum systems. Here the observation is interpreted as a probability, which is also a positive quantity. In quantum mechanics, the notion of a probability is given a substructure in the form of a probability amplitude, which can be negative (or complex) to allow interference phenomena.

The concept of a substructure is today perhaps mostly associated with the notion of constituent particles. We know now that the proton is not a fundamental particle, but that it has a substructure consisting of fundamental particles called quarks, bound together via the strong force. Although it is not currently considered to be the case, these quarks may also have some substructure. However, the concept of this substructure may be different from the way it appears in protons.

A new idea that is emerging is the idea that spacetime itself may have a substructure. Ever since the advent of general relativity, we know that spacetime is affected by gravity. In our current formulation of particle physics, spacetime is the backdrop on which all the particle fields perform their dance. But when gravity is added, spacetime joins the dance. It makes the formulation of fundamental theories very complicated. The difference between the particles and spacetime becomes blurred. This leads to the idea that spacetime itself may have a substructure. In this way, it combines the two different ways to look at substructure. On the one hand, it may be divided into two parts, perhaps to separate chirality, much in the way intensity separates into an amplitude and its complex conjugate. On the other hand, the separation of spacetime may give some substructure to the particle fields, which would then be described in terms of fluctuations in spacetime’s substructure.

Caution is necessary here. Even if these ideas turn out to be valid, they still leave much detail unspecified. It may not be enough to regard the idea of substructure as a physics principle. The important thing is to keep to the standard practice in physics: mathematics is merely used to formulate and model the physical universe. It does not tell us something new about the universe unless this is somehow already logically encoded in what we start off with.

Perhaps an example would help to explain what I mean. Einstein formulated general relativity (GR) after he figured out the equivalence principle. So everything that we can learn from GR follows as an inevitable logical consequence of this principle. It tells us that the mass-energy distribution curves spacetime, but it does not tell us how this happens. In other words, the mechanism by which mass curves spacetime is not known, because it is not a logical consequence of the equivalence principle.

So, the idea is to come up with a general mathematical formalism that is powerful enough to model this kind of scenario without trying to dictate the physics. Remember, quantum field theory is a formalism in terms of which different models for the dynamics in particle physics can be formulated. It does not dictate the dynamics but allows anything to be modeled. Another example is differential geometry, which allows the formulation of GR but does not dictate it. Part of the reason why string theory fails is that it is a mathematical formulation that also dictates the dynamics. The formulation of a quantum theory for gravity requires a flexible formalism that does not dictate the dynamics.

Those quirky fermions

All of the matter in the universe is made of fermions. They are for this reason one of the most abundant things in the universe. Fermions have been the topic of investigation for a long time. We have learned much about them. However, what we do know about them is encapsulated in the formalisms with which we deal with them in our theories. Does that mean we understand them?

Let’s think about the way we treat fermions in our theories. Basically, we represent them in terms of creation and annihilation operators, which are used to formulate the interactions in which they take part. These operators are distinguished from those for bosons by the anti-commutation relations that they obey.

To the uninitiated, all this must sound like a bunch of gobbledygook. What are the physical manifestations of all these operators? There are none! These operators are just mathematical entities in the formalism for our theories. Although these theories are quite successful, it does not reveal the physical machinery at work on the inside. Or does it?

Although a creation operator does not by itself represent any physical process, it distinguishes different scenarios with different arrangements of fermions. Starting with a given scenario, I can apply a fermion creation operator to introduce a new scenario which contains one additional fermion. Then I can apply the operator again; provided that I am not trying to add another fermion with the same degrees of freedom, it will produce another new scenario.

Here is the strange thing. If I change the order in which I added the two additional fermions, I get a scenario that is different from the one with the previous order. I can contrast this to the situation with bosons. Provided that I don’t try to add bosons with the same degrees of freedom, the order in which I add them doesn’t matter. What it tells us is that bosons with different degrees of freedom don’t affect each other. (We need to be careful about the concepts of time-like or space-like separations, but for the sake of this argument, we’ll assume all bosons or fermions are space-like separated.)
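A small numerical sketch shows the sign flip. This uses the standard Jordan-Wigner matrix representation of two fermionic modes (a textbook construction, not specific to any particular theory); the operator names are illustrative.

```python
import numpy as np

# Two fermionic modes represented on a 4-dimensional space
# via the Jordan-Wigner construction.
cdag = np.array([[0, 0], [1, 0]])   # single-mode creation: |0> -> |1>
Z = np.diag([1, -1])                # parity "string" operator
I2 = np.eye(2)

c1dag = np.kron(cdag, I2)           # create a fermion in mode 1
c2dag = np.kron(Z, cdag)            # create a fermion in mode 2

vac = np.array([1, 0, 0, 0])        # the empty scenario |00>

state_12 = c1dag @ (c2dag @ vac)    # add to mode 2 first, then mode 1
state_21 = c2dag @ (c1dag @ vac)    # same fermions, opposite order

print(np.allclose(state_12, -state_21))     # True: order flips the sign
print(np.allclose(c1dag @ c1dag @ vac, 0))  # True: no double occupation
```

The two orderings produce states that differ by an overall sign, and trying to add two fermions with identical degrees of freedom gives nothing at all, exactly the anti-commuting behavior described above.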

The fact that the order in which we place fermions in our scenario (even when they are space-like separated) makes a difference tells us something physical about fermions. They must be global entities. The entire universe seems to “know” about the existence of each and every fermion in it.

How can that be possible? I can think of one way: topological defects. This is not a new idea. It pops up quite often in various fields of physics.

Topological defect

Why would a topological defect explain the apparent global nature of fermions? It is because all kinds of topological defects can be identified with the aid of an integral that computes the winding number of the topological defect. This type of integral is evaluated over a (hyper)surface that encloses the topological defect. In other words, the field values far away from the defect are included in the integral, and not the field value at the defect. Therefore, knowledge about the defect is encoded in the entire field. It therefore suggests that fermions can behave as global entities if they are topological defects. This is just a hypothesis. It needs more careful investigation.
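As a concrete illustration (a standard textbook case, not specific to fermions), consider a field with a phase $\theta(\mathbf{x})$ in two dimensions. The winding number of a defect is computed as a contour integral over the surrounding field:

```latex
n = \frac{1}{2\pi} \oint_C \nabla\theta \cdot d\boldsymbol{\ell}
```

The closed contour $C$ encloses the defect core, so only field values away from the defect enter the integral. The integer $n$ cannot change under smooth deformations of the field, which is why the existence of the defect is encoded everywhere in the surrounding field rather than only at its location.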

Why not an uncertainty principle?

It may seem strange that there is no fundamental physical principle of quantum physics associated with uncertainty among those that I have proposed recently. What would be the reason? Since we refer to it as the Heisenberg uncertainty principle, it should qualify as one of the fundamental principles of quantum physics, right? Well, that is just it. Although it qualifies as being a principle, it is not fundamental.

It may help to consider how these concepts developed. At first, quantum mechanics was introduced as an improvement of classical mechanics. Therefore, quantities like position and momentum played an important role.

A prime example of a system in classical mechanics is the harmonic oscillator. Think of a metal ball hanging from a spring. Being pulled down and let go, the ball will start oscillating, moving periodically up and down. This behavior is classically described in terms of a Hamiltonian that contains the position and momentum of the metal ball.

In the quantum version of this system, the momentum is replaced by the wave vector times the Planck constant. But position and the wave vector are conjugate Fourier variables. That is the origin of the uncertainty. Moreover, it also leads to non-commutation when position and momentum are represented as operators. The sum and difference of these two operators behave as lowering and raising operators for quanta of energy in the system. The one reduces the energy in discrete steps and the other increases it in discrete steps.
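In standard notation (this is the textbook construction, included here only to make the wording above concrete), with $\hat{p} = \hbar \hat{k}$ relating momentum to the wave vector, the conjugate Fourier pair produces the familiar commutator, and suitably scaled sums and differences of position and momentum give the lowering and raising operators:

```latex
[\hat{x}, \hat{p}] = i\hbar, \qquad
\hat{a} = \frac{1}{\sqrt{2}}\left(\frac{\hat{x}}{x_0} + \frac{i x_0}{\hbar}\,\hat{p}\right), \qquad
\hat{a}^\dagger = \frac{1}{\sqrt{2}}\left(\frac{\hat{x}}{x_0} - \frac{i x_0}{\hbar}\,\hat{p}\right),
```

where $x_0 = \sqrt{\hbar/m\omega}$ sets the length scale of the oscillator. The Hamiltonian becomes $\hat{H} = \hbar\omega\,(\hat{a}^\dagger\hat{a} + \tfrac{1}{2})$, so $\hat{a}$ lowers and $\hat{a}^\dagger$ raises the energy in discrete steps of $\hbar\omega$.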

It was then found that quantum mechanics can also be used to improve classical field theory. But there are several differences. Oscillations in fields are not represented as displacements in position that are exchanged into momentum. Instead, their oscillations manifest in terms of the field strength. So, to develop a quantum theory of fields, one would start with the lowering and raising operators, which are now called creation and annihilation operators or ladder operators. Their sum and difference produce a pair of operators that are analogous to the position and momentum operators for the harmonic oscillator. In this context, these are called quadrature operators. They portray the same qualitative behavior as the momentum and position operators. They represent conjugate Fourier variables and therefore again produce an uncertainty and non-commutation. The full development of quantum field theory is far more involved than what I described here, but I only focused on the origin of the uncertainty in this context.
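A quick numerical sketch (illustrative only, using a truncated number-state space) shows the quadrature operators built from the ladder operators reproducing the canonical commutator, just as position and momentum do for the oscillator:

```python
import numpy as np

N = 20  # truncation of the number-state (Fock) space
a = np.diag(np.sqrt(np.arange(1, N)), k=1)  # annihilation (lowering) operator
adag = a.conj().T                           # creation (raising) operator

# Quadrature operators: scaled sum and difference of the ladder operators.
X = (a + adag) / np.sqrt(2)
P = (a - adag) / (1j * np.sqrt(2))

comm = X @ P - P @ X
# On states well below the truncation, [X, P] = i times the identity.
print(comm[0, 0])  # 1j (up to rounding)
```

The nonzero commutator is the same Fourier-conjugate structure as for position and momentum, which is why the quadratures inherit the same uncertainty relation.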

So, in summary, uncertainty emerges as an inevitable consequence of the Fourier relationship between conjugate variables. In the case of mechanical systems, these conjugate variables come about because of the quantum relationship between momentum and wave vector. In the case of fields, these conjugate variables come from the ladder operators, leading to properties analogous to those found in the formal description of the harmonic oscillator. Hence, uncertainty is not a fundamental property in quantum physics.

Principles of quantum physics

Previously, I argued for principles rather than postulates. Usually, principles are added to a field of study only after some progress has been made with the theories in that field. However, sometimes these principles are required ahead of time to make progress in a field. That may be the case in fundamental physics, where such principles can be used as guiding principles. However, in the latter case such principles may be just guesswork. They may turn out to be wrong.

Quantum physics has been around for a long enough time to justify having its own set of principles. There are postulates for quantum mechanics, but as I explained, they are like a set of axioms for the mathematical formalism and therefore don’t qualify as principles. Principles are statements phrased in terms of physical concepts and not in terms of mathematical concepts.

Here, I want to propose such principles. They are a work in progress. Those that I can state are not extremely surprising. They shouldn’t be because quantum physics has been investigated in so many different ways. However, there are some subtleties that need special attention.

The first principle is simply a statement of Planck’s discovery: fundamental interactions are quantized. Note that it does not say that “fields” or “particles” are quantized, because we don’t know that. All we do know is what happens at interactions because all our observations involve interactions. Here, the word “quantized” implies that the interacting entities exchange quantized amounts of energy and momentum.

What are these interacting entities? Usually we would refer to them as particles, but that already makes an assumption about their existence. Whenever we make an observation that would suggest that there are particles, we actually see an interaction. So we cannot conclude that we saw a particle, but we can conclude that the interaction is localized. Unless there is some fundamental distance scale that sets a lower limit, the interaction is point-like – it happens at a dimensionless point. The most successful theories treat these entities as fields with point-like interactions. We can therefore add another principle: fundamental interactions are localized. However, we can combine it with the previous principle and see it as another side of one and the same principle: fundamental interactions are quantized and localized.

The next principle is a statement about the consequences of such interactions. However, it is so important that it needs to be stated as a separate principle. I am still struggling with the exact wording, so I’ll just call it the superposition principle. Now, superposition is something that already exists in classical field theory. In that case, the superposition entails the coherent addition of different fields. The generalization introduced by quantum physics is the fact that such superpositions can involve multiple entities. In other words, the superposition is the coherent addition of multiple fields. The notion of multiple entities is introduced by the interactions. An interaction allows a single entity to split up into multiple entities, each of which can carry a full complement of all the degrees of freedom that can be associated with such an entity. However, due to conservation principles, the interaction sets up constraints on the relationship among the degrees of freedom of the different entities. As a result, the degrees of freedom of these entities are entangled, which manifests as a superposition of multiple entities.
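A small numerical sketch (a standard two-level example, not tied to any particular interaction) shows what such a constrained multi-entity superposition looks like: a state in which conservation forces the two entities to carry opposite values cannot be written as a product of single-entity states, which a singular-value (Schmidt) decomposition makes explicit.

```python
import numpy as np

# A two-entity state constrained by conservation (e.g. opposite spins):
# |psi> = (|01> - |10>) / sqrt(2), on the basis {00, 01, 10, 11}.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# Reshape into a 2x2 matrix; the number of nonzero singular values
# (the Schmidt rank) tells us whether the state factorizes.
schmidt = np.linalg.svd(psi.reshape(2, 2), compute_uv=False)
print(np.count_nonzero(schmidt > 1e-12))  # 2: entangled, not a product state
```

A product state would have exactly one nonzero singular value; the constraint imposed by the interaction forces a rank of two, which is the entanglement described above.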

Classical and quantum superpositions

We need another principle to deal with the complexities of fermionic entities, but here I am still very much in the dark. I do not want to refer to the anti-commuting nature of fermionic operators because that is a mathematical statement. Perhaps, it just shows how little we really know about fermions. We have a successful mathematical formulation, but still do not understand the physical implications of this formulation.