Just delete “vacuum fluctuations”

How do you build a tower? One layer of bricks at a time. But before you lay down the next layer of bricks, you need to make sure the current layer has been laid properly. Otherwise, the whole thing may come tumbling down.

The same is true in physics. Before you base your ideas on previous ideas, you need to check that those previous ideas are correct. Otherwise, you would be misleading yourself and others, and the new theories may not be able to make successful predictions.

Physics is a science, which means that we should only trust previous ideas after they have been tested through comparison with physical observations. Unfortunately, there are some ideas that cannot be checked so easily. Obviously, one should then be very careful about basing new ideas on such unchecked ideas. Some people blame the current lack of progress in fundamental physics on this problem. They say we need to go back and check whether we have made a mistake somewhere. I think I know where this problem is.

Over the centuries of physics research, many tools have been developed to aid the formulation of theories. These tools include differential calculus, in terms of which equations of motion can be formulated, as well as Hamiltonians and Lagrangians, to name a few.

Now, I see that some people claim that most of these tools won’t work for the formulation of a fundamental theory that combines gravity with quantum theory. It is stated that a minimum measurement uncertainty, imposed by the Planck scale, would render the formulation of equations of motion and Lagrangians at this scale impossible. Why is that? Well, it is claimed that the uncertainty at such small distance scales is large enough to allow tiny black holes to pop in and out of existence, creating havoc with spacetime at such small scales. This argument is the reason why people consider the Planck scale to be a fundamental scale beneath which our traditional notions of physics and spacetime break down.
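For concreteness, the back-of-the-envelope version of this claim usually runs roughly as follows (I state it only to make clear what is being argued, not to endorse it). To probe a region of size $\Delta x$, one is said to need an energy of roughly

$$E \sim \frac{\hbar c}{\Delta x}.$$

That much energy has a Schwarzschild radius $r_s = 2GE/c^4 \sim 2G\hbar/(c^3\,\Delta x)$, and setting $r_s \approx \Delta x$ gives, up to factors of order one,

$$\Delta x \sim \sqrt{\frac{\hbar G}{c^3}} = \ell_P \approx 1.6\times 10^{-35}\ \text{m},$$

the Planck length. Note that every step of this estimate leans on the reading of the uncertainty principle that I question below.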

But why does uncertainty lead to black holes popping in and out of existence? It comes from an unchecked idea based on the Heisenberg uncertainty principle: the claim that the principle allows particles to pop in and out of existence, and that such particles can have larger energies provided the time for which they exist is short enough. This hypothetical process is generally referred to as “vacuum fluctuations.” However, there is no conclusive experimental confirmation of the process of vacuum fluctuations. Therefore, any idea based on vacuum fluctuations is an idea based on an unchecked idea.
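For reference, the relation usually invoked here is the energy-time uncertainty relation,

$$\Delta E\,\Delta t \gtrsim \frac{\hbar}{2},$$

read as if it permitted an energy of order $\Delta E \sim \hbar/(2\,\Delta t)$ to appear out of nothing, provided it disappears again within the time $\Delta t$. As I argue next, that reading gets the logic of the inequality backwards.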

Previously, I have explained that the Heisenberg uncertainty principle is not a fundamental principle of quantum physics, but instead comes from Fourier theory. As such, the uncertainty principle represents a prohibition and not a license. It imposes restrictions on what can exist. Instead, people somehow decided that it allows things to exist in violation of other principles such as energy conservation. This is an erroneous notion with no experimental confirmation.
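A minimal sketch of what I mean: for any wave function $\psi(x)$ and its Fourier transform $\tilde\psi(k)$, Fourier theory guarantees that the widths of $|\psi(x)|^2$ and $|\tilde\psi(k)|^2$ obey

$$\Delta x\,\Delta k \ge \frac{1}{2},$$

and with the de Broglie relation $p = \hbar k$ this becomes the familiar

$$\Delta x\,\Delta p \ge \frac{\hbar}{2}.$$

It is a statement about what a wave function cannot be (arbitrarily narrow in both position and momentum at once), not a permission for anything to happen.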

Hence, the vacuum does not fluctuate! There are no particles popping in and out of existence in the vacuum. There is nothing in our understanding of the physical world that has been experimentally confirmed which needs the concept of vacuum fluctuations.

Now, if we get rid of this notion of vacuum fluctuations, several issues in fundamental physics will simply disappear. For example, the black hole information paradox. A key ingredient of this paradox is the idea that black holes will evaporate due to Hawking radiation. The notion of Hawking radiation is another unchecked idea, which is based on …? You guessed it: vacuum fluctuations! So if we just get rid of this silly notion of vacuum fluctuations, the black hole information paradox will evaporate, instead of the black holes.

The deceptive lure of a final theory

There has been this nagging feeling that something is not quite right with the current flavor of fundamental physics theories. I’m not just talking about string theory. All the attempts that are currently being pursued share a salient property which, until recently, I could not quite put my finger on. One thing that is quite obvious is that the level of mathematics they entail is extremely sophisticated. That in itself is not quite where the problem lies, although it does have something to do with it.

Then, recently, I looked at a 48-page write-up of somebody’s ideas concerning a fundamental theory to unify gravity and quantum physics. (It identifies the need for the “analytic continuation of spinors,” and I thought it may be related to something that I’ve worked on recently.) It was while I read through the introductory parts of this manuscript that it struck me what the problem is.

Take the standard model of particle physics as a case in point. It is a collection of theories (quantum chromodynamics, or QCD, and the electroweak theory) formulated in the language of quantum field theory. So, there is a separation between the formalism (quantum field theory) and the physics (QCD, etc.). The formalism was originally developed for quantum electrodynamics. It contains some physics principles that had previously been established as scientific principles. In other words, those principles which are regarded as established scientific knowledge are built into the formalism. The speculative parts are all the models that can be formulated in terms of the formalism. They are not cast in stone, but the formalism is powerful enough to allow different models. Eventually some of these models passed various experimental tests and thus became established theories, which we now call the standard model.

What the formalism of quantum field theory does not allow is the incorporation of general relativity, or of some equivalent that would allow us to formulate models for quantum theories of gravity. So it is natural to think that what fundamental physicists should be spending their efforts on is an even more powerful formalism that would allow model building that addresses the question of gravity. However, when you take a critical look at the theoretical attempts that are currently being worked on, you see that this is not the case. Instead, the models and the formalisms are the same thing. The established scientific knowledge and the speculative stuff are mixed together in highly complex mathematical theories. Does such an approach have any hope of success?

Why do people do that? I think it is because they are aiming high. They hope that what they come up with will be the last word in fundamental physics. It is the ambitious dream of a final theory. They don’t want to bother with models built on some general formalism in terms of which one can formulate various different models, and which may eventually be referred to as “the standard model.” That is just too modest.

Another reason is the view, which seems to exist among those working on fundamental physics, that nature dictates the mathematics that needs to be used to model it. In other words, they seem to think that the correct theory can only have one possible mathematical formalism. If that were true, the chances that we have already invented that formalism, or that we may by chance select the correct approach, are extremely small.

But can it work? I don’t think there is any reasonable chance that some random venture into theory space could miraculously turn out to be the right guess. Theory space is just too big. In the manuscript I read, one can see that the author makes various ad hoc decisions in terms of the mathematical modeling. Some of these guesses seem to produce familiar aspects that resemble something about the physical world as we understand it, which then gives some indication that it is the “right path” to follow. However, mathematics is an extremely versatile and diverse language. One can easily be misled by something that looked like the “right path” at some point. String theory is an excellent example in this regard.

So what would be a better approach? We need a powerful formalism in terms of which we can formulate various different quantum theories that incorporate gravity. The formalism can have incorporated into it as many of the established scientific principles as possible. That will make it easier to present models that already satisfy those principles. The speculation is then left for the modeling part.

The benefit of such an approach is that it unifies the different attempts: a common formalism makes it easier to use ideas from other attempts that seem to have worked. In this way, the community of fundamental physics can work together to make progress. Hopefully the theories thus formulated will be able to make predictions that can be tested with physical experiments, or perhaps astronomical observations, allowing such theories to become scientific theories. Chances are that a successful theory that incorporates gravity and at the same time covers all of particle physics as we understand it today will still not be the “final theory.” It may still be just a “standard model.” But it will represent progress in understanding, which is more than we can say for what is currently going on in fundamental physics.

Guiding principles I: substructure

Usually the principles of physics are derived from successful scientific theories. For instance, Lorentz invariance, which can be seen as the underlying principle on which special relativity is based, was originally derived from Maxwell’s equations. As we learn more about the universe and how it works, we discover more principles. These principles serve to constrain any new theories that we try to formulate to describe that which we don’t understand yet.
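As a minimal illustration of how this worked: in vacuum, Maxwell’s equations combine into the wave equation

$$\left(\frac{1}{c^2}\frac{\partial^2}{\partial t^2} - \nabla^2\right)\mathbf{E} = 0, \qquad c = \frac{1}{\sqrt{\mu_0\,\epsilon_0}},$$

and the operator in brackets retains this form under Lorentz transformations, but not under Galilean ones. The invariance of an already successful theory is what singled out the Lorentz transformations as a principle in their own right.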

It turns out that the physics principles we have uncovered so far don’t seem to constrain theories enough. There are still vastly different ways to formulate new theories. So we need to do something that is very dangerous. We need to guess some additional physics principles that would guide us in the formulation of such new theories. Chances are that any random guess would send us down a random path in theory space with very little chance of being the right thing. An example is string theory, where the random guess was that the fundamental objects are strings. It has kept a vast number of researchers busy for decades without success.

Instead of making a random guess, we can try to see whether our existing theories don’t perhaps already give us some hints at what such a guiding principle should be. So, I’ll share my thoughts on this, for what they are worth. I’ll start with what our current theories tell us about substructure.

The notion of a substructure can already be identified in the work of Huygens, Fresnel, and others on interference. It revealed that light is a wave. The physical quantity that is observed is the intensity, which is always positive. However, we need to break the intensity apart into amplitudes that can have negative values to allow destructive interference. In this very simple sense, the amplitude (which is often modeled as a complex-valued function) serves as a substructure for that which is observed.

Interference

It is not a big leap from interference in classical light to interference in quantum systems. Here the observation is interpreted as a probability, which is also a positive quantity. In quantum mechanics, the notion of a probability is given a substructure in the form of a probability amplitude, which can be negative (or complex) to allow interference phenomena.
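In both cases the substructure enters in the same way: the observed positive quantity is the squared magnitude of a sum of amplitudes,

$$I = |A_1 + A_2|^2 = |A_1|^2 + |A_2|^2 + 2\,\mathrm{Re}(A_1^* A_2),$$

where the cross term can be negative, producing destructive interference. In the quantum case the same expression holds with probability amplitudes in place of the classical field amplitudes.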

The concept of a substructure is today perhaps mostly associated with the notion of constituent particles. We know now that the proton is not a fundamental particle, but that it has a substructure consisting of fundamental particles called quarks, bound together via the strong force. Although it is not currently considered to be the case, these quarks may also have some substructure. However, the concept of this substructure may be different from the way it appears in protons.

A new idea that is emerging is that spacetime itself may have a substructure. Ever since the advent of general relativity, we have known that spacetime is affected by gravity. In our current formulation of particle physics, spacetime is the backdrop on which all the particle fields perform their dance. But when gravity is added, spacetime joins the dance. That makes the formulation of fundamental theories very complicated. The difference between the particles and spacetime becomes blurred. This leads to the idea that spacetime itself may have a substructure. In this way, it combines the two different ways to look at substructure. On the one hand, spacetime may be divided into two parts, perhaps to separate chirality, much in the way intensity separates into an amplitude and its complex conjugate. On the other hand, the separation of spacetime may give some substructure to the particle fields, which would then be described in terms of fluctuations in spacetime’s substructure.

Caution is necessary here. Even if these ideas turn out to be valid, they still leave much detail unspecified. It may not be enough to regard the idea of substructure as a physics principle. The important thing is to keep to the standard practice in physics: mathematics is merely used to formulate and model the physical universe. It does not tell us something new about the universe unless this is somehow already logically encoded in what we start off with.

Perhaps an example would help to explain what I mean. Einstein formulated general relativity (GR) after he figured out the equivalence principle. So everything that we can learn from GR follows as an inevitable logical consequence of this principle. It tells us that the mass-energy distribution curves spacetime, but it does not tell us how this happens. In other words, the mechanism by which mass curves spacetime is not known, because it is not a logical consequence of the equivalence principle.
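To make this concrete, what GR gives us is the relation

$$G_{\mu\nu} = \frac{8\pi G}{c^4}\,T_{\mu\nu},$$

which fixes how much curvature ($G_{\mu\nu}$) accompanies a given mass-energy distribution ($T_{\mu\nu}$), but is silent on any mechanism by which the one produces the other.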

So, the idea is to come up with a general mathematical formalism that is powerful enough to model this kind of scenario without trying to dictate the physics. Remember, quantum field theory is a formalism in terms of which different models for the dynamics in particle physics can be formulated. It does not dictate the dynamics but allows anything to be modeled. Another example is differential geometry, which allows the formulation of GR but does not dictate it. Part of the reason why string theory fails is that it is a mathematical formulation that also dictates the dynamics. The formulation of a quantum theory for gravity requires a flexible formalism that does not dictate the dynamics.

In defense of particle physics experiments

As a theorist, I may have misled some people into thinking that I don’t care much for experimental work. In particle physics, there tends to be a clear separation between theorists and experimentalists, with the phenomenologists sitting in between. Other fields in physics don’t have such sharp separations. However, most physicists lean toward one of the two.

Physics is a science. As such, it follows the scientific method. That implies that both theory and experiment are important. In fact, they are absolutely essential!

There are people who advocate not only the suspension of experimental work in particle physics, but even a change in the methodology of particle physics. What methodology in particle physics needs to be changed? Hopefully not anything related to the scientific method! To maintain the scientific method in particle physics, people need to keep on doing particle physics experiments.

CMS detector at LHC

There was a time when I also thought that the extreme expense of doing particle physics experiments was not justified by the results obtained from the Large Hadron Collider (LHC). However, as somebody explained to me, the results of the LHC are not so insignificant. If you think about it, the “lack of results” is a fallout of the bad theories that the theorists came up with. So by stopping the experimental work due to the “lack of results,” you would be punishing the experimentalists for the bad work of the theorists. More importantly, the experimentalists are doing precisely what they should be doing in support of the scientific method: ruling out the nonsense theories that the theorists came up with. I think they’ve done more than just that. Hopefully, the theorists will do better in future, so that the experimentalists can obtain more positive results.

I should also mention the experimental work that is currently being done on neutrinos. It is a part of particle physics that we still do not understand well. These results may open the door for significant improvements in our theoretical understanding of particle physics.

So, please keep on doing experimental work in particle physics. If any methodological changes are needed in particle physics, they are limited to the way theorists are doing their work.

A post mortem for string theory

So string theory is dead. But why? What went wrong to cause its demise? Or more importantly, why did it not succeed?

We don’t remember the theories that did not succeed. Perhaps we remember those that were around for a long time before they were shown to be wrong, like Newton’s corpuscular theory of light or Ptolemy’s epicycles. Some theories that unsuccessfully tried to explain things that we still don’t understand are also still remembered, like the different models for grand unification. But all those different models that people proposed for the electroweak theory are gone. We only remember the successful one, which is now part of the standard model.

Feynman said at some point that he did not like to read the literature on theories that could not explain something successfully, because it might mislead him. However, I think we can learn something generic about how to approach challenges in our fundamental understanding by looking at the unsuccessful attempts. It is important not to be deceived by the seductive ideas of such failed attempts, but to scrutinize them for their flaws and learn from that.

Previously, I have emphasized the importance of a guiding principle for our endeavors to understand the fundamental aspects of our universe. I believe that one of the reasons why string theory failed is that it has a flawed guiding principle. It is based on the idea that, instead of particles, the universe is made up of strings. Since strings are extended objects with a certain scale (the Planck scale), they provide a natural cut-off, removing those pesky infinities.

The problem is, when you invent something to replace something else, you presuppose that there was something that needed replacing. In other words, did we need particles in the first place? The answer is no. Quantum field theory, the formalism in terms of which the successful standard model is formulated, does not impose the existence of particles. It merely requires localized interactions.

But what about the justification for extended objects based on getting rid of the infinities? I’ve written about these infinities before and explained that they are to be expected in any realistic formulation of fundamental physics and that some contrivance to get rid of them does not make sense.

So, the demise of a theory based on a flawed guiding principle is not surprising. What we learn from this post mortem is that it is important to be very careful when we impose guiding principles. Although such principles are not scientifically testable, the notions on which we base such principles should be.