Guiding principle: quantum gravity

One of the aims of fundamental physics is to obtain a theory that can combine gravity with quantum physics. As I mentioned before, theory space is vast. A successful venture into theory space needs a reliable guiding principle. Without any experimental result pointing out the direction we need to take, the selection of such a guiding principle for the formulation of a quantum theory of gravity is difficult.

Some people believe that quantum gravity is the domain of the Planck scale, where quantum and gravitational effects become comparable. Probing that scale requires extremely high (experimentally unattainable) energy densities. The idea also assumes that such high energy densities allow things like black holes and wormholes to pop in and out of existence. That is, however, an unscientific notion. Things don’t just pop in and out of existence, least of all black holes, regardless of the energy density.

Moreover, there are no such things as wormholes. I don’t care that Einstein thought they may exist. The idea represents one of those cases where the mathematics is overextended to produce a spurious solution that, although allowed mathematically, has no physical meaning. So they cannot pop in and out of existence anyway.

Hence, it is unlikely that there is anything interesting happening at the energy scale represented by the Planck scale, or, more accurately, the hypothetical Planck scale. Therefore, I would not recommend any statement about what happens at this hypothetical Planck scale as a reliable guiding principle for quantum gravity.

As a more reliable guiding principle, we need to address the question: what happens to the gravitational field produced by a quantum state? What I mean by a quantum state is a state of matter in which quantum effects are manifest. An example of such a quantum effect is entanglement. So the question in this case is: does the gravitational field become entangled with the quantum state, or is the gravitational field uniquely produced by some combination of the elements in the superposition that represents the entangled state?

We can address the question with our current theory of general relativity. In Einstein’s field equation, the Einstein tensor, which encodes the curvature of spacetime, is equated to the stress-energy tensor of the matter distribution. In the context of quantum theory, the latter becomes an observable: an operator that can be traced with the quantum state to produce the observed stress-energy tensor of that state. Obviously, the observed stress-energy tensor no longer represents the entanglement. Therefore, the curvature of spacetime produced by such an entangled state is determined by a combination of the elements in the superposition and does not become entangled with the state.
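To make this concrete, here is a minimal sketch of the picture I have in mind, with the density operator of the quantum state written as $\hat{\rho}$ (this is just the standard semiclassical Einstein equation):

$$ G_{\mu\nu} = \frac{8\pi G}{c^{4}}\,\langle \hat{T}_{\mu\nu} \rangle, \qquad \langle \hat{T}_{\mu\nu} \rangle = \mathrm{Tr}\!\left(\hat{\rho}\,\hat{T}_{\mu\nu}\right). $$

The trace averages over all the elements of the superposition, so the source of the curvature is a single classical tensor, which is exactly the point made above.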

What does this say about the guiding principle for quantum gravity? What it seems to say is that there is no need for quantum gravity. The spacetime that we live in is a background in which the intricacies of quantum physics play out without becoming involved. The only effect that the quantum state of matter has on the gravitational field is through a unique stress-energy distribution for the entire state.

This conclusion is based on the assumption that Einstein’s field equation is valid at the small scales of quantum physics. It has been tested at larger scales, and so far no deviations have been found. Without any observed deviations, there is no strong motivation for expecting that it would not be valid at the scales of quantum physics.

However, there is one aspect that Einstein’s field equation does not explain. It shows the connection between the curvature of spacetime and the distribution of matter, but it does not explain how mass-energy curves spacetime. It does not give a mechanism for this process. Such a mechanism may be hiding in the quantum description of matter. If such a mechanism can be uncovered, it would lead to a more comprehensive theory that would “explain” Einstein’s field equation.

The search for this mechanism may be somewhat different from a search for a theory of quantum gravity. However, it can be seen as a more focussed attempt at formulating a theory of quantum gravity. To find this mechanism, we can perhaps focus on fermions. I think there are still some mysteries associated with fermions that need to be uncovered. Perhaps that can lead us to an understanding of the mechanism by which mass-energy curves spacetime.

Just delete “vacuum fluctuations”

How do you build a tower? One layer of bricks at a time. But before you lay down the next layer of bricks, you need to make sure the current layer has been laid down properly. Otherwise, the whole thing may come tumbling down.

The same is true in physics. Before you base your ideas on previous ideas, you need to check that those previous ideas are correct. Otherwise, you would be misleading yourself and others, and the new theories may not be able to make successful predictions.

Physics is a science, which means that we should only trust previous ideas after they have been tested through comparison with physical observations. Unfortunately, there are some ideas that cannot be checked so easily. Obviously, one should then be very careful about basing new ideas on such unchecked ideas. Some people blame the current lack of progress in fundamental physics on this problem. They say we need to go back and check whether we have not made a mistake somewhere. I think I know where this problem is.

Over the centuries of physics research, many tools have been developed to aid the formulation of theories. These tools include things like differential calculus in terms of which equations of motion can be formulated, and Hamiltonians and Lagrangians, to name a few.

Now, I see that some people claim that most of these tools won’t work for the formulation of a fundamental theory that combines gravity with quantum theory. It is stated that a minimum measurement uncertainty, imposed by the Planck scale, would render the formulation of equations of motion and Lagrangians at this scale impossible. Why is that? Well, it is claimed that the uncertainty at such small distance scales is large enough to allow tiny black holes to pop in and out of existence, creating havoc with spacetime at such small scales. This argument is the reason why people consider the Planck scale to be a fundamental scale beneath which our traditional notions of physics and spacetime break down.
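For reference, the scale being invoked here is set by the Planck length and the Planck energy:

$$ \ell_{P} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times 10^{-35}\ \text{m}, \qquad E_{P} = \sqrt{\frac{\hbar c^{5}}{G}} \approx 1.2\times 10^{19}\ \text{GeV}. $$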

But why would uncertainty lead to black holes popping in and out of existence? It comes from an unchecked idea based on the Heisenberg uncertainty principle: the claim that it allows particles to pop in and out of existence, and that such particles can have larger energies when the time of their existence is short enough. This hypothetical process is generally referred to as “vacuum fluctuations.” However, there does not exist any conclusive experimental confirmation of the process of vacuum fluctuations. Therefore, any idea based on vacuum fluctuations is an idea based on an unchecked idea.
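The heuristic that is usually cited runs as follows (I state it here only to identify what is being assumed, not to endorse it): an energy-time relation

$$ \Delta E\,\Delta t \gtrsim \frac{\hbar}{2} $$

is read as a license to “borrow” an energy $\Delta E$ for a time $\Delta t \sim \hbar/(2\Delta E)$, and a borrowed Planck-scale energy packed into a Planck-length region would then form a tiny black hole.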

Previously, I have explained that the Heisenberg uncertainty principle is not a fundamental principle of quantum physics, but instead comes from Fourier theory. As such, the uncertainty principle represents a prohibition and not a license. It imposes restrictions on what can exist. Instead, people somehow decided that it allows things to exist in violation of other principles such as energy conservation. This is an erroneous notion with no experimental confirmation.
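The Fourier-theoretic statement is a theorem about any pair of conjugate variables. For a function of position $x$ and its Fourier transform in the wave number $k$,

$$ \Delta x\,\Delta k \ge \frac{1}{2}, $$

and substituting the quantum relation $p = \hbar k$ turns this into the familiar $\Delta x\,\Delta p \ge \hbar/2$. Nothing in the derivation creates anything; it only bounds how narrow the two distributions can simultaneously be.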

Hence, the vacuum does not fluctuate! There are no particles popping in and out of existence in the vacuum. There is nothing in our understanding of the physical world that has been experimentally confirmed which needs the concept of vacuum fluctuations.

Now, if we get rid of this notion of vacuum fluctuations, several issues in fundamental physics will simply disappear. For example, the black hole information paradox. A key ingredient of this paradox is the idea that black holes will evaporate due to Hawking radiation. The notion of Hawking radiation is another unchecked idea, which is based on …? You guessed it: vacuum fluctuations! So if we just get rid of this silly notion of vacuum fluctuations, the black hole information paradox will evaporate, instead of the black holes.

Guiding principles I: substructure

Usually the principles of physics are derived from successful scientific theories. For instance, Lorentz invariance, which can be seen as the underlying principle on which special relativity is based, was originally derived from Maxwell’s equations. As we learn more about the universe and how it works, we discover more principles. These principles serve to constrain any new theories that we try to formulate to describe that which we don’t understand yet.

It turns out that the physics principles we have uncovered so far don’t seem to constrain theories enough. There are still vastly different ways to formulate new theories. So we need to do something that is very dangerous. We need to guess some additional physics principles that would guide us in the formulation of such new theories. Chances are that any random guess would send us down a random path in theory space with very little chance of being the right thing. An example is string theory, where the random guess was that the fundamental objects are strings. It has kept a vast number of researchers busy for decades without success.

Instead of making a random guess, we can try to see if our existing theories don’t perhaps already give us some additional hints at what such a guiding principle should be. So, I’ll share my thoughts on this for what it is worth. I’ll start with what our current theories tell us about substructure.

The notion of a substructure can already be identified in the work of Huygens, Fresnel, and others on interference, which revealed that light is a wave. The physical quantity that is observed is the intensity, which is always positive. However, we need to break the intensity apart into amplitudes that can have negative values to allow destructive interference. In this very simple sense, the amplitude (which is often modeled as a complex-valued function) serves as a substructure for that which is observed.
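The standard two-beam example makes the point:

$$ I = |A_{1}+A_{2}|^{2} = |A_{1}|^{2} + |A_{2}|^{2} + 2\,\mathrm{Re}\!\left(A_{1}^{*}A_{2}\right), $$

where the cross term can be negative even though $I$ itself never is; the cancellation lives entirely in the amplitude substructure.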


It is not a big leap from interference in classical light to interference in quantum systems. Here the observation is interpreted as a probability, which is also a positive quantity. In quantum mechanics, the notion of a probability is given a substructure in the form of a probability amplitude, which can be negative (or complex) to allow interference phenomena.

The concept of a substructure is today perhaps mostly associated with the notion of constituent particles. We know now that the proton is not a fundamental particle, but that it has a substructure consisting of fundamental particles called quarks, bound together via the strong force. Although it is not currently considered to be the case, these quarks may also have some substructure. However, the concept of this substructure may be different from the way it appears in protons.

A new idea that is emerging is that spacetime itself may have a substructure. Ever since the advent of general relativity, we know that spacetime is affected by gravity. In our current formulation of particle physics, spacetime is the backdrop on which all the particle fields perform their dance. But when gravity is added, spacetime joins the dance. This makes the formulation of fundamental theories very complicated: the difference between the particles and spacetime becomes blurred. It leads to the idea that spacetime itself may have a substructure, and in this way it combines the two different ways to look at substructure. On the one hand, spacetime may be divided into two parts, perhaps to separate chirality, much in the way intensity separates into an amplitude and its complex conjugate. On the other hand, the separation of spacetime may give some substructure to the particle fields, which would then be described in terms of fluctuations in spacetime’s substructure.

Caution is necessary here. Even if these ideas turn out to be valid, they still leave much detail unspecified. It may not be enough to regard the idea of substructure as a physics principle. The important thing is to keep to the standard practice in physics: mathematics is merely used to formulate and model the physical universe. It does not tell us something new about the universe unless this is somehow already logically encoded in what we start off with.

Perhaps an example would help to explain what I mean. Einstein formulated general relativity (GR) after he figured out the equivalence principle. So everything that we can learn from GR follows as an inevitable logical consequence of this principle. It tells us that the mass-energy distribution curves spacetime, but it does not tell us how this happens. In other words, the mechanism by which mass curves spacetime is not known, because it is not a logical consequence of the equivalence principle.

So, the idea is to come up with a general mathematical formalism that is powerful enough to model this kind of scenario without trying to dictate the physics. Remember, quantum field theory is a formalism in terms of which different models for the dynamics in particle physics can be formulated. It does not dictate the dynamics but allows anything to be modeled. Another example is differential geometry, which allows the formulation of GR but does not dictate it. Part of the reason why string theory fails is that it is a mathematical formulation that also dictates the dynamics. The formulation of a quantum theory for gravity requires a flexible formalism that does not dictate the dynamics.

Those quirky fermions

All of the matter in the universe is made of fermions. They are for this reason one of the most abundant things in the universe. Fermions have been the topic of investigation for a long time. We have learned much about them. However, what we do know about them is encapsulated in the formalisms with which we deal with them in our theories. Does that mean we understand them?

Let’s think about the way we treat fermions in our theories. Basically, we represent them in terms of creation and annihilation operators, which are used to formulate the interactions in which they take part. These operators are distinguished from those for bosons by the anti-commutation relations that they obey.
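For reference, the defining relations are

$$ \{\hat{a}_{i},\hat{a}_{j}^{\dagger}\} = \delta_{ij}, \qquad \{\hat{a}_{i},\hat{a}_{j}\} = \{\hat{a}_{i}^{\dagger},\hat{a}_{j}^{\dagger}\} = 0, $$

in contrast to the commutation relations $[\hat{a}_{i},\hat{a}_{j}^{\dagger}] = \delta_{ij}$ obeyed by bosonic operators.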

To the uninitiated, all this must sound like a bunch of gobbledygook. What are the physical manifestations of all these operators? There are none! These operators are just mathematical entities in the formalism of our theories. Although these theories are quite successful, they do not reveal the physical machinery at work on the inside. Or do they?

Although a creation operator does not by itself represent any physical process, it distinguishes different scenarios with different arrangements of fermions. Starting with a given scenario, I can apply a fermion creation operator to introduce a new scenario that contains one additional fermion. If I then apply a creation operator again, provided that I am not trying to add another fermion with the same degrees of freedom, it produces yet another new scenario.

Here is the strange thing. If I change the order in which I add the two additional fermions, I get a scenario that differs from the one with the previous order. I can contrast this with the situation for bosons. Provided that I don’t try to add bosons with the same degrees of freedom, the order in which I add them doesn’t matter. What this tells us is that bosons with different degrees of freedom don’t affect each other. (We need to be careful about the concepts of time-like or space-like separation, but for the sake of this argument, we’ll assume all bosons or fermions are space-like separated.)
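A minimal numerical sketch of this sign flip, using a Jordan-Wigner matrix representation of two fermionic modes (the choice of representation is mine, purely for illustration; nothing in the argument depends on it):

```python
import numpy as np

# Single-mode operators in the occupation basis {|0>, |1>}.
create = np.array([[0., 0.],
                   [1., 0.]])   # a^dagger: maps |0> to |1>
Z = np.diag([1., -1.])          # parity (the Jordan-Wigner string)
I2 = np.eye(2)

# Two-mode creation operators built with the Jordan-Wigner construction.
adag1 = np.kron(create, I2)
adag2 = np.kron(Z, create)

vac = np.zeros(4)
vac[0] = 1.0                    # the vacuum state |00>

# Create the two fermions in opposite orders.
state_12 = adag2 @ adag1 @ vac  # first mode 1, then mode 2
state_21 = adag1 @ adag2 @ vac  # first mode 2, then mode 1

print(state_12)                          # [ 0.  0.  0. -1.]
print(state_21)                          # [ 0.  0.  0.  1.]
print(np.allclose(state_12, -state_21))  # True: the overall sign flips
```

Both orders produce the same two-fermion occupation, but the overall sign of the state depends on the order, which is the behavior described above.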

The fact that the order in which we place fermions in our scenario (even when they are space-like separated) makes a difference tells us something physical about fermions. They must be global entities. The entire universe seems to “know” about the existence of each and every fermion in it.

How can that be possible? I can think of one way: topological defects. This is not a new idea. It pops up quite often in various fields of physics.


Why would a topological defect explain the apparent global nature of fermions? It is because all kinds of topological defects can be identified with the aid of an integral that computes the winding number of the defect. This type of integral is evaluated over a (hyper)surface that encloses the topological defect. In other words, the field values far away from the defect are included in the integral, and not the field value at the defect itself. Therefore, knowledge about the defect is encoded in the entire field. This suggests that fermions can behave as global entities if they are topological defects. This is just a hypothesis. It needs more careful investigation.
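The simplest example is a point vortex in a two-dimensional phase field $\varphi$:

$$ w = \frac{1}{2\pi}\oint_{C} \nabla\varphi\cdot d\boldsymbol{\ell} \;\in\; \mathbb{Z}, $$

where $C$ is any closed contour enclosing the defect. The integer $w$ is fixed by field values arbitrarily far from the core, which is what makes the defect a global property of the field.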

Why not an uncertainty principle?

It may seem strange that, among the fundamental physical principles for quantum physics that I have proposed recently, there is none associated with uncertainty. What would be the reason? Since we refer to it as the Heisenberg uncertainty principle, it should qualify as one of the fundamental principles of quantum physics, right? Well, that is just it. Although it qualifies as being a principle, it is not fundamental.

It may help to consider how these concepts developed. At first, quantum mechanics was introduced as an improvement of classical mechanics. Therefore, quantities like position and momentum played an important role.

A prime example of a system in classical mechanics is the harmonic oscillator. Think of a metal ball hanging from a spring. When it is pulled down and let go, the ball starts oscillating, moving periodically up and down. This behavior is classically described in terms of a Hamiltonian that contains the position and momentum of the metal ball.

In the quantum version of this system, the momentum is replaced by the wave vector times the reduced Planck constant. But position and the wave vector are conjugate Fourier variables. That is the origin of the uncertainty. Moreover, it also leads to non-commutation when position and momentum are represented as operators. Suitably scaled sums and differences of these two operators behave as lowering and raising operators for quanta of energy in the system: the one reduces the energy in discrete steps and the other increases it in discrete steps.
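Concretely, with $m$ the mass and $\omega$ the oscillation frequency:

$$ \hat{p} = \hbar\hat{k}, \qquad [\hat{x},\hat{p}] = i\hbar, \qquad \hat{a} = \sqrt{\frac{m\omega}{2\hbar}}\left(\hat{x} + \frac{i}{m\omega}\,\hat{p}\right), $$

where $\hat{a}$ and its Hermitian adjoint $\hat{a}^{\dagger}$ lower and raise the energy in steps of $\hbar\omega$.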

It was then found that quantum mechanics can also be used to improve classical field theory. But there are several differences. Oscillations in fields are not represented as displacements in position that are exchanged for momentum. Instead, their oscillations manifest in terms of the field strength. So, to develop a quantum theory of fields, one starts with the lowering and raising operators, which are now called annihilation and creation operators, or ladder operators. Their sum and difference produce a pair of operators that are analogous to the position and momentum operators of the harmonic oscillator. In this context, these are called quadrature operators. They display the same qualitative behavior as the position and momentum operators: they represent conjugate Fourier variables and therefore again produce an uncertainty relation and non-commutation. The full development of quantum field theory is far more involved than what I describe here, but I am only focusing on the origin of the uncertainty in this context.
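In one common convention (normalizations vary):

$$ \hat{X} = \frac{\hat{a}+\hat{a}^{\dagger}}{2}, \qquad \hat{Y} = \frac{\hat{a}-\hat{a}^{\dagger}}{2i}, \qquad [\hat{X},\hat{Y}] = \frac{i}{2}, \qquad \Delta X\,\Delta Y \ge \frac{1}{4}. $$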

So, in summary, uncertainty emerges as an inevitable consequence of the Fourier relationship between conjugate variables. In the case of mechanical systems, these conjugate variables come about because of the quantum relationship between momentum and wave vector. In the case of fields, they come from the ladder operators, leading to properties analogous to those found in the formal description of the harmonic oscillator. Hence, uncertainty is not a fundamental property of quantum physics.