Naturalness

One of the main objectives of the Large Hadron Collider (LHC) was to solve the problem of naturalness. More precisely, the standard model contains a scalar field, the Higgs field, that has no mechanism to stabilize its mass. Radiative corrections are expected to drive the mass all the way up to the cut-off scale, which is assumed to be the Planck scale. If the Higgs boson has a finite mass far below the Planck scale (as was found to be the case), then there must apparently be some severe fine-tuning producing cancellations among the different orders of the radiative corrections. Such a situation is considered unnatural. Hence, the concept of naturalness.
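To make this concrete, a schematic form of the dominant one-loop correction is often quoted (from the top-quark loop; the precise coefficient depends on how the calculation is regularized):

\[
\delta m_H^2 \;\approx\; -\frac{3 y_t^2}{8\pi^2}\,\Lambda^2 .
\]

With the cut-off \(\Lambda\) at the Planck scale (\(\sim 10^{19}\) GeV), the bare mass and this correction would have to cancel to roughly one part in \(10^{34}\) to leave a physical Higgs mass of about 125 GeV.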

It was believed, with a measure of certainty, that the LHC would give answers to the problem of naturalness, telling us how nature maintains the mass of the Higgs far below the cut-off scale. (I also held to such a conviction, as I recently discovered reading some old comments I made on my blog.)

Now that the LHC has completed its second run, the hope that it would provide answers to the naturalness problem has been met with disappointment (to put it mildly). What are we to conclude from this? There are those who say that the lack of naturalness in the standard model is not a problem. It is just the way it is. The requirement of naturalness, they argue, is an unjustified appeal to beauty.

No, no, no, it has nothing to do with beauty. At best, beauty is just a guide that people sometimes use to select the best option among a plethora of options. It falls in the same category as Occam’s razor.

On the other hand, naturalness is associated more with the understanding of scale physics. The way scales govern the laws of nature is more than just an appeal to beauty. It provides us with a means to guess what the dominant behavior of a phenomenon would be like, even when we don’t have an understanding of the exact details. As a result, when we see a mechanism that deviates from our understanding of scale physics, it gives a strong hint that there are some underlying mechanisms that we have not yet uncovered.

For example, in the standard model, the masses of the elementary particles range over several orders of magnitude. We cannot predict these mass values. They are dimensionful parameters that we have to measure. There is no fundamental scale close to these masses that could indicate where they come from. Our understanding of scale physics tells us that there must be some mechanism that gives rise to these masses. To say that they are produced by the Yukawa couplings to the Higgs field does not provide the required understanding. It merely replaces one mystery with another. Why would such Yukawa couplings vary over several orders of magnitude? Where do they come from?
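For the record, the relation in question is simply

\[
m_f = \frac{y_f\, v}{\sqrt{2}}, \qquad v \approx 246~\text{GeV},
\]

so the measured masses translate directly into Yukawa couplings ranging from roughly \(3\times 10^{-6}\) for the electron to about 1 for the top quark, with no explanation for this enormous spread.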

So the naturalness problem, which is part of a bigger mystery related to the mass scales in the standard model, still remains. The LHC does not seem to be able to give us any hints to solve this mystery. Perhaps another larger collider will.

Mopping up

The particle physics impasse prevails. That is my impression, judging from the battles raging on the blogs.

Among these, I recently saw an interesting comment by Terry Bollinger on a blog post by Sabine Hossenfelder. According to Terry, the particle physics research effort already lost track (missed the right turnoff) in the 70’s. This opinion agrees with the apparent slowdown in progress since then. Apart from the fact that neutrinos have mass, we have not learned much more about fundamental physics since the advent of the standard model in the 70’s.

However, some may argue that the problem started even earlier, perhaps just after the Second World War, because that was when the world woke up to the importance of fundamental physics. That was the point where vanity replaced curiosity as the driving force behind research. The result was an increase in weird science – crazy predictions more intent on drawing attention than on increasing understanding.

Be that as it may. (I’ve written about that in my book.) The question is, what to do about it? There are some concepts in fundamental physics that are taken for granted, yet have never been established as scientific fact through a proper scientific process. One such concept, pointed out by Terry, is the behaviour of spacetime at the Planck scale.

Today the Planck scale is referred to as if it were established scientific fact, whereas it is in fact a hypothetical scale. The physical existence of the Planck scale has not been confirmed through scientific experiments and probably cannot be, at least not with our current capabilities. Chances are it does not exist.
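For reference, the Planck scale is obtained purely by dimensional analysis, by combining the fundamental constants:

\[
E_P = \sqrt{\frac{\hbar c^5}{G}} \approx 1.2\times 10^{19}~\text{GeV},
\qquad
\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times 10^{-35}~\text{m}.
\]

Nothing in this construction guarantees that anything physically special actually happens at that scale.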

The existence of the Planck scale is based on some other concepts that are also not scientific facts. One is the notion of vacuum fluctuations, a concept that is often invoked to come up with exotic predictions. What about the vacuum is fluctuating? It follows from a very simple calculation that the particle number of the vacuum state is exactly zero with zero uncertainty. So it seems that the notion of vacuum fluctuations is not as well understood as is generally believed.
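The calculation referred to here is indeed short. For a mode with annihilation operator \(\hat a\), the number operator is \(\hat N = \hat a^\dagger \hat a\), and \(\hat a|0\rangle = 0\), so

\[
\langle 0|\hat N|0\rangle = 0, \qquad
\langle 0|\hat N^2|0\rangle = 0
\quad\Rightarrow\quad
\Delta N = \sqrt{\langle \hat N^2\rangle - \langle \hat N\rangle^2} = 0 .
\]

The vacuum is an eigenstate of the particle number, so whatever is meant by vacuum fluctuations, it is not a fluctuation in particle content.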

Does it mean that we are doomed to wander around in a state of confusion? No, we just need to return to the basic principles of the scientific method.

So I propose a mopping-up exercise. We need to go back to what we understand according to the scientific method and then test those parts that we are not sure about using scientific experiments and observations. Those aspects that are not testable in a scientific manner need to be treated on a different level.

For instance, the so-called measurement problem involves aspects that are in principle not testable. As such, they belong to the domain of philosophy and should not be incorporated into our scientific understanding. There are things we can never know in a scientific manner and it is pointless to make them prerequisites for progress in our understanding of the physical world.

A brave new quantum world

They say one votes through one’s actions. Where you spend your money is where you cast your vote. For example, if you want to support the recycling effort, then you would buy only products that somehow support that effort.

Is it possible that someone may in this way cast a vote in favour of some human endeavour while not actually supporting that endeavour? Yes, that is often the case (I think) when it comes to earning your money. People find themselves in work situations where they effectively support the activities of the organization they work for, even though in their hearts they are not really in favour of those activities.

For a long while I was under the impression that this situation applies to the current so-called quantum revolution. There are many people doing research in this field, but I was doubtful whether they all really believe that this “revolution” is a real thing. There is a large amount of hype surrounding the expectations for these technologies. Most people working in this field must be aware that not all the promises are realistic.

Perhaps part of this notion was based on my own skepticism about the field. I was thinking that most, if not all, of these new quantum technologies are just activities pursued for the sake of research funding and ego trips.

Well, now I’m starting to form a different picture of the situation. The difference comes from seeing the efforts made by commercial companies. The way such companies approach the challenges is very different from the way academic researchers do. While the academic approach often aims purely for the wow-factor of what is being achieved, industry must do it in a sustainable way. When a company markets a product, it must work according to specification and keep on working for a reasonable time after being sold. As a result, industry is far more serious when it comes to the design and implementation of these systems than any academic researcher would ever be.

So it was when I saw the seriousness with which a commercial enterprise is addressing the challenges of quantum computing that I realized this is not going to be something that will just blow over after some time. We are very likely to see a world where quantum computers enter our lives in the not too distant future. Yes, there are still challenges, but they are not insurmountable. What the specific details of the technology are going to be I cannot tell you, but I can see that the way quantum computing is being addressed gives it a very high chance of success.

Are you ready for that? How is it going to change our lives? That remains to be seen.

Physics vs formalism

This is something I just have to get off my chest. It’s been bugging me for a while now.

Physics is the endeavour to understand the physical world. Mathematics is a powerful tool employed in this endeavour. It often happens that specific mathematical procedures are developed for specific scenarios found in physics. These developments then often lead to dedicated mathematical methods, even special notations, that we call formalisms.

The idea of a formalism is that it makes it easier for us to investigate physical phenomena belonging to a specific field. An example is quantum mechanics. The basic formalism was developed almost a hundred years ago. Since then, many people have investigated various sophisticated aspects of this formalism and placed it on a firm foundation. Books are dedicated to it, and university courses are designed to teach students all the intricate details.

One can think of it almost like a kitchen appliance with a place to put in some ingredients, a handle to crank, and a slot at the bottom where the finished product will emerge once the process is completed. Beautiful!

So does this mean that we don’t need to understand what we are doing anymore? We simply need to put the initial conditions into the appropriate slot, the appropriate Hamiltonian into its special slot and crank away. The output should then be guaranteed to be the answer that we are looking for.

Well, it is like the old saying: garbage in, garbage out. If you don’t know what you are doing, you may be putting the wrong things in. The result would be a mess from which one cannot learn anything.

Actually, the situation is even more serious than this. For all the effort that has gone into developing the formalism (and I’m not only talking about quantum mechanics), it remains a human construct of what is happening in the real physical world. It inevitably still contains certain prejudices, left over as a legacy of the perspectives of the people that initially came up with it.

Take the example of quantum mechanics again. It is largely based on an operator called the Hamiltonian. As such, it displays a particular prejudice. It is manifestly non-relativistic. Moreover, it assumes that we know the initial state at a given time, for all space. We then use the Hamiltonian approach to evolve the state in time to see what one would get at some later point in time. But what if we know the initial state for all time, but not for all space, and we want to know what the state looks like in other regions of space? An example of such a situation is found in the propagation of a quantum state through a random medium.
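To illustrate the contrast schematically (the notation here is mine and is only meant as a sketch): the Hamiltonian formalism evolves a state specified on an initial time slice,

\[
i\hbar\,\partial_t |\psi(t)\rangle = \hat H\,|\psi(t)\rangle ,
\]

whereas propagation through a medium is more naturally described by evolving the field along the propagation direction, as in a paraxial-type equation

\[
\partial_z \psi(\mathbf{x}_\perp, z) = \frac{i}{2k}\nabla_\perp^2 \psi + i k\,\delta n(\mathbf{x}_\perp, z)\,\psi ,
\]

where the “initial condition” is the field on a transverse plane and \(\delta n\) represents the refractive-index fluctuations of the random medium.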

Those who are dead set on the standard formal quantum mechanics procedure will try to convince you that the Hamiltonian formalism still gives the right answer. Perhaps, in special cases, one can use some fancy manipulations of the input state to make the Hamiltonian approach work for this problem. However, even in such cases, the process becomes awkward and far from efficient, and the result would be difficult to interpret. But why would you want to do it this way in the first place? Is it so important that we always use the established formalism?

Perhaps you think we have no choice, but that is not true. We understand enough of the fundamental physics to come up with an efficient mathematical model for the problem, even though the result would not be recognizable as the standard formalism. Did we become so lazy in our thoughts that we don’t want to employ our understanding of the fundamental physics anymore? Or did we lose our understanding of the basics to the point that we cannot do calculations unless we use the established formalism?

What would you rather sacrifice: the precise physical understanding or the established mathematical formalism? If you choose to sacrifice the former rather than the latter, then you are not a physicist; you are a formalist! In physics, the physical understanding should always be paramount! The formalism is merely a tool with which we strive to increase our understanding. If the formalism is not appropriate for the problem, or does not present us with the most efficient way to do the computation, then by all means cast it aside without a second thought.

Focus on the physics, not on the formalism! There I’ve said it.

Art in research

Does it help to apply some form of creativity in scientific research? Stated differently, does creativity have any role to play in scientific research? I would like to think so.

At first one may think that creativity is only associated with the act of conjuring up things that don’t really exist. A painter paints a landscape scene and applies creativity to render the trees and the clouds in interesting ways. As such, they are different from the trees and clouds in the real scene. Insofar as the artist employs creativity, the result becomes different from reality.

If this is what creativity produces, then it would have no place in scientific research, because in this context we are not interested in anything that deviates from reality. But creativity is not only about representing that which doesn’t exist. It can also be associated with a much more abstract activity.

When a theoretical researcher tries to come up with a model that describes an aspect of physical reality, he or she needs to create something that has not existed before. It is not initially known whether this model gives the correct description of reality. In that sense, one does not know whether it represents anything real. One will only know that after the model has been tested. But before that step can be taken, one needs to create the model. For this first step, the researcher is required to employ creativity.

The act of creating such a model is an act of bringing into existence something that has not existed before. The inspiration for the model may come from other similar models or from models in unrelated fields of study, in the same way that artists get inspiration from the works of other artists. Regardless of the source of inspiration, the resulting model is novel in one way or another. That is where the creativity lies.

So, art and science are not that different after all. Both require the same mental faculties. Perhaps they just call it by different names.

Particle physics impasse

Physics is the study of the physical universe. As a science, it involves a process consisting of two components. The theoretical component strives to construct theoretical models for the physical phenomena that we observe. The experimental component tests these theoretical models and explores the physical world for more information about phenomena.

Progress in physics is enhanced when many physicists using different approaches tackle the same problem. The diversity in the nature of the problems needs to be confronted with a diversity of perspectives. This diversity is reflected in the literature. The same physical phenomenon is often studied through different approaches, using different mathematical formulations. Some of them may turn out to produce the same results, but some may differ in their predictions. Experimental work can then be used to make a selection among them.

That is all fine and dandy for physics in general, but the situation is a bit more complicated for particle physics. Perhaps one can trace the reason for all these complications to the fact that particle physics is running out of observable energy space.

What do I mean by that? Progress in particle physics is (to some extent at least) measured by our understanding of the fundamental mechanisms of nature at progressively higher energy scales. Today, we understand these fundamental mechanisms fairly well up to the electroweak scale (at about 200 GeV). They are described by the Standard Model, which was established during the 1970’s. So, for the past four decades, particle physicists have tried to extend that understanding beyond the electroweak scale. Various theoretical ideas were proposed, prominent among them the idea of supersymmetry. Then a big experiment, the Large Hadron Collider (LHC), was constructed to test these ideas above the electroweak scale. It discovered the Higgs boson, which was the last remaining particle predicted by the standard model. But no supersymmetry. In fact, none of the other ideas panned out at all. So there is a serious back-to-the-drawing-board situation going on in particle physics.

The problem is, the LHC did not discover anything else that could give a hint at what is going on up there. Or did it? There will be another run to accumulate more data. The data still needs to be analyzed. Perhaps something can still emerge. Who knows? However, even if some new particle is lurking in the data, it will be difficult to see. Such particles tend to be more unstable at those higher energies, leading to very broad peaks. To make things worse, there is much more background noise. This makes it difficult, even unlikely, that such particles can be identified at these higher energies. At some point, no experiment would be able to observe such particles anymore.
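One way to see why broad peaks are a problem is the Breit-Wigner shape of a resonance (written here schematically): near a resonance of mass \(M\) and width \(\Gamma\), the cross section behaves like

\[
\sigma(s) \;\propto\; \frac{1}{(s - M^2)^2 + M^2\Gamma^2},
\]

so as \(\Gamma\) grows the peak becomes both lower and broader, and increasingly hard to distinguish from the smooth background.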

The interesting thing about the situation is the backlash that one reads about in the media. The particle physicists are arguing among themselves about the reason for the current situation and what the way forward should be. There are those who say that the proposed models were all a bunch of harebrained ideas that were then hyped, and that we should not build any new colliders until we have done some proper theoretical work first.

See, the problem with building new colliders is the cost involved. It is not like other fields of physics where a local funding organization can support several experimental groups. These colliders require several countries to pitch in to cover the cost. (OK, particle physics is not the only field with such big-ticket experiments.)

The combined effect of the unlikelihood of observing new particles at higher energies and the cost of building new colliders at higher energies creates an impasse in particle physics. Although theorists may come up with marvelous new theories for the mechanisms above the electroweak scale, it may be impossible to see whether these theories are correct. Perhaps the last energy scale below which we can understand the fundamental mechanisms in a scientific manner will turn out to be the electroweak scale.

Glad I did not stay in particle physics.

How far away is that star?

On a clear night, far away from the city lights, one can look up and enjoy the beauty of the starry sky. This display must have enticed people for as long as people have existed, and I’m sure the question has often come up: how far away are those stars?

Well, there is an interesting tale of discovery related to the progression of measuring sticks that give the ability to determine the distances to astronomical objects. Part of this tale is how Edwin Hubble discovered that the universe is expanding.

The realization that we live in an expanding universe complicates the answer to the question of how far away astronomical objects are. Apart from the fact that the distances change, there is also the issue of what distance we observe at a given point in time. If I use the apparent brightness of a star with a known absolute brightness, then one may think (at least I would have) that the implied distance is between us (the earth) and the location of the star at the time the light was emitted. This is not the case.

Diagram of light from a star or galaxy propagating to be observed on earth

The above diagram tries to explain what happens. The black dots represent a star or galaxy (the source of the light) at different locations in an expanding universe. The blue dot is the earth, which is kept at a fixed location in the expanding universe. The red circles represent the expanding sphere of light after being emitted by the source at some point in the past. Assuming that the universe expands uniformly, the source always remains at the center of the expanding sphere. Moreover, since the observed apparent brightness is given (apart from the additional dimming caused by the redshift of the light) by the total emitted power divided by the total surface area of the sphere at the moment of observation, the associated distance is the distance from the earth to the current location of the source. This is called the proper distance to the source.
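Stated as a formula (ignoring, for the sake of the geometric argument, the redshift corrections mentioned above), the observed flux is

\[
F = \frac{L}{4\pi d_p^2},
\]

where \(L\) is the emitted power and \(d_p\) is the proper distance to the source at the time of observation. The full cosmological luminosity distance adds redshift factors to this, but the geometric dilution is indeed governed by where the source is now.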

Amazing, we are able to know the distance to an object at its current location even if we cannot see that object now. Who knew?