COVID-19 numbers

In a time like this, there is much uncertainty. It may be the first time the world is experiencing a pandemic on this scale, but similar situations have been encountered before. People have lots of questions and they are looking for information. Unfortunately, much of the available information is misleading, and it seems that most people don’t check it; they simply believe what they read. It is therefore all the more tragic when “official” websites provide misleading information.

One example is the incorrect representation of numerical quantities related to the COVID-19 pandemic. Two such quantities that are, for obvious reasons, quite important to people are the recovery rate and the mortality rate. They give an estimate, based on the currently available statistics, of the chances that a person will recover from or die of COVID-19, given that the person has contracted the disease.

The statistics, which are generally available (see for example Wikipedia), consist of three numbers provided for every country on a daily basis. These numbers are: the number of confirmed cases (CC), the number of deaths (D) and the number of recoveries (R). For example, on 14 June 2020, the quoted numbers for the whole world were:

CC = 7 763 921
D = 429 632
R = 3 682 950

From these numbers, one can compute the mortality rate and the recovery rate. The mistake that one often finds is that these rates are computed by dividing D or R by CC. That gives a misleading result, because CC also contains the currently active cases, which do not form part of D or R yet.

The correct way to compute the mortality rate is to divide D by the sum of D and R and multiply the result by 100 to express it as a percentage. In a similar way, the recovery rate is obtained by dividing R by the sum of D and R and then multiplying it by 100. You will note that when you add up the mortality rate and the recovery rate you get 100%. That makes sense, because one either dies or recovers; there is no other option.
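To make the difference concrete, here is a minimal Python sketch of both calculations, using the world totals quoted above (the variable names are mine; the naive rate is included only to show how misleading it is):

```python
# Worldwide COVID-19 statistics quoted above (14 June 2020)
cc = 7_763_921  # confirmed cases (CC)
d = 429_632     # deaths (D)
r = 3_682_950   # recoveries (R)

# Misleading calculation: dividing by CC includes the active cases,
# whose outcomes are not yet known.
naive_mortality = 100 * d / cc

# Correct calculation: only resolved cases (deaths + recoveries)
# have a known outcome.
mortality_rate = 100 * d / (d + r)
recovery_rate = 100 * r / (d + r)

print(f"Naive mortality rate: {naive_mortality:.1f}%")                 # about 5.5%
print(f"Mortality rate:       {mortality_rate:.1f}%")                  # about 10.4%
print(f"Recovery rate:        {recovery_rate:.1f}%")                   # about 89.6%
print(f"Sum of rates:         {mortality_rate + recovery_rate:.0f}%")  # 100%
```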

Applying these calculations to the above statistics for the world, we find the mortality rate to be just over 10% and the recovery rate just under 90%. These rates differ from country to country. For instance, the current mortality rate in the USA is about 15%, while for SA it is only 3.7%.

Why is it different for different countries? This is an important question, because it affects people’s behavior. There are many possible reasons, including the age demographic of a country and the availability of medical facilities in the country. The mortality rates for different age groups differ: they increase for older people. If the number of active cases becomes too high, there may not be enough hospital beds and equipment to treat all those that need treatment. As a result, one can expect the mortality rate to increase.

The government of a country needs to try to keep the number of active cases low enough that those who need treatment can get it. For that reason, governments impose restrictions aimed at reducing the rate at which the virus spreads. Restrictions may seem to be a violation of people’s freedom, but in this case they are necessary. However, a government can only do so much. If the people decide to ignore the regulations imposed by the government, because they want to exercise their freedom, then the virus will spread too fast, with the result that the number of active cases can rise above the level for which the country has enough medical facilities.

There are more numbers that are important, for instance the growth rate in the number of confirmed cases. That is a topic for another day.


The non-locality mess

Demystifying quantum mechanics VII

The title comes from a section heading in a paper I recently saw. Due to the serious confusion that exists in the literature, the author advocates that the physics community abandon the notion of non-locality in favor of correlations that can be observed experimentally.

The problem with the community is that it consists of a very diverse collection of people with diverse perspectives. So, the chances are small that they’ll abandon the notion of non-locality. However, it is not unreasonable to think that one may be able to clarify the confusion so that the community will at least know what they are talking about.

The problem comes in because people mean different things when they use the term “non-local.” The traditional meaning is associated with “spooky action at a distance.” In other words, it refers to a non-local interaction. This meaning is best understood in the context of special relativity.

Consider two separate events, which one can think of as points in space at certain moments in time. Let’s call them A and B. These events can be separated in different ways. If we start from A and can reach B by traveling at a speed smaller than the speed of light, then we say that these events have a time-like separation. In such a case, B could be caused by A: the effect originating at A would have travelled to B, where a local interaction produced B. If we need to travel at the speed of light to reach B, starting from A, the separation is called light-like, and then B could only be caused by A as a result of something traveling at the speed of light. If the separation is such that we cannot reach B from A even when traveling at the speed of light, then we call the separation space-like. In such a case B could not have been caused by A unless some non-local interaction were possible. There is a general consensus that non-local interactions are not possible. One of the problems with such interactions is that one cannot say which event happened first when they have a space-like separation: simply by changing the reference frame, one can change the order in which they happen.
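In standard special-relativity notation, the three cases correspond to the sign of the invariant interval between the two events:

```latex
s^2 = c^2\,\Delta t^2 - |\Delta\mathbf{x}|^2
\qquad
\begin{cases}
s^2 > 0 & \text{time-like: } B \text{ can be caused by } A \text{ via subluminal effects,}\\
s^2 = 0 & \text{light-like: } B \text{ can be caused by } A \text{ only at the speed of light,}\\
s^2 < 0 & \text{space-like: no causal link without non-local interactions.}
\end{cases}
```

The sign of $s^2$ is the same in every reference frame, which is why the classification itself, unlike the time ordering of space-like separated events, is unambiguous.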

As a result of this understanding, the notion of non-local interactions is not considered to be part of the physical universe we live in. That is why some people feel that we should not even mention “non-locality” in polite conversation.

However, there is a different meaning that is sometimes attached to the term “non-locality.” To understand this, we need three events: A, B and C. In this case, A happens first. Furthermore, A and B have a time-like separation and A and C also have a time-like separation, but B and C have a space-like separation. As a result, A can cause both B and C, but B and C cannot be caused by each other.

Imagine now that B and C represent measurements. They would correspond to what one may call “simultaneous” measurements, keeping in mind that such a description depends on the reference frame. Suppose we observe a correlation in these measurements. Without thinking about this carefully, a person may erroneously conclude that one event must have caused the other, which would imply a non-local interaction. However, based on the existence of event A, we know that the correlation is not due to a non-local interaction, but rather to a common cause. In this context, the term “non-local” simply refers to the fact that the observations correspond to events with a space-like separation. It does not have anything to do with an interaction.

When it comes to entanglement, which we’ll address in more detail later, it is important to understand the difference between these two notions of non-locality. Under no circumstances should the correlations that one observes between measurements at the space-like separated events B and C be interpreted as an indication of non-local interactions. The preparation of an entangled state always requires local interactions at A, so that the correlated observations of such a state at B and C have A as their common cause. The nature of the correlations tells us whether they are associated with a classical state or a quantum state.


Einstein, Podolsky, Rosen

Demystifying quantum mechanics VI

When one says that one wants to demystify quantum mechanics, it may create the false impression that there is nothing strange about quantum mechanics. Well, that would be a misleading notion. Quantum mechanics does have a counterintuitive aspect (perhaps even more than one). However, that does not mean that quantum mechanics needs to be mysterious. We can still understand this aspect and accept it as part of nature, even though we don’t experience it in everyday life.

The counterintuitive aspect of quantum mechanics is perhaps best revealed by the phenomenon of quantum entanglement. But before I discuss quantum entanglement, it may be helpful to review some of the historical development of this concept. Therefore, I’ll focus on an apparent paradox that Einstein, Podolsky and Rosen (EPR) presented.

They proposed a simple experiment to challenge the idea that one cannot measure the position and momentum of a particle with arbitrary accuracy, due to the Heisenberg uncertainty principle. In the experiment, an unstable particle would be allowed to decay into two particles. Then, one would measure the momentum of one of the particles and the position of the other particle. Due to the conservation of momentum, one can relate the momentum of the one particle to that of the other. The idea is that one should be able to make the respective measurements as accurately as possible, so that the combined information would give one the position and momentum of one particle more accurately than the Heisenberg uncertainty principle should allow.
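Stated compactly, in the center-of-mass frame of the unstable particle, momentum conservation gives

```latex
\mathbf{p}_1 + \mathbf{p}_2 = 0
\quad\Rightarrow\quad
\mathbf{p}_2 = -\mathbf{p}_1 ,
```

so an arbitrarily accurate measurement of $\mathbf{p}_1$ on the first particle fixes $\mathbf{p}_2$ without disturbing the second particle. A direct measurement of $\mathbf{x}_2$ then appears to provide both $\mathbf{x}_2$ and $\mathbf{p}_2$ more accurately than $\Delta x\,\Delta p \geq \hbar/2$ allows.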

Previously, I explained that the Heisenberg uncertainty principle has a perfectly understandable foundation, which has nothing to do with quantum mechanics apart from the de Broglie relationship, which links momentum to the wave number. However, what the EPR trio revealed in their hypothetical experiment is a concept which, at the time, was quite shocking, even for those people that thought they understood quantum mechanics. This concept eventually led to the notion of quantum entanglement. But, I’m getting ahead of myself.

John Bell

The next development came from John Bell, who also did not quite buy into all this quantum mechanics. So, to try to understand what would happen in the EPR experiment, he derived the statistics that one can expect to observe in such an experiment. The result was an inequality, which shows that, under some apparently innocuous assumptions, the measurement results, when combined in a particular way, must always give a value smaller than a certain maximum. These “innocuous” assumptions were: (a) that there is a unique reality, and (b) that there are no nonlocal interactions (“spooky action at a distance”).
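In its widely used CHSH form (the version later tested experimentally), the inequality bounds a combination of correlations $E$ measured with analyzer settings $a, a'$ on one side and $b, b'$ on the other:

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b') ,
\qquad
|S| \leq 2 .
```

Quantum mechanics, by contrast, predicts that suitably chosen settings on an entangled state yield $|S| = 2\sqrt{2} \approx 2.8$, in violation of the bound.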

It took a while before an actual experiment that tested the EPR paradox could be performed. However, eventually such experiments were performed, notably by Alain Aspect in 1982. He used the polarization of light instead of position and momentum, but the same principle applies. And guess what? When he combined the measurement results as prescribed by the Bell inequality, he found that they violated it!

So, what does this imply? It means that at least one of the assumptions made by Bell must be wrong. Either the physical universe does not have a unique reality, or nonlocal interactions are allowed. The problem with the latter is that it would contradict special relativity. So, we have to conclude that there is no unique reality.

It is this lack of a unique reality that lies at the heart of an understanding of the concept of quantum entanglement. More about that later.


What is your aim?

The endless debate about where fundamental physics should be going proceeds unabated. As can be expected, this soul-searching exercise includes many discussions of a philosophical nature. The ideas of Popper and Kuhn are reassessed for the gazillionth time. Where is all this leading us?

The one thing I often identify in these discussions is the narrow-minded view people have of the diversity of humanity. Philosophers and physicists alike come up with all sorts of ways to describe what science is supposed to be and what methodologies are supposed to be followed. However, they miss the fact that none of these “extremely good ideas” has any reasonable probability of being successful in the long run.

Why am I so pessimistic? Because humanity has the ability to corrupt almost anything that you can come up with. Those structures and systems in our cultures that actually do work are not the result of some “bright individuals” who decided on some sunny day to suck some good ideas out of their thumbs. No, these structures have evolved into the forms that they have today over a long time. They work because they have been tested over generations by people trying to corrupt them with their devious ideas. (It reminds me that cultural anthropology is, in my view, one of the most underrated fields of study. Scientific knowledge of how cultures evolve would help many governments make better decisions.)

The scientific method is one such cultural system that has evolved over many centuries. The remarkable scientific and technological knowledge that we possess today stands as clear evidence of the robustness of this method. There is not much, if anything, to be improved in this system.

However, we do need to understand that one cannot obtain all possible knowledge with the scientific method. It does have limitations, but these limitations are not failings of the method that can be improved on. They lie in the nature of knowledge itself. The simple fact is that there are things that we cannot know with any scientific certainty.

What is your reward?

So, the current problem in fundamental science is not something that can be overcome by “improving” the scientific method. The problem lies elsewhere. According to my understanding, it has one of two possible causes, which I have discussed previously. Either people have lost their true curiosity in favor of vanity, or our knowledge is running into a wall that cannot be penetrated by the scientific method.

While the latter has no solution, the former may be overcome if people realize that a return to curiosity, instead of vanity, as the driving force behind scientific research may help them adjust their focus to achieve progress. Short-term extravagant research results do not always provide the path to more knowledge. Such research is mainly designed to increase an individual’s impact with the aim of obtaining fame and glory. The road to true knowledge may sometimes lead through mundane avenues that seem boring to the general public. Only the truly passionate researcher with no interest in fame and glory would follow that avenue. However, it may be exactly what is needed to make the breakthrough that would advance fundamental physics.


Discreteness

Demystifying quantum mechanics V

Perhaps one of the most iconic “mysteries” of quantum mechanics is the particle-wave duality. Basically, it comes down to the fact that the interference effects one can observe imply that quantum entities behave like waves, but at the same time, these entities are observed as discrete lumps, which are interpreted as particles. Previously, I explained that one can relax the idea of localized lumps a bit and require only the interactions, which are needed for observations, to be localized. So instead of particles, we can think of these entities as partites that share all the properties of particles, except that they are not localized lumps. They can therefore behave like waves and thus give rise to all the wave phenomena that are observed. In this way, the mystery of the particle-wave duality is removed.

Now, it is important to understand that, just like particles, partites are discrete entities. The discreteness of these entities is an important aspect that plays a significant role in the phenomena that we observe in quantum physics. Richard Feynman even considered the idea that “all things are made of atoms” to be the single most important bit of scientific knowledge that we have.

[Image: Model of the atom]

How then does it happen that some physicists would claim that quantum mechanics is not about discreteness? In her blog post, Hossenfelder makes a number of statements that contradict much of our understanding of fundamental physics. For instance, she claims that “quantizing a theory does not mean you make it discrete.”

Let’s just clarify. What does it mean to quantize a theory? It depends on whether we are talking about quantum mechanics or quantum field theory. In quantum mechanics, the process of quantizing a theory implies that we replace observable quantities with operators for these quantities. These operators don’t always commute with each other, which leads to the Heisenberg uncertainty relation. So the discreteness is not immediately apparent. On the other hand, in quantum field theory, the quantization process implies that fields are replaced by field operators. These field operators are expressed in terms of so-called ladder operators: creation and annihilation operators. What a ladder operator does is to change the excitation of a field in discrete lumps. Therefore, discreteness is clearly apparent in quantum field theory.
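The two cases can be summarized side by side: in quantum mechanics the canonical commutation relation is imposed on the observables, while in quantum field theory the ladder operators change the excitation number $n$ of a field mode in discrete steps:

```latex
[\hat{x}, \hat{p}] = i\hbar ,
\qquad\qquad
\hat{a}^\dagger |n\rangle = \sqrt{n+1}\,|n+1\rangle ,
\quad
\hat{a}\,|n\rangle = \sqrt{n}\,|n-1\rangle ,
```

where $n = 0, 1, 2, \ldots$ counts the discrete excitations (the quanta) of the mode.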

What Hossenfelder says is that the Heisenberg uncertainty relationship is the key foundation of quantum mechanics. In one of her comments, she states: “The uncertainty principle is a quantum phenomenon. It is not a property of classical waves. If there’s no hbar in it, it’s not the uncertainty principle. People get confused by the fact that waves obey a property that looks similar to the uncertainty principle, but in this case it’s for the position and wave-number, not momentum. That’s not a quantum phenomenon. That’s just a mathematical identity.”

It seems that she forgot about Louis de Broglie’s equation, which relates the wave number to the momentum. In a previous post, I explained that the Heisenberg uncertainty relationship is an inevitable consequence of the Planck and de Broglie equations, which relate the conjugate variables of phase space to the Fourier variables. It has nothing to do with classical physics. It is founded in the underlying mathematics associated with Fourier analysis. Let’s not allow ourselves to be misled by people who are more interested in sensationalism than in knowledge and understanding.
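The chain of reasoning is short: Fourier analysis alone gives the bandwidth theorem, and the de Broglie relation converts it into the Heisenberg form:

```latex
\Delta x\,\Delta k \geq \tfrac{1}{2}
\quad\xrightarrow{\;p\,=\,\hbar k\;}\quad
\Delta x\,\Delta p \geq \tfrac{\hbar}{2} .
```

The $\hbar$ enters through the de Broglie relation; the inequality itself is a property of Fourier pairs.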

The discreteness of partites allows the creation of superpositions of arbitrary combinations of such partites. The consequences include the quantum interference that is observed in, for instance, the Hong-Ou-Mandel effect. It can also lead to quantum entanglement, which is an important property used in quantum information systems. The discreteness in quantum physics therefore allows it to go beyond what one can find in classical physics.
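As an illustration, the Hong-Ou-Mandel effect follows directly from the ladder operators introduced above. When one photon enters each port of a 50:50 beam splitter, the standard input-output transformation gives

```latex
\hat{a}^\dagger \hat{b}^\dagger |0\rangle
\;\longrightarrow\;
\tfrac{1}{2}\left(\hat{c}^\dagger + \hat{d}^\dagger\right)
\left(\hat{c}^\dagger - \hat{d}^\dagger\right)|0\rangle
= \tfrac{1}{\sqrt{2}}\left(|2,0\rangle - |0,2\rangle\right) .
```

The cross terms cancel, so both photons always exit through the same port and coincidence counts vanish, an interference effect with no classical counterpart.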
