There is a notion in the quest for a fundamental understanding of the physical world that I wish to challenge. It is the idea that infinities are bad and should be avoided at all costs. This seems to be one of the main justifications for string theory. Perhaps that is why so many people still believe that string theory is the best candidate for a fundamental theory that would explain everything.

Leave aside the question of whether we'll ever have such a theory of everything. In my view, any fundamental understanding of nature, as represented in some mathematical formalism, would necessarily incorporate and successfully deal with infinities. The idea that one can impose some fundamental cutoff scale that renders all calculations finite is, in my view, contrived. The Planck scale is a hypothetical thing that has never been established as a scientific fact, and I believe it never will be.

It is true that we never observe an infinity in any experiment, so all predicted observables must be rendered as finite values. That does not mean the formalism that produces these quantities is free of infinities. It only means that such infinities must cancel when the formalism is used to calculate physically observable quantities.

Infinities appear in calculations in fundamental physics because the degrees of freedom at the fundamental level form sets with infinitely many elements. The number of elements in such a set is measured by its cardinality.

A simple example shows how such an infinity appears. Consider the real line. It has infinitely many points; it is said to have the cardinality of the continuum. Now consider a very simple function that is equal to one at every point on the real line. When we integrate this function, we effectively measure the length of the real line, which is infinite. Calculations involving infinities present serious challenges: they tend to give ambiguous answers. These infinite cardinal numbers obey cardinal arithmetic, in which many operations applied to a cardinal number give back the same cardinal number. Therefore mathematicians avoid functions that would integrate to infinity. Most functions on the real line do; discarding all such functions, we end up with a much smaller set of functions that are well behaved in the sense that they integrate to finite results.
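The contrast can be seen numerically. The sketch below (my own illustration, not from the text) approximates integrals with a simple midpoint rule: the integral of the constant function over [-L, L] is just the length 2L, which grows without bound as the cutoff L increases, while a rapidly decaying function such as a Gaussian settles on a finite value.

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# The constant function 1: its integral over [-L, L] equals the length 2L,
# so pushing the cutoff L to infinity gives an infinite result.
for L in (10, 1_000, 100_000):
    print(L, integrate(lambda x: 1.0, -L, L))

# A rapidly decaying function integrates to a finite value
# (for exp(-x^2) the exact answer is sqrt(pi)):
print(integrate(lambda x: math.exp(-x * x), -10, 10), math.sqrt(math.pi))
```

The constant function is the simplest member of the large class of functions that get discarded, while the Gaussian belongs to the smaller, well-behaved class that survives.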

In fundamental physics, we deal not only with functions on the real line, but with functions on multiple copies of the real line. In fact, we end up with functions on infinitely many copies of the real line, and then it becomes even harder to avoid infinities. You can imagine that if I had a function on the real line that integrates to a finite number, say 2, then the integral of a product of infinitely many such functions would produce 2 to the power of infinity. So results that involve infinities are ubiquitous in calculations in fundamental physics.
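A small sketch of this blow-up, under my own choice of example function: f(x) = exp(-|x|) integrates to exactly 2 over the real line, and by Fubini's theorem the integral of a product of N independent copies factorizes into the product of the N one-dimensional integrals, i.e. 2**N.

```python
# Each factor is the integral over one copy of the real line of
# f(x) = exp(-|x|), which equals exactly 2. The N-dimensional integral
# of the product of N such factors is the product of the N
# one-dimensional results: 2**N, which diverges as N -> infinity.
one_dim = 2.0  # integral of exp(-|x|) over the whole real line

for N in (1, 10, 100, 1_000):
    print(N, one_dim ** N)
```

Even though every individual factor is perfectly finite, the product over infinitely many copies is not, which is the sense in which infinitely many degrees of freedom reintroduce the infinities one had carefully excluded in one dimension.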

One way that people try to avoid these infinities is a process called regularization. It is a process whereby the number of degrees of freedom is somehow reduced, or the integrals are truncated at some cutoff. At the end, the quantity that represents the cutoff or the reduction in degrees of freedom is allowed to grow back to infinity. Provided that this quantity has cancelled during the calculation, the limit will not change the answer. One can argue that regularization is a way to apply ordinary arithmetic (the arithmetic of finite numbers) to cardinal numbers. Hence, one can forgo the regularization process and simply keep careful track of the cardinal numbers in the calculation, to ensure that, in the normal process of performing the calculation as if everything were finite, these cardinal numbers cancel. Sometimes, however, it is necessary to perform a limit process at the end, in which the answer becomes finite as the quantities that represent these cardinal numbers are allowed to go to infinity.
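A standard toy example of this pattern (my own illustration, not one the text uses) is the Euler–Mascheroni constant: the harmonic sum and the logarithm both diverge as the cutoff N goes to infinity, yet their difference converges to a finite number.

```python
import math

def regularized(N):
    """Both pieces diverge as the cutoff N -> infinity, but their
    difference converges to the Euler-Mascheroni constant 0.5772..."""
    harmonic = sum(1.0 / n for n in range(1, N + 1))  # grows like log N
    return harmonic - math.log(N)                     # divergences cancel

for N in (10, 1_000, 100_000):
    print(N, regularized(N))
```

The cutoff N plays the role of the regulator: each piece separately is infinite in the limit, but because the divergent parts cancel, the limit of the combination exists and does not depend on how the cutoff was removed.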

This type of operation is what I would expect to see in a fundamental theory. Not some contrivance that magically prevents all cardinal numbers from entering the calculations in the first place.