By Bjørn Ekeberg

Cosmology Has Some Big Problems

Updated: Jun 4, 2019

Originally published in Scientific American on 30 April 2019.


What do we really know about our universe?


Born out of a cosmic explosion 13.8 billion years ago, the universe rapidly inflated and then cooled; it is still expanding at an increasing rate, and it is mostly made up of unknown dark matter and dark energy ... right?


This well-known story is usually taken as a self-evident scientific fact, despite the relative lack of empirical evidence—and despite a steady crop of discrepancies arising from observations of the distant universe.


In recent months, new measurements of the Hubble constant, the rate of the universe's expansion, have revealed a significant difference between two independent methods of determining it. Discrepancies in the expansion rate have huge implications not simply for calculation but for the validity of cosmology's current standard model at the extreme scales of the cosmos.
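For orientation, the Hubble constant relates a galaxy's recession velocity to its distance. A minimal sketch in LaTeX notation, using approximate 2019-era values for illustration only (the precise figures vary by analysis):

% Hubble's law: recession velocity is proportional to distance
v = H_0 \, d

% The two independent routes give noticeably different values (approximate):
H_0 \approx 67\ \mathrm{km\,s^{-1}\,Mpc^{-1}} \quad \text{(early-universe fits to the cosmic microwave background)}
H_0 \approx 73\ \mathrm{km\,s^{-1}\,Mpc^{-1}} \quad \text{(local distance-ladder measurements using Cepheids and supernovae)}

A gap of a few kilometers per second per megaparsec may sound small, but the stated uncertainties of the two methods do not overlap, which is why the discrepancy is taken seriously.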



Another recent study found galaxies inconsistent with the theory of dark matter, which posits that this hypothetical substance is everywhere. According to the latest measurements, however, it is not, suggesting the theory may need to be reexamined.


It's perhaps worth pausing to ask why astrophysicists hypothesize dark matter to be everywhere in the universe in the first place. The answer lies in a peculiar feature of cosmological physics that is not often remarked upon. A crucial function of theories such as dark matter, dark energy and inflation, each of which is tied in its own way to the big bang paradigm, is not to describe known empirical phenomena but rather to maintain the mathematical coherence of the framework itself while accounting for discrepant observations.


Fundamentally, they are names for something that must exist insofar as the framework is assumed to be universally valid.


Each new discrepancy between observation and theory can of course in and of itself be considered an exciting promise of more research, a progressive refinement toward the truth. But when the discrepancies add up, they could also suggest a more confounding problem, one that is not resolved by tweaking parameters or adding new variables.


Consider the context of the problem and its history. As a mathematically driven science, cosmological physics is usually thought to be extremely precise. But the cosmos is unlike any scientific subject matter on earth. A theory of the entire universe, based on our own tiny neighborhood as the only known sample of it, requires a lot of simplifying assumptions. When these assumptions are multiplied and stretched across vast distances, the potential for error increases, and this is further compounded by our very limited means of testing.


Historically, Newton's physical laws made up a theoretical framework that worked for our own solar system with remarkable precision. Neptune, for example, was discovered through predictions based on Newton's model after anomalies were observed in the orbit of Uranus. But as the scales grew larger, the model's validity proved limited. Einstein's general theory of relativity extended that reach, with greater precision, far beyond our own galaxy. But just how far could it go?


The big bang paradigm that emerged in the mid-20th century effectively stretches the model's validity to a kind of infinity, defined either as the radius of the observable universe (calculated at 46 billion light-years) or as the beginning of time. This giant stretch rests on a few concrete discoveries, such as Edwin Hubble's observation in 1929 that the universe appears to be expanding and the detection of the cosmic microwave background radiation in 1964. But considering the scale involved, these limited observations have had an outsized influence on cosmological theory.


It is of course entirely plausible that the validity of general relativity breaks down much closer to home than at the hypothetical edge of the universe. And if that were the case, today's multilayered theoretical edifice of the big bang paradigm would turn out to be a confusing mix of fictional beasts invented to uphold the model and empirically valid variables, mutually reliant on each other to the point where it becomes impossible to sort science from fiction.


Compounding this problem, most observations of the universe are indirect, mediated by instruments and models. Today's space telescopes provide no direct view of anything—they produce measurements through an interplay of theoretical predictions and pliable parameters, in which the model is involved every step of the way. The framework literally frames the problem; it determines where and how to observe. And so, despite the advanced technologies and methods involved, the profound limitations of the endeavor also increase the risk of being led astray by the kind of assumptions that cannot be calculated.


After spending many years researching the foundations of cosmological physics from a philosophy of science perspective, I have not been surprised to hear some scientists openly talking about a crisis in cosmology. In the big “inflation debate” in Scientific American a few years ago, a key piece of the big bang paradigm was criticized by one of the theory's original proponents for having become indefensible as a scientific theory.

Why? Because inflation theory relies on ad hoc contrivances to accommodate almost any data, and because its proposed physical field is not based on anything with empirical justification. This is probably because a crucial function of inflation is to bridge the transition from an unknowable big bang to a physics we can recognize today. So, is it science or a convenient invention?


A few astrophysicists, such as Michael J. Disney, have criticized the big bang paradigm for its lack of demonstrated certainties. In his analysis, the theoretical framework has far fewer certain observations than free parameters to tweak them—a so-called “negative significance” that would be an alarming sign for any science. As Disney writes in American Scientist: “A skeptic is entitled to feel that a negative significance, after so much time, effort and trimming, is nothing more than one would expect of a folktale constantly re-edited to fit inconvenient new observations.”


As I discuss in my new book, Metaphysical Experiments, there is a deeper history behind the current problems. The big bang hypothesis itself originally emerged as an indirect consequence of general relativity undergoing remodeling. Einstein had made a fundamental assumption about the universe, that it was static in both space and time, and to make his equations add up, he added a “cosmological constant,” for which he freely admitted there was no physical justification.
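For reference, the term in question can be written directly into Einstein's field equations; a standard form with the cosmological constant included is:

% Einstein field equations with the cosmological constant \Lambda,
% the term Einstein added so that a static universe could satisfy them:
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

Here G_{\mu\nu} encodes the curvature of spacetime, T_{\mu\nu} its matter and energy content, and \Lambda is the extra constant for which Einstein admitted there was no physical justification.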


But when Hubble observed that the universe was expanding and Einstein's solution no longer seemed to make sense, some mathematical physicists proposed changing a fundamental assumption of the model: the universe would remain the same in all spatial directions but would no longer be invariant in time. Not insignificantly, this theory came with a very promising upside: a possible merger between cosmology and nuclear physics. Could the brave new model of the atom also explain our universe?


From the outset, the theory only spoke to the immediate aftermath of an explicitly hypothetical event, whose principal function was as a limit condition, the point at which the theory breaks down. Big bang theory says nothing about the big bang; it is rather a possible hypothetical premise for resolving general relativity.


On top of this undemonstrable but very productive hypothesis, floor upon floor has been added, with vastly extended scales and new discrepancies. To explain observations of galaxies inconsistent with general relativity, the existence of dark matter was posited: an unknown and invisible form of matter calculated to make up more than a quarter of all the mass-energy content of the universe, assuming, of course, that the framework is universally valid. In 1998, when a set of supernova measurements indicated that the cosmic expansion was accelerating, seemingly at odds with the framework, a new theory emerged of a mysterious force called dark energy, calculated to fill roughly 70 percent of the mass-energy of the universe.
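Tallying these rough shares shows where the figure quoted below comes from; the exact percentages vary slightly between analyses, so treat these standard-model values as approximate:

% Approximate mass-energy budget in the standard (Lambda-CDM) model:
\underbrace{\sim 5\%}_{\text{ordinary matter}} \;+\; \underbrace{\sim 27\%}_{\text{dark matter}} \;+\; \underbrace{\sim 68\%}_{\text{dark energy}} \;\approx\; 100\%

The last two terms together account for roughly 95 percent of the total.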


The crux of today's cosmological paradigm is that in order to maintain a mathematically unified theory valid for the entire universe, we must accept that 95 percent of our cosmos is furnished by completely unknown elements and forces for which we have no empirical evidence whatsoever. For a scientist to be confident of this picture requires an exceptional faith in the power of mathematical unification.


In the end, the conundrum for cosmology is its reliance on the framework as a necessary presupposition for conducting research. For lack of a clear alternative, as astrophysicist Disney also notes, it is in a sense stuck with the paradigm. It seems more pragmatic to add new theoretical floors than to rethink the fundamentals.


Contrary to the scientific ideal of getting progressively closer to the truth, it looks rather like cosmology, to borrow a term from technology studies, has become path-dependent: overdetermined by the implications of its past inventions.



Big Bang cosmologist Ethan Siegel responded to my critique in Forbes on 7 May.

