Tuesday, December 3, 2019

Astronomy Observations Supporting Dark Energy's Existence May Actually See Large Scale Structure

Executive summary

Is dark energy the dominant kind of stuff in the universe? 

Maybe not. 

A credible analysis of a data set much larger than the one available when dark energy was proposed implies that, because the supernova data supporting dark energy's existence have a skewed distribution favoring certain parts of the sky, what looked like dark energy may actually be an observation of large scale structure in the universe.


The Standard Model of Cosmology

The "Standard Model of Cosmology" also known as the ΛCDM model, where Λ (pronounced "lambda") is the cosmological constant of General Relativity, and CDM stands for "cold dark matter" (confusingly a term defined in a manner consistent with both "warm dark matter" and "cold dark matter" in non-cosmology contexts).

The ΛCDM model is the leading overarching framework for understanding the structure and logic of our astronomy observations at the scale of the universe as a whole and its large scale structure. Its prominence derives from the fact that it can be fit consistently to astronomy observations at very large cosmological scales with just six or so parameters determined based upon astronomy observations.

The ΛCDM model also provides a plausible framework that can produce outcomes reasonably similar to the structure of the existing universe at the scale of galaxies and galaxy clusters, although fitting the model to observations at this scale has been more problematic.

Even at the cosmological level, the ΛCDM model has critics who are credible, smart, well-qualified PhD astronomers, astrophysicists and cosmologists, who argue that new data requires modification of this model. This post is about one such criticism.

Dark Energy

In General Relativity with a cosmological constant, the cosmological constant is a constant baseline amount of mass-energy per volume of spacetime that exists in a vacuum (in models where this mass-energy is something separate from spacetime itself, it is most often called "quintessence" or "dark energy"). In this model, the universe constantly expands in volume from a mere volumeless point at the moment of the Big Bang, with its outer boundary growing at the speed of light, and the amount of dark energy in the universe is proportionate to the volume of the universe.

Dark energy violates conservation of mass-energy in the ordinary sense, because the total amount of it increases as the volume of the universe expands.

Ordinary matter and dark matter

In contrast, in the ΛCDM model, the aggregate amount of ordinary matter and cold dark matter in the universe is constant, so the density of ordinary matter and cold dark matter declines in proportion to one over the volume of the universe as it expands (the universe is currently about 14 billion years old in round numbers).

Ordinary matter and cold dark matter (and also all forms of mass-energy in the universe other than dark energy, such as kinetic energy and photons) perfectly obey the conservation of mass-energy. (Quantum mechanics allows mass-energy to be temporarily "borrowed" in non-observable intermediate steps of physical processes, making possible phenomena like quantum tunneling, which is necessary, for example, for transistors to work as they do, something that would be impossible in the classical electromagnetic theory summarized in Maxwell's equations.)

Everything other than dark energy in the universe is constantly expanding in all directions with the momentum of the Big Bang, modified by gravitational pulls in all directions from everything else in the universe, although everything other than massless particles (basically photons) is doing this at less than the speed of light. So, clumps of matter like galaxies constantly get more distant from each other, on average.

In the absence of dark energy, the average rate of this expansion would be constant, driven at first order by the momentum of the Big Bang, and modified up or down at second order (in an amount that would average out over time) by the proximity of other ordinary matter, dark matter and radiation in its vicinity.

The phenomenological implications of dark energy

But, in the ΛCDM model, the aggregate mass-energy of the universe constantly increases, because dark energy is continually added at the outer boundary of the universe as the volume of space expands at the speed of light away from the Big Bang. This causes the rate at which the ordinary matter and dark matter in the universe expand away from each other to accelerate, because the newly created dark energy pulls them outward.

Dark energy's existence is inferred in the ΛCDM model from the observation that the rate at which ordinary matter and dark matter in the universe expand away from each other is accelerating.

Measuring dark energy

This acceleration can be inferred from astronomy observations because the finite speed of light means that our observations of things that are further away (distances that can be determined using a phenomenon known as "redshift") are also observations of things further back in time. So, if the rate at which distant objects are expanding away from each other is slower than the rate at which close objects are expanding away from each other, then we can use the difference between those rates to determine the cosmological constant and the amount of dark energy as a proportion of the total mass-energy of the universe at the present time.
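The logic of this inference can be sketched numerically. In the toy comparison below (round-number parameter values of my choosing, not fitted values), a supernova at a given redshift sits at a greater luminosity distance, and hence looks dimmer, in a flat universe with dark energy than in a flat matter-only universe:

```python
# Hedged sketch: luminosity distance to a redshift-z object in a flat
# FLRW universe, comparing matter-only to LambdaCDM with 70% dark energy.
# H0 and the density fractions are illustrative round numbers.
import numpy as np

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (round number)

def luminosity_distance(z, omega_m, omega_de, n=10_000):
    """Luminosity distance in Mpc, by trapezoid integration of 1/E(z)."""
    zs = np.linspace(0.0, z, n)
    inv_e = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + omega_de)  # 1 / (H(z)/H0)
    comoving = (C_KM_S / H0) * np.sum(0.5 * (inv_e[1:] + inv_e[:-1]) * np.diff(zs))
    return (1 + z) * comoving

z = 0.5
d_matter = luminosity_distance(z, 1.0, 0.0)   # no dark energy
d_lcdm = luminosity_distance(z, 0.3, 0.7)     # with dark energy

# With dark energy the same supernova is farther away, hence dimmer;
# that extra dimming of distant Type Ia supernovae is the observed effect.
print(f"d_L at z={z}: matter-only {d_matter:.0f} Mpc, LCDM {d_lcdm:.0f} Mpc")
```

The point of the comparison is only directional: distant standard candles that are systematically dimmer than a no-dark-energy universe predicts is what gets interpreted as acceleration.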

Using astronomy observations to determine the relative amounts of dark energy, cold dark matter and ordinary matter in the universe, in a model dependent way using the ΛCDM model, implies that right now, 14 billion years after the Big Bang, dark energy accounts for about 70% of the aggregate mass-energy of the universe, while the other 30% is mostly ordinary matter (about 5% in round numbers) and dark matter (about 25% in round numbers).

The main astronomy observation used to estimate the amount of dark energy in the universe is the distribution of Type Ia supernovae in the observable universe. These supernovae produce a very distinctive electromagnetic signal, consistent among supernovae of this type, that makes it possible to accurately determine their distance, and hence how long ago they occurred, based upon how redshifted the signal is (mostly in visible light, but also in other wavelengths). That distance can be combined with the location in the sky where each supernova is observed to make a four dimensional Type Ia supernova map from which expansion rates can be determined. (These supernovae also happen at predictable rates in a well understood astrophysical process.)

Similarly, according to the ΛCDM model, a little more than 7 billion years ago, about 34% of the mass energy of the universe was dark energy, about 11% was ordinary matter, and about 55% was cold dark matter.
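As a sanity check, the fractions in the last two paragraphs follow from today's round-number split plus the standard ΛCDM scalings (matter density falls as the cube of the cosmic scale factor, dark energy density stays constant). The scale factor of roughly 0.6 used below is my own back-of-the-envelope choice for the earlier epoch:

```python
# Sketch checking the quoted LambdaCDM energy fractions. Assumes today's
# round-number split (5% ordinary matter, 25% cold dark matter, 70% dark
# energy), with matter density ~ a**-3 and dark energy density constant,
# where a is the cosmic scale factor (a = 1 today).
def fractions(a, omega_b=0.05, omega_cdm=0.25, omega_de=0.70):
    rho_b = omega_b / a**3        # ordinary (baryonic) matter
    rho_cdm = omega_cdm / a**3    # cold dark matter
    rho_de = omega_de             # dark energy: constant density
    total = rho_b + rho_cdm + rho_de
    return rho_b / total, rho_cdm / total, rho_de / total

# When the universe was about 60% of its present linear size (roughly
# the earlier epoch described above), the quoted ~11% ordinary matter /
# ~55% cold dark matter / ~34% dark energy split falls out:
b, cdm, de = fractions(0.60)
print(f"ordinary {b:.0%}, cold dark matter {cdm:.0%}, dark energy {de:.0%}")
```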

Isotropy v. Anisotropy

But, in order to accurately calculate the amount of dark energy in the universe we have to make some key assumptions which we use to fit our astronomy observations to the ΛCDM model.

One of the key assumptions that goes into converting astronomy observations into a cosmological constant value (which, combined with the age of the universe, can be used to determine the amount of dark energy in the universe) is that the universe is isotropic. This means that at cosmological scales (i.e. the scale of large chunks of the entire universe), the observable properties of any given large chunk of the universe are basically identical and symmetrical. 

In contrast, if the universe is anisotropic, then there is very large scale structure in the universe (presumably an imprint of how the random outcomes of very early events in the universe, possibly at a quantum mechanical level in the first few seconds of the universe, turned out).

An anisotropic universe isn't a horribly outrageous or absurd idea. At the scale of our local few dozen galaxies, the universe is undeniably anisotropic. There are big clumps of matter that make up galaxies and galaxy clusters, and vast comparatively empty spaces with only a little hydrogen gas and dust and a few stray isolated stars and rocks in them. And the number of galaxies in each spatial direction is not the same.

Whether the universe is isotropic or anisotropic at the cosmological scale of the entire observable universe is not a question with a Platonically right or wrong answer as a matter of pure reason; it is, in principle at least, a question that can be determined empirically from astronomy observations.

The New Paper

A six page paper published in a peer reviewed scientific journal two weeks ago suggests that the astronomy observations indicating that the expansion of the ordinary matter and cold dark matter in the universe is accelerating are just an optical illusion arising from the fact that the universe is actually anisotropic at cosmological scales. 

As a result, from our vantage point on Earth, we may actually be seeing the large scale structure of the universe and misinterpreting those observations as the acceleration of the universe due to dark energy. The paper asserts that this happens because the ΛCDM model used to fit our observations counterfactually assumes that the universe is isotropic. 

While this would be bad news for a core tenet of the ΛCDM model, and would mean that everything we've been told about cosmology for decades is significantly inaccurate, it could be good news for fundamental physics. This is because there are deep problems involved in figuring out the fundamental laws of physics in models where the core law of conservation of mass-energy is not observed globally by gravity, even though the Standard Model of Particle Physics obeys this law and all local observations of gravity obey it. And, the assumption of isotropy that would be displaced is an assumption about empirical reality, rather than itself being a fundamental law of physics that needs to be true.

In particular, it is much easier to devise a model of quantum gravity in which gravity observes the same law of conservation of mass-energy that the Standard Model does than it is to devise one in which some gravitational phenomenon does not observe that law.

The abstract of the paper and its citation are as follows:
Observations reveal a “bulk flow” in the local Universe which is faster and extends to much larger scales than are expected around a typical observer in the standard ΛCDM cosmology. This is expected to result in a scale-dependent dipolar modulation of the acceleration of the expansion rate inferred from observations of objects within the bulk flow.

From a maximum-likelihood analysis of the Joint Light-curve Analysis catalogue of Type Ia supernovae, we find that the deceleration parameter, in addition to a small monopole, indeed has a much bigger dipole component aligned with the cosmic microwave background dipole, which falls exponentially with redshift z: q0 = qm + qd.n̂ exp(-z/S). The best fit to data yields qd = −8.03 and S = 0.0262 (⇒d ∼ 100 Mpc), rejecting isotropy (qd = 0) with 3.9σ statistical significance, while qm = −0.157 and consistent with no acceleration (qm = 0) at 1.4σ. Thus the cosmic acceleration deduced from supernovae may be an artefact of our being non-Copernican observers, rather than evidence for a dominant component of “dark energy” in the Universe.
Jacques Colin, et al., "Evidence for anisotropy of cosmic acceleration", 631 Astronomy and Astrophysics L13 (November 20, 2019) DOI: https://doi.org/10.1051/0004-6361/201936373
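The abstract's best-fit model is easy to evaluate numerically. This sketch simply plugs in the best-fit values quoted above (the function name and structure are mine) to show that the dipole term swamps the monopole at low redshift but decays away within a few hundred megaparsecs:

```python
import math

# Sketch of the paper's fitted deceleration parameter,
#   q0 = qm + (qd . n_hat) * exp(-z / S),
# using the best-fit values quoted in the abstract. cos_theta is the
# cosine of the angle between the line of sight and the dipole
# direction (aligned with the CMB dipole).
QM = -0.157   # monopole: consistent with no acceleration at 1.4 sigma
QD = -8.03    # dipole amplitude
S = 0.0262    # exponential decay scale in redshift (~100 Mpc)

def q(z, cos_theta):
    return QM + QD * cos_theta * math.exp(-z / S)

# Toward the dipole at very low redshift, the dipole term dominates...
print(q(0.01, 1.0))   # strongly negative
# ...but at higher redshift it has decayed away, leaving only the
# small monopole:
print(q(0.2, 1.0))    # close to QM
```

The upshot is that what looks like a large isotropic acceleration in a nearby, direction-skewed supernova sample could be this local dipole instead.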

Hat tip to Backreaction. As Sabine Hossenfelder explains in this blog post:
The most important evidence we have for the existence of dark energy comes from supernova redshifts. Saul Perlmutter and Adam Riess won a Nobel Prize for this observation in 2011. . . . They used the distance inferred from the brightness and the redshift of type 1a supernovae, and found that the only way to explain both types of measurements is that the expansion of the universe is getting faster. And this means that dark energy must exist. 
Now, Perlmutter and Riess did their analysis 20 years ago and they used a fairly small sample of about 110 supernovae. Meanwhile, we have data for more than 1000 supernovae. For the new paper, the researchers used 740 supernovae from the JLA catalogue. But they also explain that if one just uses the data from this catalogue as it is, one gets a wrong result. The reason is that the data has been “corrected” already.

This correction is made because the story that I just told you about the redshift is more complicated than I made it sound. That’s because the frequency of light from a distant source can also shift just because our galaxy moves relative to the source. More generally, both our galaxy and the source move relative to the average restframe of stuff in the universe. And it is this latter frame that one wants to make a statement about when it comes to the expansion of the universe.

How do you even make such a correction? Well, you need to have some information about the motion of our galaxy from observations other than supernovae. You can do that by relying on regularities in the emission of light from galaxies and galaxy clusters. This allows astrophysicists to create a map with the velocities of galaxies around us, called the "bulk flow". 
But the details don’t matter all that much. To understand this new paper you only need to know that the authors had to go and reverse this correction to get the original data. And then they fitted the original data rather than using data that were, basically, assumed to converge to the cosmological average. 
What they found is that the best fit to the data is that the redshift of supernovae is not the same in all directions, but that it depends on the direction. This direction is aligned with the direction in which we move through the cosmic microwave background. And – most importantly – you do not need further redshift to explain the observations. 
If what they say is correct, then it is unnecessary to postulate dark energy which means that the expansion of the universe might not speed up after all. 
Why didn’t Perlmutter and Riess come to this conclusions? 
They could not, because the supernovae that they looked at were skewed in direction. The ones with low redshift were in the direction of the CMB dipole; and high redshift ones away from it. With a skewed sample like this, you can't tell if the effect you see is the same in all directions. 
What is with the other evidence for dark energy? 
Well, all the other evidence for dark energy is not evidence for dark energy in particular, but for a certain combination of parameters in the concordance model of cosmology. These parameters include, among other things, the amount of dark matter, the amount of normal matter, and the Hubble rate. 
There is for example the data from baryon acoustic oscillations and from the cosmic microwave background which are currently best fit by the presence of dark energy. But if the new paper is correct, then the current best-fit parameters for those other measurements no longer agree with those of the supernovae measurements. This does not mean that the new paper is wrong. It means that one has to re-analyze the complete set of data to find out what is overall the combination of parameters that makes the best fit.
The post there concludes as follows:
This paper, I have to emphasize, has been peer reviewed, is published in a high quality journal, and the analysis meets the current scientific standard of the field. It is not a result that can be easily dismissed and it deserves to be taken very seriously, especially because it calls into question a Nobel Prize winning discovery. This analysis has of course to be checked by other groups and I am sure we will hear about this again, so stay tuned.

This is basically a rehash of an analysis previously published by an overlapping group of authors in 2016. A more technical discussion of the 2016 paper can be found at 4gravitons. 

There is also an unrelated major 2015 paper revealing a systemic issue involved in dark energy calculations, which is that there are basically two kinds of type Ia supernovae which have to be distinguished and that grouping them in one pool of data undermines its precision and reliability. I summarized that paper in the linked post as follows:
It turns out that there are two different subtypes of type 1a supernovas, with one more common in the early universe, and the other more common recently. They are very hard to distinguish in the visible light spectrum, but have clear differences in the UV spectrum. As a result, the rate at which the universe is expanding, if indeed it is expanding, and the amount of dark energy in the universe, are systemically overestimated by a significant amount. 
Less dark energy may, however, mean that another cosmology mystery is more profound. This could bring the relative amounts of dark matter and dark energy in the universe closer together, something that is already called the cosmic coincidence problem because there is no obvious theoretical reason for the two dark components of cosmology to be so similar in aggregate amount.
It isn't clear to me whether the 2016 and 2019 papers account for the issue discovered in the 2015 paper. If they don't, the case for there being no net dark energy may be even stronger. 

1 comment:

andrew said...

A preprint seeking to rebut the argument against dark energy in these papers is at https://arxiv.org/abs/1912.02191