Leonard Kelley holds a bachelor's in physics with a minor in mathematics. He loves the academic world and strives to constantly explore it.
For something that is all around us, the universe is quite elusive in revealing properties about itself. We must be expert detectives in regards to all the clues we have been given, carefully laying them out in hope of seeing some patterns. And sometimes, we run into contradictory information that struggles to be resolved. Take as a case in point the difficulty of determining the age of the Universe.
1929 was a landmark year for cosmology. Edwin Hubble, building upon the work of several scientists, was able not only to find the distance to faraway objects with Cepheid variables but also the apparent age of the universe. He noted that objects which were farther away had a higher redshift than objects closer to us. This is a property related to the Doppler shift, where the light of an object moving towards you is compressed and therefore blue-shifted, while an object receding away has its light stretched out, shifting it to the red. Hubble recognized that this observed pattern in redshift could only happen if the universe was expanding. And if we play that expansion backwards like a movie, then everything condenses to a single point, aka the Big Bang. By plotting the velocity that the redshift values indicate versus the distance to the object in question, we can find the Hubble Constant Ho, and from that value we can ultimately find the age of the universe. This is simply the time elapsed since the Big Bang and is calculated as 1/Ho (Parker 67).
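The 1/Ho arithmetic is worth seeing explicitly; the only subtle step is the unit conversion, since Ho is quoted in km/(s*Mpc). A minimal sketch in Python, using standard conversion constants:

```python
# Hubble time t = 1/Ho, converting Ho from km/(s*Mpc) to 1/s.
# 1 Mpc = 3.0857e19 km, 1 year ~ 3.156e7 s.
KM_PER_MPC = 3.0857e19
SECONDS_PER_YEAR = 3.156e7

def hubble_time_gyr(ho_km_s_mpc):
    """Age estimate in billions of years from Ho in km/(s*Mpc)."""
    ho_per_sec = ho_km_s_mpc / KM_PER_MPC      # Ho in 1/s
    t_seconds = 1.0 / ho_per_sec               # Hubble time in seconds
    return t_seconds / SECONDS_PER_YEAR / 1e9  # convert to Gyr

print(hubble_time_gyr(526))  # Hubble's original value: ~1.9 Gyr
print(hubble_time_gyr(70))   # a modern-era value: ~14 Gyr
```

Note how sensitively the age depends on Ho: halving the constant doubles the age, which is exactly the pattern running through the measurement disputes below.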
Distance Leads to Contradictions
Before it was determined that the universe's expansion is accelerating, it was a strong possibility that it was in fact decelerating. If so, then the Hubble Time would act as a maximum and therefore lose its predictive power for the age of the universe. So to help make certain, we need lots of data on the distances to objects, which will help refine the Hubble Constant and let us compare different models of the universe, including the time aspect (68).
For his distance calculations, Hubble made use of Cepheids, which are well known for their period-luminosity relation. Simply put, these stars vary in brightness in a periodic fashion. By measuring this period, you can find a star's absolute magnitude, which when compared to its apparent magnitude gives the distance to the object. By using this technique on nearby galaxies, we can compare them to similar ones that are too far away to have any discernible stars, and by looking at the redshift one can find the approximate distance. But in doing this, we are stacking one method on top of another. If something is wrong with the Cepheid methodology, then the distant galactic data is worthless (68).
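The magnitude comparison in this paragraph is the standard distance modulus: once the period-luminosity relation supplies an absolute magnitude M, the observed apparent magnitude m fixes the distance. A short sketch (the Cepheid numbers here are invented for illustration, not real measurements):

```python
import math

def distance_parsecs(apparent_mag, absolute_mag):
    """Distance from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Hypothetical Cepheid: suppose the period-luminosity relation gives
# M = -4.0 and we observe m = 21.0.
d_pc = distance_parsecs(21.0, -4.0)
print(d_pc)  # 1,000,000 pc = 1 Mpc
```

Every 5 magnitudes of difference between m and M corresponds to a factor of 100 in brightness and a factor of 10 in distance, which is why a small error in the assumed absolute magnitude propagates into a large distance error.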
And the results seemed to indicate this initially. When the redshifts came in from distant galaxies, they gave an Ho of 526 kilometers per second per megaparsec (or km/(s*Mpc)), which translates to an age of 2 billion years for the universe. Geologists were quick to point out that even the Earth is older than that, based on radiometric dating techniques. Fortunately, Walter Baade of the Mt. Wilson Observatory was able to understand the discrepancy. Observations during World War II showed that stars could be separated into Population I and Population II. The former are hot and young with tons of heavy elements and are located in the disc and arms of a galaxy, which promote star formation through gas compression. The latter are old, have few to no heavy elements, and are located in the bulge of a galaxy as well as above and below the galactic plane (Ibid).
So how did this save Hubble's method? Well, those Cepheid variables could belong to either of those classes of stars, which does affect the period-luminosity relationship. In fact, it revealed a new class of variable stars known as W Virginis variables. Taking this into account, the star classes were separated and a new Hubble Constant nearly half as big was found, leading to a universe nearly twice as old: still too little, but a step in the right direction. Years later, Allan Sandage of Hale Observatories found that many of the supposed Cepheids Hubble used were actually star clusters. Removing these gave a new age of the universe of 10 billion years from a Hubble Constant of 100 km/(s*Mpc), and with the new technology of the time Sandage and Gustav A. Tammann of Basel, Switzerland were able to arrive at a Hubble Constant of 50 km/(s*Mpc), and thus an age of 20 billion years (Parker 68-9, Naeye 21).
As it turns out, Cepheids had been assumed to have a strictly linear relation between period and luminosity. Even after Sandage removed the star clusters, a variation of a whole magnitude could be found from Cepheid to Cepheid, based on data collected by Shapley, Nail, and other astronomers. Observations in 1955 even pointed to a likely non-linear relation when globular clusters showed a wide scatter. It was later shown that the team had found variable stars that were not Cepheids, but at the time astronomers were desperate enough to try to develop new math just to preserve their findings. And Sandage noted how new equipment would be able to further resolve Cepheids (Sandage 514-6).
However, others using modern equipment still arrived at a Hubble Constant value of 100 km/(s*Mpc), such as Marc Aaronson of Steward Observatory, John Huchra of Harvard, and Jeremy Mould of Kitt Peak. In 1979, they arrived at their value by measuring galactic mass from rotation. As the mass of a galaxy increases, so does its rate of rotation, since more mass means a stronger gravitational pull on the orbiting material. And anything that moves towards or away from an observer produces a Doppler effect. In fact, the easiest part of the spectrum in which to see a Doppler shift is the 21-centimeter line of hydrogen, whose width increases as the rate of rotation increases (since one side of the galaxy recedes while the other approaches, stretching the line in both directions). A comparison between the measured 21-centimeter line width and the brightness the implied mass should produce helps determine how far away the galaxy is. But for this to work, you need to be viewing the galaxy exactly edge-on; otherwise some mathematical modeling is needed for a good approximation (Parker 69).
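The chain of reasoning above can be caricatured in code: line width gives rotation speed, rotation speed implies a luminosity, and comparing that to apparent brightness gives distance. The slope and zero point in the luminosity relation below are invented placeholders, not a real calibration:

```python
import math

def rotation_speed_km_s(line_width_km_s, inclination_deg=90.0):
    """Half the Doppler width of the 21 cm line, corrected for viewing
    angle (90 degrees = exactly edge-on, as the text requires)."""
    return (line_width_km_s / 2.0) / math.sin(math.radians(inclination_deg))

def distance_mpc(line_width_km_s, apparent_mag, inclination_deg=90.0):
    """Toy distance estimate: rotation -> absolute magnitude -> distance.
    The constants in the magnitude relation are hypothetical."""
    v = rotation_speed_km_s(line_width_km_s, inclination_deg)
    absolute_mag = -7.5 * math.log10(v) - 4.0  # invented calibration
    d_pc = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_pc / 1e6  # parsecs to megaparsecs

# An edge-on galaxy with a 400 km/s wide line rotates at ~200 km/s.
print(rotation_speed_km_s(400.0))  # 200.0
```

The inclination correction is where the "exactly edge-on" caveat enters: for a tilted galaxy, only part of the rotation is along our line of sight, so the raw line width understates the true speed.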
It was this alternate technique that the aforementioned scientists pursued for their distance measurements. The galaxy they looked at was in Virgo, and they got an initial Ho value of 65 km/(s*Mpc), but when they looked in a different direction they got a value of 95 km/(s*Mpc). What the heck!? Does the Hubble Constant depend on where you look? Gerard de Vaucouleurs had looked at a ton of galaxies in the 1950s and found that the Hubble Constant did fluctuate depending on where you looked, with small values around the Virgo supercluster and the largest away from it. It was eventually determined that this was because the mass of the cluster and its proximity to us were skewing the data (Parker 68, Naeye 21).
But of course, more teams have hunted down their own values. Wendy Freedman (University of Chicago) found her own reading in 2001 when she used data from the Hubble Space Telescope to examine Cepheids up to 80 million light-years away. With this as the launching point for her distance ladder, she made it out to 1.3 billion light-years away with her galaxy selection (for at that distance the expansion of the Universe outpaces the motions of galaxies relative to one another). This led her to an Ho of 72 km/(s*Mpc) with an error of 8 (Naeye 22).
The Supernova Ho for the Equation of State (SHOES) team, led by Adam Riess (Space Telescope Science Institute), added their name to the fray in 2018 with their Ho of 73.5 km/(s*Mpc) with only a 2.2% error. They used Type Ia supernovae in conjunction with galaxies that contained Cepheids to get a better comparison. Also employed were eclipsing binaries in the Large Magellanic Cloud and water masers in galaxy M106. That is quite the data pool, lending credence to the findings (Naeye 22-3).
Around the same time, the HoLiCOW (Hubble Constant Lenses in COSMOGRAIL's Wellspring) team released their own findings. Their method employed gravitationally lensed quasars, whose light is bent by the gravity of foreground objects like galaxies. This light travels along different paths of different lengths, so changes in the quasar's brightness show up in each lensed image at different times; those delays, combined with the geometry of the lens, reveal the distances involved. Using Hubble, the ESO/MPG 2.2-meter telescope, the VLT, and the Keck Observatory, the data points to an Ho of 73 km/(s*Mpc) with 2.24% error. Wow, that is very close to the SHOES results, and being a recent result with newer data it points to a convincing conclusion, as long as there is no overlap in the specific data used (Marsch).
Meanwhile, the Carnegie Supernova Project, led by Christopher Burns, found a similar result: Ho is either 73.2 km/(s*Mpc) with 2.3% error or 72.7 km/(s*Mpc) with an error of 2.1%, depending on the wavelength filter used. They used the same data as SHOES but a different approach to analyzing it, hence why the results are close but slightly different. However, if SHOES made an error, then these results would be thrown into question too. That is why a follow-up study with the Gaia spacecraft (which measured the parallax of over a billion stars) gives confidence in the SHOES result. It found a value of 73.2 km/(s*Mpc) but with an error of only up to 1.8%, based on the parallax of 75 Cepheids (Naeye 23, Wolchover "Astronomers").
And to complicate matters, a measurement has been found that is smack-dab in the middle of the two extremes we seem to face. Wendy Freedman led a new study using what are known as "tip of the red giant branch," or TRGB, stars. That branch refers to the HR diagram, a useful visual that maps out star patterns based on size, color, and luminosity. TRGB stars usually show low variability because the tip represents a short span of a star's life, meaning they give more conclusive values. Cepheids, by contrast, often sit in dense regions of space with plenty of dust to obscure and potentially muddy the data. Critics, though, said the data used was old and that the calibration techniques behind the results were unclear, so she redid the analysis with new data and addressed the techniques. The value the team arrived at is 69.6 km/(s*Mpc) with roughly 2.5% error. This value is more in line with early-universe values but is clearly differentiated from them too (Wolchover "New").
With so much disagreement over the Hubble Constant, can a lower bound be placed on the age of the universe? Indeed it can, for parallax data from Hipparcos and simulations done by Chaboyer and team point to an absolute youngest possible age for globular clusters of 11.5 ± 1.3 billion years. Many other sets of data went into the simulation, including white dwarf sequence fitting, which compares the spectra of white dwarfs to those of white dwarfs whose distances we know from parallax. By looking at how the light differs, we can gauge how far away a white dwarf is using our magnitude comparison and redshift data. Hipparcos came into this picture with its subdwarf data, using the same ideas as white dwarf sequence fitting but with better data on this class of stars (and being able to remove binaries, not-fully-evolved stars, and suspected false signals helped matters tremendously) to find the distance to NGC 6752, M5, and M13 (Chaboyer 2-6, Reid 8-12).
The Hubble Tension
With all of this research seemingly providing no way to choose between the values spotted, scientists have dubbed the situation the Hubble tension. And it seriously puts into question our understanding of the Universe. Something has to be off in how we think about the current Universe, the past one, or both, yet our current modeling works so well that tweaking one thing would throw off the balance of everything we do explain well. What possibilities exist to resolve this new crisis in cosmology?
As the Universe has aged, space has expanded and has carried the objects contained in it further apart from one another. But galactic clusters actually have enough gravitational attraction to hold onto their member galaxies and prevent them from being dispersed throughout the Universe. So, as things have progressed, the Universe has lost its homogeneous status and is becoming more discrete, with 30-40 percent of space being clusters and 60-70 percent being the voids between them. This allows the voids to expand at a faster rate than homogeneous space would. Most models of the Universe fail to take this potential error source into account, so what happens when it is addressed? Krzysztof Bolejko (University of Tasmania) did a quick run-through of the mechanics in 2018 and found it promising, potentially altering the expansion by about 1% and thus putting models in sync. But in a follow-up by Hayley J. Macpherson (University of Cambridge) and her team using a larger-scale model, "the average expansion was virtually unchanged" (Clark 37).
The Cosmic Microwave Background
A different potential reason for all these discrepancies may lie in the Cosmic Microwave Background, or CMB. The Ho values discussed so far stem from the evolved, present-day Universe, but the CMB lets us probe the young one. What should Ho be at such a time? Well, the Universe was denser for starters, and that is why the CMB exists at all. Pressure waves, otherwise known as sound waves, traveled with great ease and resulted in changes to the density of the Universe, which we measure today as microwave-stretched light. But these waves were affected by the baryonic and dark matter they passed through. WMAP and Planck both studied the CMB and from it derived a Universe of 68.3% dark energy, 26.8% dark matter, and 4.9% baryonic matter. From these values, we should expect Ho to be 67.4 km/(s*Mpc) with only 0.5% error! This is a wild deviation from the other values, and yet the uncertainty is so low. This could be a hint of evolving physics rather than constant physics. Maybe dark energy changes expansion differently than we expect it to, altering the constant in unpredictable ways. Space-time geometry might not be flat but curved, or it may have field properties we don't understand. Recent Hubble findings certainly point to something new being needed, for after examining 70 Cepheids in the Large Magellanic Cloud astronomers were able to reduce the error in Ho down to 1.3% (Naeye 24-6, Haynes).
Further results from the WMAP and Planck missions place an age of 13.82 billion years on the Universe, something that doesn't disagree with their own data. Can there be an error with these satellites? Do we need to look elsewhere for answers? We should certainly be prepared for that, for science is anything but static.
While it's a very unappealing route, it may be time to ditch the prevailing lambda-CDM model (dark energy with cold dark matter) and revise relativity into some new format. Bimetric gravity is one of these possible new formats. In it, gravity obeys different equations depending on whether it is above or below a certain threshold. Edvard Mortsell (Stockholm University in Sweden) has been working on it and finds it appealing because if gravity's behavior changed as the Universe progressed, then the expansion would be impacted. However, the issue in testing bimetric gravity is the equations themselves: They are just too difficult to solve (Clark 37)!
In the early 20th century, people were already modifying relativity. One of these approaches, pioneered by Elie Cartan, is known as torsion. Original relativity only accounts for mass in space-time dynamics, but Cartan proposed that the spin of matter, and not just its mass, should play a role too, being a fundamental property of the material in space-time. Torsion takes that into account and is a great launching point for modifying relativity because of the simplicity and reasonability of the revision. So far, early work shows that torsion can account for the discrepancies scientists have seen thus far, but more work would of course be needed to verify anything (Clark 37-8).
Chaboyer, Brian, P. Demarque, Peter J. Kernan, and Lawrence M. Krauss. “The Age of Globular Clusters in Light of Hipparcos: Resolving the Age Problem?” arXiv 9706128v3.
Clark, Stuart. "A quantum twist in space-time." New Scientist. New Scientist LTD., 28 Nov. 2020. Print. 37-8.
Haynes, Korey and Allison Klesman. "Hubble Confirms Universe's Fast Expansion Rate." Astronomy Sept. 2019. Print. 10-11.
Marsch, Ulrich. "New measurement of the universe's expansion rate strengthens call for new physics." innovations-report.com. innovations report, 09 Jan. 2020. Web. 28 Feb. 2020.
Naeye, Robert. "Tension at the Heart of Cosmology." Astronomy Jun. 2019. Print. 21-6.
Parker, Barry. “The Age of the Universe.” Astronomy Jul 1981: 67-71. Print.
Reid, Neill. “Globular Clusters, Hipparcos, and the Age of the Galaxy.” Proc. Natl. Acad. Sci. USA Vol. 95: 8-12. Print.
Sandage, Allan. “Current Problems in the Extragalactic Distance Scale.” The Astrophysical Journal May 1958, Vol. 127, No. 3: 514-516. Print.
Wolchover, Natalie. "Astronomers Get Their Wish and a Cosmic Crisis Gets Worse." quantamagazine.com. Quanta, 17 Dec. 2020. Web. 17 Feb. 2021.
---. "New Wrinkle Added to Cosmology's Hubble Crisis." quantamagazine.com. Quanta, 26 Feb. 2020. Web. 20 Aug. 2020.
© 2016 Leonard Kelley