Sign up for the Starts With a Bang newsletter
Travel the universe with Dr. Ethan Siegel as he answers the biggest questions of all.
For much of the 20th century, the main goal of cosmology was twofold:
- to measure the expansion rate of the Universe today, known as the Hubble constant,
- and to measure how the expansion rate was changing over time, then known as the deceleration parameter.
After all, we had a law of gravity — Einstein’s General Relativity — that allowed us to calculate how the Universe would evolve based on the amount, density, and distribution of matter and energy within it. We observed the Universe to be roughly uniform in all locations and all directions, and we learned back in the 1920s and 1930s that the Universe was expanding. If we could just measure those two parameters, we thought, we’d know it all: the age, history, composition, and even the fate of our Universe.
Oh, if only we realized how little we knew back then. We had erroneously assumed that the Universe was 100% made up of matter, that the expansion rate would be decelerating, and that there wouldn’t be any surprises. Of course, there wound up being many, including the existence of dark energy and the uncovering of a more recent puzzle: the Hubble tension. Are they one and the same, though? That’s what Matt Williams wants to know, asking:
“Do the Hubble Tension and Dark Energy describe the same thing? Both appear to be based on observations that the expansion of the Universe has been speeding up in more recent cosmic history.”
We’re going to start with the surprises that we learned about the Universe in the late 20th century, and then, one at a time, we’ll come up to the puzzles we have here in 2025.

Edwin Hubble’s original plot of galaxy distances, from 1929, versus redshift (left), establishing the expanding Universe, versus a more modern counterpart from approximately 70 years later (right). Many different classes of objects and measurements are used to determine the relationship between distance to an object and its apparent speed of recession that we infer from its light’s relative redshift with respect to us. As you can see, from the very nearby Universe (lower left) to distant locations over a billion light-years away (upper right), this very consistent redshift-distance relation continues to hold. Earlier versions of Hubble’s graph were composed by Georges Lemaître (1927) and Howard Robertson (1928), using Hubble’s preliminary data.
Back in the 1970s and into the 1980s, a couple of new debates were raging over the Universe’s contents, properties, and birth. On the one hand, teams were working very hard to pin down the expansion rate of the Universe. The value had fallen over the century: from 600 km/s/Mpc as originally measured by Hubble to 300 a couple of decades later, and kept on getting smaller. The main players working on it, however, kept getting two incompatible answers:
- either they were right around 90-100 km/s/Mpc, implying a Universe only about 10 billion years old (younger than many individual stars and globular star clusters),
- or they were getting a value of 50-55 km/s/Mpc, implying a much older (~16 billion year), but heavily underdense Universe.
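The link between the expansion rate and the Universe's age scale can be made concrete with a quick calculation. A minimal sketch, using illustrative unit conversions: the "Hubble time," 1/H0, sets the characteristic age scale (the actual age of a decelerating, matter-filled Universe is somewhat less than 1/H0, which is why ~50 km/s/Mpc corresponded to roughly 16 billion years rather than the full Hubble time).

```python
# The Hubble time, 1/H0, sets the age scale of the Universe.
# Illustrative sketch only; the true age also depends on the Universe's contents.
KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one billion years

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return the Hubble time, 1/H0, in billions of years, for H0 in km/s/Mpc."""
    seconds = KM_PER_MPC / h0_km_s_mpc
    return seconds / SEC_PER_GYR

print(round(hubble_time_gyr(100), 1))  # ~9.8 Gyr: too young for the oldest stars
print(round(hubble_time_gyr(50), 1))   # ~19.6 Gyr; actual age somewhat less
```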
At the same time, copious evidence for dark matter had begun to emerge: from both the motions of galaxies within galaxy clusters and from the internal motions of individual galaxies. If the first camp was right, the Universe would be too young for the objects within it. If the second camp was right, there needed to be so little matter, overall, that dark matter — despite the clear evidence — would be an impossibility. For all that we had come to understand about our grand physical reality, something wasn’t adding up between the age, the expansion rate, and the contents of our Universe.

These data points, sorted by year, show different measurements of the expansion rate of the Universe using the cosmic distance ladder method, with the data points falling into two main groups: one clustered around 50 km/s/Mpc and one clustered around 100 km/s/Mpc. The results of the Hubble Key Project, released in 2001, are shown with the red bars.
Then, in 1990, the Hubble space telescope was launched. Contrary to what many people think, it wasn’t named “Hubble” simply to honor Edwin Hubble and his many contributions to astronomy. No; it was named the Hubble space telescope because its main scientific purpose, as designed, was to pin down and measure the Hubble constant once and for all, settling the debate over the Universe’s expansion rate. That result would come out in 2001, as the Wendy Freedman-led team pinned it down to 72 km/s/Mpc, with an uncertainty of just around 10%. Both of the major sets of teams were wrong; the true value of the Hubble constant was actually in between the two.
But in the interim, something even more shocking had occurred. Two independent teams of astronomers, working largely with ground-based data of type Ia supernovae (collected primarily from North and South America, plus Australia), weren’t just interested in how fast the Universe was expanding (i.e., in measuring the slope of the distance vs. velocity graph for objects in the nearby Universe), but sought to measure how the expansion rate evolved over time. You see, it’s like this:
- If you want to measure the expansion rate today, you measure the slope of the velocity vs. distance graph in the nearby Universe.
- But if you want to measure what the different types of energy that are in the Universe are, you measure how that same graph curves — and no longer looks like a straight line — looking back billions of years into the past.
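The first of these measurements, the nearby-Universe slope, is just Hubble's law: recession velocity is proportional to distance, v = H0 × d. A minimal sketch, with an illustrative modern value for H0:

```python
# Hubble's law in the nearby Universe: apparent recession velocity
# grows in direct proportion to distance, v = H0 * d.
# H0 value below is illustrative, roughly the modern best-fit figure.
H0 = 70.0  # km/s/Mpc

def recession_velocity_km_s(distance_mpc: float) -> float:
    """Apparent recession speed, in km/s, of a nearby galaxy at the given distance."""
    return H0 * distance_mpc

# A galaxy 100 Mpc (about 326 million light-years) away:
print(recession_velocity_km_s(100))  # 7000.0 km/s
```

Only in the nearby Universe is this simple straight-line relation adequate; at distances of billions of light-years, the curvature of the relation encodes the Universe's energy contents.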
Using primarily ground-based data, with not even a handful of type Ia supernova observations from Hubble, those two teams both reached the same conclusion in 1998: the Universe wasn’t just expanding, but the expansion was accelerating, driven by a new form of energy.

Measuring back in time and distance (to the left of “today”) can inform how the Universe will evolve and accelerate/decelerate far into the future. By linking the expansion rate to the matter-and-energy contents of the Universe and measuring the expansion rate, we can come up with an estimate for the amount of time that’s passed since the start of the hot Big Bang. The supernova data in the late 1990s was the first set of data to indicate that we lived in a dark energy-rich Universe, rather than a matter-and-radiation dominated one; the data points, to the left of “today,” clearly drift from the standard “decelerating” scenario that had held sway through most of the 20th century.
That new form of energy was not consistent with being matter, radiation, neutrinos, antimatter, dark matter, or spatial curvature: the main forms of energy considered up to that point. Instead, this form of energy appeared to be most consistent with being a cosmological constant: a form of energy inherent to the fabric of space itself. This is what we call dark energy: the non-matter, non-radiation, non-neutrino, non-curvature, non-dark matter form of energy that isn’t just a part of our Universe, but actually represents the majority of the energy presently in the Universe. All of those other forms of energy, if they were to dominate the Universe, would cause a distant galaxy — one that was initially receding amidst the expanding Universe — to slow down in its apparent recession speed from us.
But with dark energy, such a galaxy appears to recede at ever-increasing speeds, faster and faster, as time goes on. That’s why we now talk about the accelerating Universe (and why we don’t say “deceleration parameter” anymore), and why we say that dark energy is the cause of this cosmic acceleration. In the years since, we’ve worked very hard to pin down dark energy’s properties, measuring incredibly large numbers of type Ia supernovae at greater distances than ever. While it still appears to be consistent with a cosmological constant, when you combine that supernova data with both cosmic microwave background data and data from large-scale structure surveys, there is some (but not conclusive) evidence that dark energy might not be a constant, but may be (slightly) evolving over time.

This animation of DESI’s 3D map of the large-scale structure in the Universe, the largest such map to date, was created with the intention of studying dark energy and its possible evolution. However, although they found evidence for dark energy evolving, that’s likely due to the assumption that it’s dark energy’s evolution that’s causing the discrepancies in the data compared to our standard cosmological model. This is not necessarily the case.
That’s what dark energy is: it’s the form of energy that causes the Universe to accelerate, and that has come to dominate the Universe (as the matter and radiation densities have both continued to drop) for the past 6 billion years. With the knowledge that our Universe is made of approximately 70% dark energy, 25% dark matter, 5% normal matter, and a tiny bit of radiation and neutrinos, we’ve confidently now concluded that the Universe is 13.8 billion years old, and is expanding at around 70 km/s/Mpc. This is all consistent with the ages of the oldest stars we’ve ever found: upwards of 13 but less than 14 billion years. At last, you might think, the Universe makes sense.
Indeed, we do have a consistent picture of modern cosmology, what we call our standard (or ΛCDM) model, or the concordance model, of cosmology. Dark energy may be evolving — i.e., might not be the “Λ” (or Einstein’s cosmological constant) that we assumed it would be when we called it ΛCDM in the first place — but it definitely exists, definitely dominates the Universe, and definitely is causing the Universe’s expansion to accelerate. It wasn’t discovered with the Hubble space telescope, but in all the years since, the Hubble space telescope has been our best tool to identify and measure these white dwarfs that explode as type Ia supernova, enabling us to draw these fascinating conclusions. As always, this work continues to the present day.

This graph shows the 1550 supernovae that are a part of the Pantheon+ analysis, plotted as a function of magnitude versus redshift. The supernova data, for many decades now (ever since 1998), has pointed toward a Universe that expands in a particular fashion that requires something beyond matter, radiation, and/or spatial curvature: a new form of energy that drives the expansion, known as dark energy. The supernovae all fall along the line that our standard cosmological model predicts, with even the highest-redshift, most far-flung type Ia supernovae adhering to this simple relation. Calibrating the relation without substantial error is of paramount importance.
But starting in the late 2000s and early 2010s, a new, separate, and independent puzzle began to emerge: what we now call the Hubble tension. There isn’t just one, but many ways to measure the Universe’s expansion rate. Sure, we can do what astronomers — Hubble, Lemaître, and others — have been doing for nearly a century:
- start here, in our own backyard,
- measure certain types of stars within our own galaxy and Local Group,
- determine the distance to those stars using straightforward methods like parallax,
- then measure stars of those same classes in nearby galaxies,
- while also measuring some other property of those galaxies (that can be seen in similar-type galaxies farther away),
- and then go measure more distant galaxies — at all distances — including that property that you measured nearby.
This method of starting here and gradually going farther and farther out, one observational method at a time, is what’s known as the “cosmic distance ladder” method, and has been leveraged successfully in measuring cosmic distances and recession speeds for nigh on 100 years.
The Hubble Key Project, which published its 2001 result that the Hubble constant was 72 km/s/Mpc, used the distance ladder method to great effect. There are many ways to construct the cosmic distance ladder today, using a variety of approaches (or what astronomers call distance indicators) for the various “rungs,” but they all appear to be converging on the same value: around 73 km/s/Mpc. While most of these methods have significant uncertainties, individually, the best ones get their uncertainties down to just 1-2%. If that were all we had to look at, we’d have a fully consistent picture of the Universe. We’d be free and clear.

Modern measurement tensions from the distance ladder (red) with early signal data from the CMB and BAO (blue) shown for contrast. It is plausible that the early signal method is correct and there’s a fundamental flaw with the distance ladder; it’s plausible that there’s a small-scale error biasing the early signal method and the distance ladder is correct, or that both groups are right and some form of new physics (shown at top) is the culprit. The idea that there was an early form of dark energy is interesting, but that would imply more dark energy at early times, and that it has (mostly) since decayed away.
But we have a separate way to measure the expansion rate of the Universe: one that doesn’t use the distance ladder method at all. Because we understand the early history of our Universe so well — a Universe that started from the hot, dense, rapidly expanding state of the hot Big Bang, seeded by a spectrum of seed fluctuations from the preceding period of cosmic inflation — we know that some incredibly important features were imprinted into the cosmos from these early times. In particular, for the first few hundred thousand years of cosmic history, it was too hot to form neutral atoms, and all the charged particles (atomic nuclei and electrons) were in the state of an ionized plasma. Gravitation worked to attract matter into the overdense regions, but radiation streamed out of those regions, creating a series of “bounce” features in this primeval plasma.
Those features then show up at late times, imprinted in the Big Bang’s leftover glow (the cosmic microwave background, or CMB), as well as in the large-scale clustering of galaxies. By measuring the light from the CMB today, after the Universe has expanded for a further 13.8 billion years, as well as by measuring these acoustic oscillation features in the large-scale structure of the Universe, we can get an independent method for how the Universe has expanded over its history. Instead of starting nearby and looking back, we start at the Big Bang and evolve these key features forward. It’s not a distance ladder method, but rather an early relic method. And unlike the distance ladder value of around 73 km/s/Mpc, it gives a much lower value: of 67 km/s/Mpc, also with a tiny uncertainty of around 1%.

A 2023-era analysis of the various measurements for the expansion rate using distance ladder methods, dependent on which sample, which analysis, and which set of indicators are used. Note that the CCHP group, the only one to obtain a “low” value of the expansion rate, is only reporting statistical uncertainties, and does not quantify their systematic uncertainties at present. There is overwhelming consensus agreement that the expansion rate is around 73 km/s/Mpc using a wide variety of distance ladder methods.
That is what the “Hubble tension” actually is. Is it related to dark energy? Only tangentially. Yes, they both can leverage measurements of type Ia supernovae, because type Ia supernovae are the most precise distance indicator we have for distance ladder measurements and also because type Ia supernova measurements are so important for measuring dark energy. But if we ignored type Ia supernovae entirely, we have many other methods for measuring both:
- the expansion rate today, which leads to the Hubble tension problem,
- and the history of the expanding Universe at great distances, which is how we detect dark energy.
They are two different problems, both aided by observations from our powerful space telescopes like Hubble and JWST, and they both relate to the expanding Universe, but they are not the same problem.
This brings us up to two of the biggest questions for modern cosmology here in the present day: what is the nature of dark energy (is it a cosmological constant or not), and when it comes to the Hubble tension, what is the explanation behind why these two different classes of measurements lead to two entirely different, and mutually incompatible, results for how fast the Universe is expanding today? These are big questions, and while I can’t offer a solution to them (if I could, they’d probably give me a Nobel Prize; this is a current topic of research for thousands of astronomers and astrophysicists), I can tell you where we are today, and what prospects exist for solving these mysteries.

This graph shows a comparison between the value of H0, or the expansion rate today, as derived from Hubble space telescope Cepheids and anchors as well as other subsamples of JWST Cepheids (or other types of stars) and anchors. A comparison to Planck, which uses the early relic method instead of the distance ladder method, is also shown. Very clearly, the distance ladder and early relic methods do not yield mutually compatible results.
We know — and this much is not likely to change, based on the full suite of evidence we’ve gathered — that our Universe is:
- accelerating,
- dominated by dark energy,
- that dark energy either is or is very close to behaving like a cosmological constant,
- that in the Universe, 13.8 billion years (with ~1% uncertainty) have elapsed since the hot Big Bang,
- and that the Universe is currently expanding at approximately 70 km/s/Mpc.
However, there are many things, at the same time, that we aren’t sure of, and these uncertainties are themselves quite profound. We aren’t certain, for example:
- that dark energy is a constant,
- that other important, energy-containing components of the Universe haven’t evolved or decayed in non-standard ways,
- or that both camps, the 67 km/s/Mpc for the early relic method and the 73 km/s/Mpc for the distance ladder method, aren’t getting the correct value for the method they’re using.
It’s that last possibility that keeps me up at night, to be honest. After all, the CMB is definitively an early relic method, and yields that lower value with an uncertainty of no more than 1%. All distance ladder methods, when aggregated together, average out to a value of 73-74 km/s/Mpc, with the best of those methods having gotten down to a 1-2% uncertainty. If you add in the large-scale structure measurements to either the early relic (CMB) or the distance ladder (e.g., parallax, Cepheid, and type Ia supernova) results, they’re fully consistent with either one.
And yet, the two values are a full 9% different from one another, and if you try to put all three measurement data sets together, you’ll find that they are fundamentally not compatible with a single, in-between answer.
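Just how incompatible those two values are can be quantified the way astronomers typically do: by expressing the difference in units of the combined uncertainty ("sigma"). A minimal sketch, using illustrative error bars of roughly 1% on each measurement:

```python
# Gauge the statistical tension between two measurements by dividing
# their difference by the quadrature sum of their uncertainties.
# Values below are illustrative: 73 +/- 1.0 (distance ladder),
# 67 +/- 0.7 (early relic), each in km/s/Mpc.
import math

def tension_sigma(v1: float, s1: float, v2: float, s2: float) -> float:
    """Return the discrepancy between two measurements in units of sigma."""
    return abs(v1 - v2) / math.sqrt(s1**2 + s2**2)

print(round(tension_sigma(73.0, 1.0, 67.0, 0.7), 1))  # ~4.9 sigma
```

A roughly 5-sigma discrepancy is far too large to dismiss as a statistical fluke, which is precisely why the Hubble tension is taken so seriously.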

This fun graphic illustrates the tension on Λ, Einstein’s cosmological constant, exerted by combining supernova data (right), baryon acoustic oscillations (left), and the cosmic microwave background (top). When all three data sets are combined, the idea of a cosmological constant struggles to hold together; it’s possible that something, but perhaps not necessarily Λ, is going to give.
That’s why the Hubble tension is such a profound puzzle. Unlike the arguments over the expansion rate from some 50 years ago, the uncertainties we have today aren’t large; they’re not 10% or more (at just 1-sigma) of the total value; they’re only a few percent or even as low as 1%. People are no longer disagreeing over how they measure the same thing using the same method; people measuring the same things using the same methods get similar, consistent-with-one-another results. In fact, people using different “measuring sticks” that fall under the same class of method, either the distance ladder or the early relic method, still yield the same answers as one another.
But if you measure the same thing in fundamentally different ways, they should give you results that agree. For the Hubble constant, they don’t agree, and we don’t know why. Many have given reasons why “oh, maybe this one class of observations is biased or flawed,” and those reasons have all been disproven, one-by-one, as scientists have painstakingly established the robustness of what they’ve done.
So what’s the resolution going to be? New particles? New fields? An early form of one type of energy that decayed into another? A hitherto undiscovered, profound error?
We aren’t sure. Dark energy has been with us for over 25 years, and isn’t going anywhere, although it might not be the cosmological constant we commonly assume. And is there one, true expansion rate that can be applied equally, regardless of the method we used to measure it? It doesn’t seem to be, and that’s what the Hubble tension is. These two puzzles are both relevant to the fundamental properties and constituents of the Universe, but as far as the resolution goes, that will have to await our scientific future. For right now, we just don’t know.
Send in your Ask Ethan questions to startswithabang at gmail dot com!