New data continues to refine the margin of error for the Hubble constant

The uncertainty of science: New data using the Webb Space Telescope’s spectroscopic capabilities has provided a more refined measure of the expansion rate of the universe, dubbed the Hubble constant.

According to previous research, that rate could be anywhere from 67.4 to 73.2 kilometers per second per megaparsec, depending on whether you rely on data from the Planck orbiter or from the Hubble Space Telescope. Though this difference appears reasonable considering the uncertainties and assumptions that go into the research that determines both numbers, astronomers have been unhappy with it. The numbers should match, and they don’t.

Now new data from Webb suggests this difference really is nothing more than the margin of error caused by the many uncertainties and assumptions involved. That new Webb data measured the Hubble constant using three different methods, all similar to those used by Hubble, and came up with 67.85, 67.96, and 72.04, all falling between the earlier Planck and Hubble numbers.

In other words, all the data is beginning to fall within this margin of error.
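To see concretely what “falling within” means here, a minimal sketch (the three Webb values and the earlier Planck-Hubble range are copied from the text above; the code itself is purely illustrative):

```python
# Earlier disputed range of published values (km/s/Mpc), per the text above
PLANCK_LOW, HUBBLE_HIGH = 67.4, 73.2

# The three new Webb-based estimates quoted above
webb_estimates = [67.85, 67.96, 72.04]

# Every new estimate lands inside the old disputed range
for h0 in webb_estimates:
    assert PLANCK_LOW <= h0 <= HUBBLE_HIGH
print("All three Webb estimates lie within the earlier Planck-Hubble spread.")
```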

Astronomers are without doubt still going to argue about this, but it does appear that the research is beginning to coalesce around an approximate number. More important, in terms of cosmology these results confirm the theory that the expansion of the universe is accelerating (the unknown cause of which has been dubbed “dark energy” simply because it needs a name), since they confirm the method used to measure that expansion rate in the very distant universe.

Keep your minds open, however. There remain many questions and uncertainties with all these conclusions. Nothing is settled, nor is it likely to be for decades, if not centuries.

Hubble and Webb confirm decade-long conflict in universe’s expansion rate

The uncertainty of science: New data from both the Hubble and Webb space telescopes has confirmed Hubble’s previous measurement of the Hubble constant, the rate at which the universe is expanding. The problem is that this number still differs significantly from the expansion rate determined by observations of the cosmic microwave background by the Planck space telescope.

Hubble and Webb come up with an expansion rate of 73 km/s/Mpc, while Planck found a rate of 67 km/s/Mpc. Though this difference appears small, the scientists in both groups claim their margins of error are much smaller than that difference, which means both can’t be right.

You can read the paper for these new results here.

The bottom-line mystery remains. The data is clearly telling us one of two things: 1) the many assumptions that go into these numbers might be incorrect, explaining the difference, or 2) there is something fundamentally wrong with the Big Bang theory that cosmologists have been promoting for more than half a century as the only explanation for the formation of the universe.

The solution could also be a combination of both: our data and our theories are wrong.

The uncertainty of science as proven by the Webb Space Telescope

A long, detailed article was released today at Space.com, describing the many contradictions in the data coming back from the Webb Space Telescope that seriously challenge all the theories of cosmologists about the nature of the universe, as well as its beginning in a single Big Bang.

The article is definitely worth reading, but be warned that it treats science as a certainty that should never have such contradictions, as illustrated first by its very headline: “After 2 years in space, the James Webb Space Telescope has broken cosmology. Can it be fixed?”

“Science” isn’t broken in the slightest. All Webb has done is provide new data that does not fit the theories. As physicist Richard Feynman once stated bluntly in teaching students the scientific method,

“It doesn’t make a difference how beautiful your guess is, it doesn’t make a difference how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it’s wrong.”

Cosmologists for decades have been guessing in proposing their theories about the Big Bang, the expansion of the universe, and dark matter, based on only a tiny amount of data that had been obtained with enormous assumptions and uncertainties. It is therefore not surprising (nor was it ever surprising) that Webb has blown holes in their theories.

For example, the article spends a lot of time discussing the Hubble constant, describing how observations using different instruments (including Webb) have come up with two conflicting numbers for it — either 67 or 74 kilometers per second per megaparsec. No one can resolve this contradiction. No theory explains it.

To me the irony is that back in the 1990s, when Hubble made its first good measurements of the Hubble constant, these same scientists were certain that the number Hubble came up with then, around 80 kilometers per second per megaparsec, was correct.

They didn’t really understand reality then, and they don’t yet understand it now.

What cosmologists must do is back away from their theories and recognize the vast areas of ignorance that exist. Once that is done, they might have a chance to resolve the conflict between the data obtained and the theories proposed, and come up with new theories that might work (with great emphasis on the word “might”). Complaining about the paradoxes will accomplish nothing.

Astronomers make first radio observations of a key type of supernova

The uncertainty of science: Using a variety of telescopes, astronomers have not only made the first radio observations of a key type of supernova, they have also detected helium in the data, suggesting that this particular supernova was atypical even for its type.

This marks the first confirmed Type Ia supernova triggered by a white dwarf star that pulled material from a companion star with an outer layer consisting primarily of helium; normally, in the rare cases where the material stripped from the outer layers of the donor star could be detected in spectra, this was mostly hydrogen.

Type Ia supernovae are important for astronomers since they are used to measure the expansion of the universe. However, the origin of these explosions has remained an open question. While it is established that the explosion is caused by a compact white dwarf star that somehow accretes too much matter from a companion star, the exact process and the nature of the progenitor is not known. [emphasis mine]

The highlighted sentences are really the most important take-away from this research. Type Ia supernovae were the phenomenon used by cosmologists to detect the unexpected acceleration of the universe’s expansion billions of years ago. That research assumed these supernovae were well understood and consistently produced the same amount of energy and light, no matter how far away they were or what specific conditions caused them.

This new supernova research illustrates how absurd that assumption was. Type Ia supernovae are produced by the interaction of two stars, both of which could have innumerable unique features. It is therefore unreasonable for a scientist to assume all such supernovae are going to be identical in their output. And yet, that is what the cosmologists did in declaring the discovery of dark energy in the late 1990s.

It is also what the scientists who performed this research do. To quote one of the co-authors: “While normal Type Ia supernovae appear to always explode with the same brightness, this supernova tells us that there are many different pathways to a white dwarf star explosion.”

Forgive me if I remain very skeptical.

Conflict in Hubble constant continues to confound astronomers

The uncertainty of science: In reviewing their measurements of the Hubble constant using a variety of proxy distance tools, such as distant supernovae, astronomers recently announced that their numbers must be right, even though those numbers do not match the Hubble constant measured using completely different tools.

Most measurements of the current acceleration of the universe (called the Hubble constant, or H0) based on stars and other objects relatively close to Earth give a rate of 73 km/s/Mpc. These are referred to as “late-time” measurements [the same as confirmed by the astronomers in the above report]. On the other hand, early-time measurements, which are based on the cosmic microwave background emitted just 380,000 years after the Big Bang, give a smaller rate of 68 km/s/Mpc.

They can’t both be right. Either something is wrong with the standard cosmological model for our universe’s evolution, upon which the early-time measurements rest, or something is wrong with the way scientists are working with late-time observations.

The astronomers are now claiming that their late-time observations must be right, which really means either that something about the present theories of the Big Bang is fundamentally wrong and our understanding of early cosmology is very incomplete, or that everyone’s measurements are faulty.

Based on the number of assumptions used in both measurements, it is not surprising the results don’t match. Some of those assumptions are certainly wrong, but correcting the error will require a lot more data, which will only become available when astronomers have much bigger telescopes of all kinds, both on Earth and above the atmosphere. Their present tools are insufficient for untangling this mystery.

An astrophysicist explains cosmology’s theoretical failures

Link here. The astrophysicist, Paul Sutter, does a very nice job of outlining the conundrum that has been causing astrophysicists to tear their hair out for the past decade-plus.

In the two decades since astronomers discovered dark energy, we’ve come upon a little hitch: Measurements of the expansion rate of the universe (and so its age) from both the CMB [cosmic microwave background] and supernovas have gotten ever more precise, but they’re starting to disagree. We’re not talking much; the two methods are separated by only 10 million or 20 million years in estimating the 13.77-billion-year history of the universe. But we’re operating at such a level of precision that it’s worth talking about.

If anything, the failure of these two measurements to agree — measurements based on data spanning billions of light-years, which means billions of years in both space and time — is a perfect illustration of the uncertainty of science. Astrophysicists are trying to come up with answers based on data that is quite thin, filled with gaps in knowledge, and loaded with assumptions. It is therefore actually surprising that the two numbers agree as well as they do.

Sutter, being in the CMB camp, puts most of the blame for this failure on the uncertainty of what we know about supernovae. He could very well be right. The assumptions about supernovae used to measure the expansion rate of the universe are many. There are also a lot of gaps in our knowledge, including the lack of a full understanding of the process that produces supernovae.

Sutter, however, I think puts too much faith in the theoretical conclusions of the astrophysics community that have determined the age of the universe based on the CMB. The uncertainties there are just as great. Good scientists should remain skeptical of this as well. Our knowledge of physics is still incomplete. Physicists really don’t know all the answers yet.

In the end, however, Sutter does pin down the biggest problem in cosmology:

The “crisis” is a good excuse to keep writing papers, because we’ve been stumped by dark energy for over two decades, with a lot of work and not much understanding. In a sense, many cosmologists want to keep the crisis going, because as long as it exists, they have something to talk about other than counting down the years to the next big mission.

In other words, the discussion now is sometimes less about science, theories, and cosmology, and more about funding and career promotion. What a shock!

Universe’s expansion rate found to differ in different directions

The uncertainty of science: Using data from two space telescopes, astronomers have found that the universe’s expansion rate appears to differ depending on the direction you look.

This latest test uses a powerful, novel and independent technique. It capitalizes on the relationship between the temperature of the hot gas pervading a galaxy cluster and the amount of X-rays it produces, known as the cluster’s X-ray luminosity. The higher the temperature of the gas in a cluster, the higher the X-ray luminosity is. Once the temperature of the cluster gas is measured, the X-ray luminosity can be estimated. This method is independent of cosmological quantities, including the expansion speed of the universe.

Once they estimated the X-ray luminosities of their clusters using this technique, scientists then calculated luminosities using a different method that does depend on cosmological quantities, including the universe’s expansion speed. The results gave the researchers apparent expansion speeds across the whole sky — revealing that the universe appears to be moving away from us faster in some directions than others.

The team also compared this work with studies from other groups that have found indications of a lack of isotropy using different techniques. They found good agreement on the direction of the lowest expansion rate.

More information here.
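To make the logic of that technique concrete: the gas temperature gives a luminosity estimate that is independent of cosmology, while the measured X-ray flux gives a second estimate that depends on distance, and hence on the expansion rate. Schematically (the scaling relation is written in generic form here; the actual calibration is in the paper):

\[
L_{\mathrm{pred}} \propto T^{\alpha}, \qquad
L_{\mathrm{cosmo}} = 4\pi \, d_L(z)^2 \, f_X, \qquad
d_L \approx \frac{cz}{H_0} \quad (z \ll 1)
\]

Setting the two luminosity estimates equal for the clusters in each patch of sky yields an apparent H_0 for that direction. If the universe expanded isotropically, every direction should return the same value; it apparently does not.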

The other research, mentioned in the last paragraph of the quote above, describes results posted here in December. For some reason that research did not get the publicity of today’s research, possibly because it had not yet been confirmed by others. It now has.

What this research tells us, most of all, is that dark energy — the mysterious force that is theorized to cause the universe’s expansion rate to accelerate, not slow down as you would expect — might not exist.

Update: I’ve decided to embed, below the fold, the very clear explanatory video made by one of the scientists doing that other research. Very helpful in explaining this very knotty science.

New estimate for Hubble constant differs from previous, already conflicting results

The uncertainty of science: Using gravitational lensing effects, scientists have produced a new estimate of the Hubble constant, the rate at which the universe is expanding, and have come up with a number that differs from previous results.

Using adaptive optics technology on the W.M. Keck telescopes in Hawaii, they arrived at an estimate of 76.8 kilometers per second per megaparsec. As a parsec is a bit over 30 trillion kilometers and a megaparsec is a million parsecs, that is an excruciatingly precise measurement. In 2017, the H0LICOW team published an estimate of 71.9, using the same method and data from the Hubble Space Telescope.

The new SHARP/H0LICOW estimates are comparable to that by a team led by Adam Riess of Johns Hopkins University, 74.03, using measurements of a set of variable stars called the Cepheids. But it’s quite a lot different from estimates of the Hubble constant from an entirely different technique based on the cosmic microwave background. That method, based on the afterglow of the Big Bang, gives a Hubble constant of 67.4, assuming the standard cosmological model of the universe is correct.

An estimate by Wendy Freedman and colleagues at the University of Chicago comes close to bridging the gap, with a Hubble constant of 69.8 based on the luminosity of distant red giant stars and supernovae.

So five different teams have come up with five different numbers, ranging from 67.4 to 76.8 kilometers per second per megaparsec. Based on the present understanding of cosmology, however, the range should have been far smaller. By now physicists had expected these different results to converge. The differences suggest that either their theories are wrong, or their methods of measurement are flawed.
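A quick tally of the five estimates quoted in this post (values copied from the text above; the script is merely a convenience for the arithmetic):

```python
# The five Hubble-constant estimates quoted above (km/s/Mpc)
estimates = {
    "SHARP/H0LICOW (Keck adaptive optics)": 76.8,
    "H0LICOW 2017 (Hubble)": 71.9,
    "Riess et al. (Cepheids)": 74.03,
    "Freedman et al. (red giants)": 69.8,
    "Planck (cosmic microwave background)": 67.4,
}

lo, hi = min(estimates.values()), max(estimates.values())
spread = hi - lo
print(f"spread: {spread:.1f} km/s/Mpc ({spread / lo:.1%} of the lowest value)")
# spread: 9.4 km/s/Mpc (13.9% of the lowest value)
```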

The most likely explanation is that we presently have too little knowledge about the early universe to form any solid theories. These measurements are based on a very tiny amount of data and require a lot of assumptions.

New Hubble data baffles cosmologists about universe’s expansion rate

The uncertainty of science: New and very firm data from the Hubble Space Telescope on the universe’s expansion rate conflicts with just-as-firm data obtained by Europe’s Planck astronomical probe.

According to Planck, the present universe should be expanding at a rate of 67 kilometers per second per megaparsec. According to Hubble, the actual expansion rate is 74 kilometers per second per megaparsec.

And according to the scientists involved, both data sets are reliable and trustworthy, leaving them baffled at the difference.

“This is not just two experiments disagreeing,” explained [lead researcher and Nobel laureate Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University, in Baltimore, Maryland.] “We are measuring something fundamentally different. One is a measurement of how fast the universe is expanding today, as we see it. The other is a prediction based on the physics of the early universe and on measurements of how fast it ought to be expanding. If these values don’t agree, there becomes a very strong likelihood that we’re missing something in the cosmological model that connects the two eras.”

Ya think? Any cosmologist who claims we really understand what is going on, based on our present fragile and very limited knowledge, is either fooling him or herself or is trying to fool us.

I should note that there seems to be an effort, based on the press release above as well as this second one, to downplay the uncertainties that exist in this cosmological work. Both releases fail to note that when scientists announced their first expansion-rate estimate from Hubble’s first data back in 1995, they claimed with absolute certainty that the expansion rate was 80 kilometers per second per megaparsec. At the time some scientists, led by the late Allan Sandage of the Carnegie Observatories, disputed this high number, claiming it could be as low as 50. Some even said it could be as low as 30. Sandage especially found himself pooh-poohed by the cosmological community for disputing the 80 number pushed by Hubble’s scientists in 1995.

In the end, the Hubble scientists in 1995 were closer to today’s Hubble number than Sandage was, but his estimate was not off by much more, and he was right when he said the number had to be lower. Either way, Hubble’s modern estimate of 74 for the present expansion rate is very well constrained, and is a far more trustworthy number than previous estimates.

However, do we know with any reliability what the expansion rate was billions of years ago? No. Cosmologists think it was faster, based on supernovae data, which is where the theory of dark energy comes from. It is also where Planck’s predictions come from.

That early expansion rate, however, is based on such tentative data, containing so many assumptions and such large margins of error, that no serious scientist should take it too seriously. It suggests things, but it certainly doesn’t confirm them.

This is why their faith in the numbers derived from Planck puzzles me. It is based on a “prediction,” as Riess admits in the quote above, which means it is based on a theoretical model. It is also based on that very tentative early supernova data, which makes it very suspect to me.

The Hubble data is real data, obtained by looking at nearby stars in a very precise manner. Its margin of error is very small. It contains only a few assumptions, mostly involving our understanding of the Cepheid variable stars that Hubble observed. While skepticism is always called for, trusting this new Hubble data more is perfectly reasonable.

In the end, to really solve this conflict will require better data from the early universe. Unfortunately, that is not something that will be easy to get.

The unfinished search for the Hubble constant

The uncertainty of science: Scientists continue to struggle in their still-unfinished search for the precise expansion rate of the universe, dubbed the Hubble constant in honor of Edwin Hubble, who discovered that expansion.

The problem is, the values obtained from [two different] methods do not agree—a discrepancy cosmologists call “tension.” Calculations from redshift place the figure at about 73 (in units of kilometers per second per megaparsec); the CMB estimates are closer to 68. Most researchers first thought this divergence could be due to errors in measurements (known among astrophysicists as “systematics”). But despite years of investigation, scientists can find no source of error large enough to explain the gap.

I am especially amused by these numbers. Back in 1995 NASA held a heavily touted press conference to announce that new data from the Hubble Space Telescope had finally determined the exact number for the Hubble constant: 80 (using the same units as above). The press went hog wild over this now-“certain” conclusion, even though other astronomers disputed it and offered lower numbers ranging from 30 to 65. Astronomer Allan Sandage of the Carnegie Observatories was especially critical of NASA’s certainty, and was duly ignored by most of the press.

In writing my own article about this result, I was especially struck during my phone interview with Wendy Freedman, the lead scientist for Hubble’s results, by her own certainty. When I noted that her data was very slim, the measurements of only a few stars from one galaxy, she pooh-poohed this point. Her result had settled the question!

I didn’t buy her certainty then, and in my article for The Sciences, most appropriately entitled “The Hubble Inconstant,” I made it a point to note Sandage’s doubts. In the end it turns out that Sandage’s proposed number, between 53 and 65, was the better prediction.

Still, the science for the final number remains unsettled, with two methods coming up with numbers that differ by a little less than ten percent, and no clear explanation for that difference. Isn’t science wonderful?

Conflict in Hubble constant increases with new data from Hubble and Gaia

The uncertainty of science: New data from the Hubble Space Telescope and Gaia continues to produce a different value for the Hubble constant, the expansion rate of the universe, than that derived from data from the Planck space telescope.

Using Hubble and newly released data from Gaia, Riess’ team measured the present rate of expansion to be 73.5 kilometers (45.6 miles) per second per megaparsec. This means that for every 3.3 million light-years farther away a galaxy is from us, it appears to be moving 73.5 kilometers per second faster. However, the Planck results predict the universe should be expanding today at only 67.0 kilometers (41.6 miles) per second per megaparsec. As the teams’ measurements have become more and more precise, the chasm between them has continued to widen, and is now about 4 times the size of their combined uncertainty.
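The “combined uncertainty” arithmetic in that last sentence is straightforward. A minimal sketch (the two central values are from the quote; the 1-sigma uncertainties are assumed here for illustration, since the press release excerpt does not include them):

```python
from math import sqrt

# Central values from the quote above (km/s/Mpc); the 1-sigma
# uncertainties are assumed for illustration only
h0_late, sigma_late = 73.5, 1.6    # Hubble + Gaia measurement
h0_early, sigma_early = 67.0, 0.7  # Planck-based prediction

difference = h0_late - h0_early
combined_sigma = sqrt(sigma_late**2 + sigma_early**2)
print(f"difference = {difference:.1f} km/s/Mpc "
      f"= {difference / combined_sigma:.1f} x the combined uncertainty")
# difference = 6.5 km/s/Mpc = 3.7 x the combined uncertainty
```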

The problem really is very simple: We haven’t the faintest idea what is going on. We have some data, but we also have enormous gaps in our knowledge of the cosmos. Moreover, most of our cosmological data is reliant on too many assumptions that could be wrong, or simply in error. And the errors can be tiny and still throw the results off by large amounts.

The one thing that good science and skepticism teaches is humbleness. Do not be too sure of your conclusions. The universe is a large and complex place. It likes to throw curve balls at us, and if we swing too soon we will certainly miss.

Hubble finds new figure for universe expansion rate

The uncertainty of science: Using data from the Hubble Space Telescope, astronomers have found evidence that the universe’s expansion rate is faster than previous measurements estimated.

The new findings show that eight Cepheid variables in our Milky Way galaxy are up to 10 times farther away than any previously analyzed star of this kind. Those Cepheids are more challenging to measure than others because they reside between 6,000 and 12,000 light-years from Earth. To handle that distance, the researchers developed a new scanning technique that allowed the Hubble Space Telescope to periodically measure a star’s position at a rate of 1,000 times per minute, thus increasing the accuracy of the stars’ true brightness and distance, according to the statement.

The researchers compared their findings to earlier data from the European Space Agency’s (ESA) Planck satellite. During its four-year mission, the Planck satellite mapped leftover radiation from the Big Bang, also known as the cosmic microwave background. The Planck data revealed a Hubble constant between 67 and 69 kilometers per second per megaparsec. (A megaparsec is roughly 3 million light-years.)

However, the Planck data gives a constant about 9 percent lower than that of the new Hubble measurements, which estimate that the universe is expanding at 73 kilometers per second per megaparsec, therefore suggesting that galaxies are moving faster than expected, according to the statement.

“Both results have been tested multiple ways, so barring a series of unrelated mistakes, it is increasingly likely that this is not a bug but a feature of the universe,” Riess said. [emphasis mine]
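The scanning technique described in the quote is a way of measuring stellar parallax, from which distance follows directly. As a sketch of the underlying relation (standard astrometry, not specific to this paper):

\[
d \,[\mathrm{parsecs}] = \frac{1}{p \,[\mathrm{arcseconds}]}
\]

A Cepheid 10,000 light-years away (about 3,100 parsecs) thus shows a parallax of only about 0.0003 arcseconds, which is why the researchers needed thousands of repeated position measurements to beat down the noise.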

I should point out that one of the first big results from Hubble in 1995 (which also happened to be the subject one of my early published stories), the estimate then for the Hubble constant was 80 kilometers per second per megaparsec. At the time, the astronomers who did the research were very certain they had it right. Others have theorized that the number could be as low as 30 kilometers per second per megaparsec.

What is important about this number is that it determines how long ago the Big Bang is thought to have occurred. Lower numbers mean it took place farther in the past; higher numbers mean a younger universe.
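The connection is direct: to a first approximation, ignoring how the expansion rate has changed over time, the age of the universe is simply the reciprocal of the Hubble constant. As a back-of-the-envelope version (a standard unit conversion, not taken from the article):

\[
t_H \approx \frac{1}{H_0} \approx \frac{978}{H_0 \,[\mathrm{km\,s^{-1}\,Mpc^{-1}}]} \ \text{billion years}
\]

So a Hubble constant of 67 gives roughly 14.6 billion years, while 73 gives roughly 13.4 billion years. The real calculation folds in how the rate has varied over cosmic history, but the inverse relationship between the constant and the age holds.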

That scientists keep getting different results only suggests to me that they simply do not yet have enough data to lock the number down firmly.

“One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The uncertainty of science: New research suggests that astronomers have little understanding of the supernovae that they use to estimate the distance to most galaxies, estimates they then used to discover dark energy as well as measure the universe’s expansion rate.

The exploding stars known as type Ia supernovae are so consistently bright that astronomers refer to them as standard candles — beacons that are used to measure vast cosmological distances. But these cosmic mileposts may not be so uniform. A new study finds evidence that the supernovae can arise by two different processes, adding to lingering suspicions that standard candles aren’t so standard after all.

The findings, which have been posted on the arXiv preprint server and accepted for publication in the Astrophysical Journal, could help astronomers to calibrate measurements of the Universe’s expansion. Tracking type Ia supernovae showed that the Universe is expanding at an ever-increasing rate, and helped to prove the existence of dark energy — advances that secured the 2011 Nobel Prize in Physics.

The fact that scientists don’t fully understand these cosmological tools is embarrassing, says the latest study’s lead author, Griffin Hosseinzadeh, an astronomer at the University of California, Santa Barbara. “One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The key to understanding this situation is to maintain a healthy skepticism about any cosmological theory or discovery, no matter how enthusiastically touted by the press and astronomers. The good astronomers do not push these theories with great enthusiasm as they know the feet of clay on which they stand. The bad ones try to use the ignorant mainstream press to garner attention, and thus funding.

For the past two decades the good astronomers have been diligently checking and rechecking the data and the supernovae used to discover dark energy. Up to now this checking seems to still suggest the universe’s expansion is accelerating on large scales. At the same time, our knowledge of supernovae remains sketchy, and thus no one should assume we understand the universe’s expansion rate with any confidence.

Expansion rate of the universe might not be accelerating

The uncertainty of science: A new review of the data suggests that the expansion of the universe might not be accelerating as posited based on research done in the 1990s.

Making use of a vastly increased data set – a catalogue of 740 Type Ia supernovae, more than ten times the original sample size – the researchers have found that the evidence for acceleration may be flimsier than previously thought, with the data being consistent with a constant rate of expansion.

The study is published in the Nature journal Scientific Reports.

Professor Sarkar, who also holds a position at the Niels Bohr Institute in Copenhagen, said: ‘The discovery of the accelerating expansion of the universe won the Nobel Prize, the Gruber Cosmology Prize, and the Breakthrough Prize in Fundamental Physics. It led to the widespread acceptance of the idea that the universe is dominated by “dark energy” that behaves like a cosmological constant – this is now the “standard model” of cosmology.

‘However, there now exists a much bigger database of supernovae on which to perform rigorous and detailed statistical analyses. We analysed the latest catalogue of 740 Type Ia supernovae – over ten times bigger than the original samples on which the discovery claim was based – and found that the evidence for accelerated expansion is, at most, what physicists call “3 sigma”. This is far short of the 5 sigma standard required to claim a discovery of fundamental significance.’

I am not surprised. In fact, I remain continually skeptical about almost all cosmological theories. They might be the best we have, based on the facts available, but they are also based upon incredibly flimsy facts.
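For readers unfamiliar with the jargon in that quote, sigma levels translate into probabilities that a result is a pure chance fluctuation. A quick calculation (standard statistics, not from the paper) shows why physicists insist on 5 sigma:

```python
from math import erfc, sqrt

def two_sided_p(n_sigma: float) -> float:
    """Probability of a chance fluctuation at least n_sigma large."""
    return erfc(n_sigma / sqrt(2))

print(f"3 sigma: p ~ {two_sided_p(3):.4f}  (about 1 in 370)")
print(f"5 sigma: p ~ {two_sided_p(5):.1e}  (about 1 in 1.7 million)")
```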

Universe’s expansion rate contradicts dark energy data

The uncertainty of science: New measurements of the universe’s expansion rate, dubbed the Hubble constant, contradict theoretical predictions based on previous data.

For their latest paper, Riess’s team studied two types of standard candles in 18 galaxies using hundreds of hours of observing time on the Hubble Space Telescope. “We’ve been going gangbusters with this,” says Riess.

Their paper, which has been submitted to a journal and posted on the arXiv online repository on 6 April, reports that they measured the constant with an uncertainty of 2.4%, down from a previous best result of 3.3%. They find the speed of expansion to be about 8% faster than that predicted based on Planck data, says Riess. [emphasis mine]

I highlight the number of galaxies used to get this data because I think these scientists are being a bit overconfident about the uncertainty of their data. The universe has untold trillions of galaxies. To say they have narrowed their uncertainty down to only 2.4% based on 18 is the height of silliness.

But then, the lead scientist, Adam Riess, recognizes this, as he is also quoted in the article saying “I think that there is something in the standard cosmological model that we don’t understand.”

Using data from the Spitzer Space Telescope astronomers have narrowed the universe’s rate of expansion to about 74.3 kilometers per second per megaparsec.

The uncertainty of science: Using data from the Spitzer Space Telescope astronomers have narrowed the universe’s rate of expansion to about 74.3 kilometers per second per megaparsec.

The importance of this number, also called the Hubble constant, is that it allows astronomers to extrapolate more precisely backward to when they believe the Big Bang occurred, about 13.7 billion years ago. It is also a crucial data point in their effort to understand dark energy, the theorized cause of the acceleration of this expansion on vast scales.

Back in 1995 a team led by Wendy Freedman, the same scientist leading the work above, announced that they had used the Hubble Space Telescope to determine the expansion rate as 80 kilometers per second per megaparsec. Then, the margin of error was plus or minus 17 kilometers. Now the margin of error has been narrowed to plus or minus 2.1 kilometers.

Do I believe these new numbers? No, not really. Science has nothing to do with belief. I do think this is good science, however, and that this new estimate of the Hubble constant is probably the best yet. I would also not be surprised if in the future new data eventually proves this estimate wrong.