The uncertainty of science as proven by the Webb Space Telescope

A long, detailed article was released today at Space.com describing the many contradictions in the data coming back from the Webb Space Telescope, contradictions that seriously challenge all the theories of cosmologists about the nature of the universe as well as its beginning in a single Big Bang.

The article is definitely worth reading, but be warned that it treats science as a certainty that should never have such contradictions, as illustrated first by its very headline: “After 2 years in space, the James Webb Space Telescope has broken cosmology. Can it be fixed?”

“Science” isn’t broken in the slightest. All Webb has done is provide new data that does not fit the theories. As physicist Richard Feynman once stated bluntly in teaching students the scientific method,

“It doesn’t make a difference how beautiful your guess is, it doesn’t make a difference how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it’s wrong.”

For decades cosmologists have been guessing in proposing their theories about the Big Bang, the expansion of the universe, and dark matter, basing those theories on only a tiny amount of data obtained with enormous assumptions and uncertainties. It is therefore not surprising (nor was it ever surprising) that Webb has blown holes in their theories.

For example, the article spends a lot of time discussing the Hubble constant, describing how observations using different instruments (including Webb) have come up with two conflicting numbers for it — either 67 or 74 kilometers per second per megaparsec. No one can resolve this contradiction. No theory explains it.
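
To make the gap concrete, here is a minimal sketch (my own illustration, not from the article) converting each value of the Hubble constant into its naive “Hubble time,” 1/H0, roughly the age the universe would have if it had always expanded at that rate:

```python
# Naive "Hubble time" 1/H0: roughly the age the universe would have if
# it had always expanded at the same rate. Illustration only; real age
# estimates fold in the universe's full expansion history.
KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one billion years

for h0 in (67.0, 74.0):                  # the two conflicting values
    t_sec = KM_PER_MPC / h0              # 1/H0, in seconds
    print(f"H0 = {h0:.0f} km/s/Mpc -> Hubble time ~ "
          f"{t_sec / SEC_PER_GYR:.1f} billion years")

# Prints roughly 14.6 for 67 and 13.2 for 74: the two measurements
# imply timescales that differ by more than a billion years.
```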

To me the irony is that back in the 1990s, when Hubble made its first good measurements of the Hubble constant, these same scientists were certain that the number Hubble came up with, around 90 kilometers per second per megaparsec, was correct.

They didn’t really understand reality then, and they don’t yet understand it now.

What cosmologists must do is back away from their theories and recognize the vast areas of ignorance that exist. Once that is done, they might have a chance to resolve the conflict between the data obtained and the theories proposed, and come up with new theories that might work (with great emphasis on the word “might”). Complaining about the paradoxes will accomplish nothing.

Despite good first images from Euclid, the orbiting telescope has a problem

Even though the first light images from Euclid have been sharp and exactly what astronomers want, the orbiting telescope designed to make a 3D map of billions of galaxies has an issue that will likely place some limits on that map.

When the telescope started booting up, ESA observers were concerned by the appearance of light markings on the first images relayed to Earth. This, ESA confirmed, was due to sunlight filtering into the telescope, “probably through a tiny gap”.

A correction to Euclid’s position was able to offset this issue. It means that while the ESA is confident Euclid will be fine to proceed with its mapping mission, particular orientations for the telescope may not be possible.

A limitation like this means that the telescope will not be able to look in some directions and get mapping images. The overall map will thus have gaps, though at the moment the scientists appear to think those gaps will not seriously impact the telescope’s overall work. We shall see.

Euclid’s first images look good

Scientists have determined that the first test images from the two cameras on the recently launched orbiting Euclid space telescope are sharp and as expected.

Both VIS and NISP provided these unprocessed raw images. Compared to commercial products, the cameras are immensely more complex. VIS comprises 36 individual CCDs with a total of 609 megapixels and produces high-resolution images of billions of galaxies in visible light. This is how astronomers determine the galaxies’ shapes. The first images already give an impression of the abundance that the data will provide.

NISP’s detector consists of 16 chips with a total of 64 megapixels. It operates in the near-infrared at wavelengths between 1 and 2 microns. In addition, NISP serves as a spectrograph, splitting the light of the captured objects into a spectrum, similar to a rainbow, and allowing for a finer analysis. These data will allow the mapping of the three-dimensional distribution of galaxies.

Knowing that 3D distribution will allow scientists to better determine the nature of both dark energy (related to the acceleration of the universe’s expansion) and dark matter (related to an undiscovered mass that affects the formation and shape of galaxies).

Astronomers make first radio observations of a key type of supernova

The uncertainty of science: Using a variety of telescopes, astronomers have not only made the first radio observations of a key type of supernova, they have also detected helium in the data, suggesting that this particular supernova was atypical even for its type.

This marks the first confirmed Type Ia supernova triggered by a white dwarf star that pulled material from a companion star with an outer layer consisting primarily of helium; normally, in the rare cases where the material stripped from the outer layers of the donor star could be detected in spectra, this was mostly hydrogen.

Type Ia supernovae are important for astronomers since they are used to measure the expansion of the universe. However, the origin of these explosions has remained an open question. While it is established that the explosion is caused by a compact white dwarf star that somehow accretes too much matter from a companion star, the exact process and the nature of the progenitor is not known. [emphasis mine]

The highlighted sentences are really the most important take-away from this research. Type Ia supernovae were the phenomenon used by cosmologists to detect the unexpected acceleration of the universe’s expansion billions of years ago. That research assumed these supernovae were well understood and consistently produced the same amount of energy and light, no matter how far away they were or the specific conditions which caused them.

This new supernova research illustrates how absurd that assumption was. Type Ia supernovae are produced by the interaction of two stars, both of which could have innumerable unique features. It is therefore unreasonable as a scientist to assume all such supernovae are going to be identical in their output. And yet, that is what the cosmologists did in declaring the discovery of dark energy in the late 1990s.

It is also what the scientists who performed this research do. To quote one of the co-authors: “While normal Type Ia supernovae appear to always explode with the same brightness, this supernova tells us that there are many different pathways to a white dwarf star explosion.”

Forgive me if I remain very skeptical.

An astrophysicist explains cosmology’s theoretical failures

Link here. The astrophysicist, Paul Sutter, does a very nice job of outlining the conundrum that has been causing astrophysicists to tear their hair out for the past decade-plus.

In the two decades since astronomers discovered dark energy, we’ve come upon a little hitch: Measurements of the expansion rate of the universe (and so its age) from both the CMB [cosmic microwave background] and supernovas have gotten ever more precise, but they’re starting to disagree. We’re not talking much; the two methods are separated by only 10 million or 20 million years in estimating the 13.77-billion-year history of the universe. But we’re operating at such a level of precision that it’s worth talking about.

If anything, this disagreement between two measurements of data spanning billions of light years — billions in both time and space — is a perfect illustration of the uncertainty of science. Astrophysicists are trying to come up with answers based on data that is quite thin, riddled with gaps in knowledge, and burdened with many assumptions. It is actually surprising that the two numbers agree as well as they do.

Sutter, being in the CMB camp, puts most of the blame for this failure on the uncertainty of what we know about supernovae. He could very well be right. The assumptions about supernovae used to measure the expansion rate of the universe are many. There are also many gaps in our knowledge, including the lack of a full understanding of the process that produces supernovae.

Sutter, however, I think puts too much faith in the theoretical conclusions by which the astrophysics community has determined the age of the universe from the CMB. The uncertainties there are just as great, and good scientists should remain skeptical of those conclusions as well. Our knowledge of physics is still incomplete. Physicists really don’t know all the answers yet.

In the end, though, Sutter does pin down the biggest problem in cosmology:

The “crisis” is a good excuse to keep writing papers, because we’ve been stumped by dark energy for over two decades, with a lot of work and not much understanding. In a sense, many cosmologists want to keep the crisis going, because as long as it exists, they have something to talk about other than counting down the years to the next big mission.

In other words, the discussion now is sometimes less about science, theories, and cosmology, and more about funding and career promotion. What a shock!

Rethinking the theories that explain some supernovae

The uncertainty of science: New data now suggests that the previous consensus among astronomers that type Ia supernovae were caused by the interaction of a large red giant star with a white dwarf might be wrong, and that instead the explosion might be triggered by two white dwarfs.

If this new origin theory turns out to be correct, then it might also throw a big wrench into the theory of dark energy.

The evidence that twin white dwarfs drive most, if not all, type Ia supernovae, which account for about 20% of the supernova blasts in the Milky Way, “is more and more overwhelming,” says Dan Maoz, director of Tel Aviv University’s Wise Observatory, which tracks fast-changing phenomena such as supernovae. He says the classic scenario of a white dwarf paired with a large star such as a red giant “doesn’t happen in nature, or quite rarely.”

Which picture prevails has impacts across astronomy: Type Ia supernovae play a vital role in cosmic chemical manufacturing, forging in their fireballs most of the iron and other metals that pervade the universe. The explosions also serve as “standard candles,” assumed to shine with a predictable brightness. Their brightness as seen from Earth provides a cosmic yardstick, used among other things to discover “dark energy,” the unknown force that is accelerating the expansion of the universe. If type Ia supernovae originate as paired white dwarfs, their brightness might not be as consistent as was thought—and they might be less reliable as standard candles.

If type Ia supernovae are not reliable standard candles, then the entire Nobel Prize results that discovered dark energy in the late 1990s are junk, the evidence used to discover it simply unreliable. Dark energy might simply not exist.

What galls me about this possibility is that it was always present. The certainty in the 1990s about using type Ia supernovae as a standard candle to determine distance was entirely unjustified. Even now astronomers do not really know what causes these explosions. To assume they would always exhibit the same energy release was simply not reasonable.

And yet astronomers in the 1990s did, and thus they foisted the theory of dark energy upon us — that the universe’s expansion was accelerating over vast distances — while winning Nobel Prizes. They still might be right, and dark energy might exist, but it was never very certain, and it still is not.

Much of the fault in this does not lie with the astronomers but with the press, which always likes to sell new theories as a certainty, scoffing at the doubts and areas of ignorance that make the theories questionable. This is just one more example, of which I could cite many, the worst of all being the reporting about global warming.

Universe’s expansion rate found to differ in different directions

The uncertainty of science: Using data from two space telescopes, astronomers have found that the universe’s expansion rate appears to differ depending on the direction you look.

This latest test uses a powerful, novel and independent technique. It capitalizes on the relationship between the temperature of the hot gas pervading a galaxy cluster and the amount of X-rays it produces, known as the cluster’s X-ray luminosity. The higher the temperature of the gas in a cluster, the higher the X-ray luminosity is. Once the temperature of the cluster gas is measured, the X-ray luminosity can be estimated. This method is independent of cosmological quantities, including the expansion speed of the universe.

Once they estimated the X-ray luminosities of their clusters using this technique, scientists then calculated luminosities using a different method that does depend on cosmological quantities, including the universe’s expansion speed. The results gave the researchers apparent expansion speeds across the whole sky — revealing that the universe appears to be moving away from us faster in some directions than others.

The team also compared this work with studies from other groups that have found indications of a lack of isotropy using different techniques. They found good agreement on the direction of the lowest expansion rate.

More information here.
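
To illustrate the logic of the technique quoted above, here is a toy sketch; the scaling constants and cluster numbers are invented for illustration and this is not the team’s actual pipeline:

```python
import math

# Toy version of the cluster method: gas temperature predicts X-ray
# luminosity independent of cosmology; measured flux plus an assumed
# distance gives a second luminosity. Solving for the H0 that
# reconciles the two, cluster by cluster, lets one compare expansion
# rates in different directions on the sky.

A, B = 1.0e44, 3.0       # hypothetical scaling relation: L = A * T^B
C_KM_S = 3.0e5           # speed of light, km/s
CM_PER_MPC = 3.0857e24   # centimeters in one megaparsec

def h0_from_cluster(temp_kev, flux_cgs, z, h0_guess=70.0):
    """H0 that makes the flux-based luminosity match the
    temperature-predicted one (low-redshift approximation d = c*z/H0)."""
    l_pred = A * temp_kev ** B                      # cosmology-free
    d_cm = (C_KM_S * z / h0_guess) * CM_PER_MPC     # assumed distance
    l_flux = 4.0 * math.pi * d_cm ** 2 * flux_cgs   # L = 4*pi*d^2*F
    return h0_guess * math.sqrt(l_flux / l_pred)    # rescale the guess

# Two fake clusters, physically identical, with fluxes chosen as if
# they sat in sky directions with slightly different expansion rates:
print(h0_from_cluster(5.0, 2.3e-9, 0.05))   # ~70 km/s/Mpc
print(h0_from_cluster(5.0, 2.1e-9, 0.05))   # ~67 km/s/Mpc
```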

The other research mentioned in the last paragraph in the quote above describes results posted here in December. For some reason that research did not get the publicity of today’s research, possibly because it had not yet been confirmed by others. It now has.

What this research tells us, most of all, is that dark energy, the mysterious force that is theorized to cause the universe’s expansion rate to accelerate — not slow down as you would expect — might not exist.

Update: I’ve decided to embed, below the fold, the very clear explanatory video made by one of the scientists doing that other research. Very helpful in explaining this very knotty science.

New evidence: dark energy might not exist

The uncertainty of science: New evidence once again suggests that the assumptions that resulted in the invention of dark energy in the late 1990s might have been in error, and that dark energy simply might not exist.

New observations and analysis made by a team of astronomers at Yonsei University (Seoul, South Korea), together with their collaborators at Lyon University and KASI, show, however, that this key assumption is most likely in error. The team has performed very high-quality (signal-to-noise ratio ~175) spectroscopic observations to cover most of the reported nearby early-type host galaxies of SN Ia, from which they obtained the most direct and reliable measurements of population ages for these host galaxies. They find a significant correlation between SN luminosity and stellar population age at a 99.5% confidence level. As such, this is the most direct and stringent test ever made for the luminosity evolution of SN Ia. Since SN progenitors in host galaxies are getting younger with redshift (look-back time), this result inevitably indicates a serious systematic bias with redshift in SN cosmology. Taken at face values, the luminosity evolution of SN is significant enough to question the very existence of dark energy. When the luminosity evolution of SN is properly taken into account, the team found that the evidence for the existence of dark energy simply goes away.

…Other cosmological probes, such as CMB (Cosmic Microwave Background) and BAO (Baryonic Acoustic Oscillations), are also known to provide some indirect and “circumstantial” evidence for dark energy, but it was recently suggested that CMB from Planck mission no longer supports the concordance cosmological model which may require new physics. Some investigators have also shown that BAO and other low-redshift cosmological probes can be consistent with a non-accelerating universe without dark energy. In this respect, the present result showing the luminosity evolution mimicking dark energy in SN cosmology is crucial and is very timely.
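
The team’s central claim can be stated compactly. Below is a minimal sketch, with a made-up drift of 0.1 magnitudes per unit redshift, of how an uncorrected evolution in intrinsic supernova brightness skews inferred distances more and more with redshift, which is exactly the kind of signature that was read as acceleration:

```python
# If intrinsic SN brightness drifts with redshift by ALPHA magnitudes
# per unit z, but the analysis assumes a fixed absolute magnitude,
# every inferred distance is off by a factor 10**(ALPHA * z / 5),
# a bias that grows with redshift and so mimics acceleration.
ALPHA = 0.1   # hypothetical drift, magnitudes per unit redshift

for z in (0.1, 0.5, 1.0):
    bias = 10 ** (ALPHA * z / 5)
    print(f"z = {z:.1f}: inferred distances skewed by {bias - 1:+.1%}")
```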

There was also this story from early December, raising similar questions about the existence of dark energy.

Bottom line: The data that suggested dark energy’s existence was always thin, resting on many assumptions and large margins of uncertainty. This research only underlines that fact, a fact that many cosmologists have frequently tried to sweep under the rug.

Dark energy still might exist, but it behooves scientists to look coldly at the data and always recognize its weaknesses. It appears that, in terms of dark energy, the cosmological community is finally beginning to do so.

New analysis suggests dark energy might not be necessary

The uncertainty of science: A new peer-reviewed paper in a major astronomy science journal suggests that dark energy might not actually exist, and that the evidence for it might simply be because the original data was biased by the Milky Way’s own movement.

What [the scientists in this new paper] found is that the best fit to the data is that the redshift of supernovae is not the same in all directions, but that it depends on the direction. This direction is aligned with the direction in which we move through the cosmic microwave background. And – most importantly – you do not need further redshift to explain the observations.

If what they say is correct, then it is unnecessary to postulate dark energy which means that the expansion of the universe might not speed up after all.

Why didn’t Perlmutter and Riess [the discoverers of dark energy] come to this conclusion? They could not, because the supernovae that they looked at were skewed in direction. The ones with low redshift were in the direction of the CMB dipole; and high redshift ones away from it. With a skewed sample like this, you can’t tell if the effect you see is the same in all directions.

The link is to a blog post by a physicist in the field, commenting on the new paper. Below the fold I have embedded a video from that same physicist that does a nice job of illustrating what she wrote.

This paper does not disprove dark energy. It instead illustrates the large uncertainties involved, and shows solid evidence that the present consensus favoring the existence of dark energy should be questioned.

But then, that’s how real science works. When the data is sketchy or thin, with many assumptions, it is essential that everyone, especially the scientists in the field, question the results. We shall see now if the physics community will do this.

Hat tip to reader Mike Nelson.

Astronomers get best and earliest view of supernovae ever

Using ground-based telescopes as well as the space telescope Kepler, astronomers have obtained their best and earliest view of a Type Ia supernova.

The supernova, named SN 2018oh, was brighter than expected over the first few days. The increased brightness is an indication that it slammed into a nearby companion star. This adds to the growing body of evidence that some, but not all, of these thermonuclear supernovae have a large companion star that triggers the explosion.

Las Cumbres Observatory (LCO), based in Goleta, California, is a global network of 21 robotic telescopes that obtained some of the best data characterizing the supernova in support of the NASA mission. Wenxiong Li, the lead author of one of three papers published today on the finding, was based at LCO when much of the research was underway. Five other LCO astronomers, who are affiliated with the University of California Santa Barbara (UCSB), also contributed to two of the papers.

Understanding the origins of Type Ia supernovae is critical because they are used as standard candles to map out distances in cosmology. They were used to discover Dark Energy, the mysterious force causing the universe to accelerate in its expansion. Astronomers have long known that a supernova is the explosion of a dense white dwarf star (A white dwarf has the mass of the sun, but only the radius of the Earth; one teaspoon of a white dwarf would weigh roughly 23000 pounds) What triggers the explosion is less well understood. One theory holds that the explosions are the merger of two white dwarf stars. Another is that the second star is not a white dwarf at all, but a normal-sized or even giant star that loses only some of its matter to the white dwarf to initiate the explosion. In this theory, the explosion then smashes into the surviving second star, causing the supernova to be exceedingly bright in its early hours.

Finding that Type Ia supernovae can be brighter than previously believed throws a wrench into the results that discovered dark energy, since those results made assumptions about the brightness, and thus the distance, of those supernovae. If the brightness of these supernovae is not as reliable as expected, they are also less useful as a standard candle for estimating distance.
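
To see how hard that wrench hits, consider the standard-candle arithmetic itself. The sketch below uses the textbook distance-modulus formula with illustrative numbers of my own choosing:

```python
# Standard-candle arithmetic: distance modulus m - M = 5*log10(d) - 5,
# with d in parsecs, so d = 10 ** ((m - M + 5) / 5). The magnitudes
# below are illustrative, not taken from the papers.

def distance_pc(m_apparent, m_absolute):
    return 10 ** ((m_apparent - m_absolute + 5) / 5)

M_ASSUMED = -19.3    # canonical Type Ia peak absolute magnitude
m_obs = 24.0         # a hypothetical faint, very distant supernova

d_assumed = distance_pc(m_obs, M_ASSUMED)
# If the supernova was intrinsically 0.2 magnitudes brighter than
# assumed (brighter = more negative absolute magnitude):
d_actual = distance_pc(m_obs, M_ASSUMED - 0.2)

print(f"assumed distance: {d_assumed / 1e9:.2f} Gpc")
print(f"actual distance:  {d_actual / 1e9:.2f} Gpc")
print(f"distance error:   {d_actual / d_assumed - 1:.1%}")
# A mere 0.2 magnitude error in assumed brightness skews the inferred
# distance by nearly 10%.
```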

Dark energy might not exist

The uncertainty of science: A new model for the universe that omits dark energy produces a better fit to what is known than previous theories that included it.

The new theory, dubbed timescape cosmology, includes the known lumpiness of the universe, while the older traditional models that require dark energy do not.

Timescape cosmology has no dark energy. Instead, it includes variations in the effects of gravity caused by the lumpiness in the structure in the universe. Clocks carried by observers in galaxies differ from the clock that best describes average expansion once variations within the universe (known as “inhomogeneity” in the trade) becomes significant. Whether or not one infers accelerating expansion then depends crucially on the clock used. “Timescape cosmology gives a slightly better fit to the largest supernova data catalogue than Lambda Cold Dark Matter cosmology,” says Wiltshire.

He admits the statistical evidence is not yet strong enough to definitively rule in favour of one model over the other, and adds that future missions such as the European Space Agency’s Euclid spacecraft will have the power to distinguish between differing cosmology models.

Both models rely on a very weak data set, based on assumptions about Type Ia supernovae that are likely wrong. It is thus likely that neither explains anything, as neither really has a good picture of the actual universe.

“One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The uncertainty of science: New research suggests that astronomers have little understanding of the supernovae that they use to estimate the distance to most galaxies, estimates they then used to discover dark energy as well as measure the universe’s expansion rate.

The exploding stars known as type Ia supernovae are so consistently bright that astronomers refer to them as standard candles — beacons that are used to measure vast cosmological distances. But these cosmic mileposts may not be so uniform. A new study finds evidence that the supernovae can arise by two different processes, adding to lingering suspicions that standard candles aren’t so standard after all.

The findings, which have been posted on the arXiv preprint server and accepted for publication in the Astrophysical Journal, could help astronomers to calibrate measurements of the Universe’s expansion. Tracking type Ia supernovae showed that the Universe is expanding at an ever-increasing rate, and helped to prove the existence of dark energy — advances that secured the 2011 Nobel Prize in Physics.

The fact that scientists don’t fully understand these cosmological tools is embarrassing, says the latest study’s lead author, Griffin Hosseinzadeh, an astronomer at the University of California, Santa Barbara. “One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The key to understanding this situation is to maintain a healthy skepticism about any cosmological theory or discovery, no matter how enthusiastically touted by the press and astronomers. The good astronomers do not push these theories with great enthusiasm as they know the feet of clay on which they stand. The bad ones try to use the ignorant mainstream press to garner attention, and thus funding.

For the past two decades the good astronomers have been diligently checking and rechecking the data and the supernovae used to discover dark energy. Up to now this checking seems to still suggest the universe’s expansion is accelerating on large scales. At the same time, our knowledge of supernovae remains sketchy, and thus no one should assume we understand the universe’s expansion rate with any confidence.

New theory eliminates need for dark energy

The uncertainty of science: A new theory now shows that dark energy, the apparent acceleration of the universe’s expansion rate on large scales, does not need to exist in order to explain the data that astronomers have obtained.

In the new work, the researchers, led by PhD student Gábor Rácz of Eötvös Loránd University in Hungary, question the existence of dark energy and suggest an alternative explanation. They argue that conventional models of cosmology (the study of the origin and evolution of the universe), rely on approximations that ignore its structure, and where matter is assumed to have a uniform density. “Einstein’s equations of general relativity that describe the expansion of the universe are so complex mathematically, that for a hundred years no solutions accounting for the effect of cosmic structures have been found. We know from very precise supernova observations that the universe is accelerating, but at the same time we rely on coarse approximations to Einstein’s equations which may introduce serious side-effects, such as the need for dark energy, in the models designed to fit the observational data.” explains Dr László Dobos, co-author of the paper, also at Eötvös Loránd University.

In practice, normal and dark matter appear to fill the universe with a foam-like structure, where galaxies are located on the thin walls between bubbles, and are grouped into superclusters. The insides of the bubbles are in contrast almost empty of both kinds of matter. Using a computer simulation to model the effect of gravity on the distribution of millions of particles of dark matter, the scientists reconstructed the evolution of the universe, including the early clumping of matter, and the formation of large scale structure.

Unlike conventional simulations with a smoothly expanding universe, taking the structure into account led to a model where different regions of the cosmos expand at different rates. The average expansion rate, though, is consistent with present observations, which suggest an overall acceleration.

In other words, the uneven structure of the universe has never been considered in previous models, and once included in the equations the need for dark energy disappears.

Expansion rate of the universe might not be accelerating

The uncertainty of science: A new review of the data suggests that the expansion of the universe might not be accelerating as posited based on research done in the 1990s.

Making use of a vastly increased data set – a catalogue of 740 Type Ia supernovae, more than ten times the original sample size – the researchers have found that the evidence for acceleration may be flimsier than previously thought, with the data being consistent with a constant rate of expansion.

The study is published in the Nature journal Scientific Reports.

Professor Sarkar, who also holds a position at the Niels Bohr Institute in Copenhagen, said: ‘The discovery of the accelerating expansion of the universe won the Nobel Prize, the Gruber Cosmology Prize, and the Breakthrough Prize in Fundamental Physics. It led to the widespread acceptance of the idea that the universe is dominated by “dark energy” that behaves like a cosmological constant – this is now the “standard model” of cosmology.

‘However, there now exists a much bigger database of supernovae on which to perform rigorous and detailed statistical analyses. We analysed the latest catalogue of 740 Type Ia supernovae – over ten times bigger than the original samples on which the discovery claim was based – and found that the evidence for accelerated expansion is, at most, what physicists call “3 sigma”. This is far short of the 5 sigma standard required to claim a discovery of fundamental significance.
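
To put those sigma levels in perspective, here is a short sketch converting each into the probability of a chance fluctuation, using the standard one-sided Gaussian-tail convention:

```python
import math

# Odds that pure chance produces a fluctuation of n sigma
# (one-sided Gaussian tail, the convention behind "5 sigma" claims).
def p_value(n_sigma):
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    p = p_value(n)
    print(f"{n} sigma -> p = {p:.2e} (about 1 in {1 / p:,.0f})")

# 3 sigma: about 1 in 740 -- suggestive, but chance flukes at this
# level happen regularly. 5 sigma: about 1 in 3.5 million.
```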

I am not surprised. In fact, I remain continually skeptical about almost all cosmological theories. They might be the best we have, based on the facts available, but they are also based upon incredibly flimsy facts.

Universe’s expansion rate contradicts dark energy data

The uncertainty of science: New measurements of the universe’s expansion rate, dubbed the Hubble constant, contradict theoretical predictions based on previous data.

For their latest paper, Riess’s team studied two types of standard candles in 18 galaxies using hundreds of hours of observing time on the Hubble Space Telescope. “We’ve been going gangbusters with this,” says Riess.

Their paper, which has been submitted to a journal and posted on the arXiv online repository on 6 April, reports that they measured the constant with an uncertainty of 2.4%, down from a previous best result of 3.3%. They find the speed of expansion to be about 8% faster than that predicted based on Planck data, says Riess. [emphasis mine]

I highlight the number of galaxies used to get this data because I think these scientists are being a bit over-confident about the uncertainty of their data. The universe has untold trillions of galaxies. To say they have narrowed their uncertainty down to only 2.4% based on 18 is the height of silliness.

But then, the lead scientist, Adam Riess, recognizes this, as he is also quoted in the article saying “I think that there is something in the standard cosmological model that we don’t understand.”

Dark energy evidence found to be uncertain

The uncertainty of science: Astronomers have discovered that the type of supernovae they have used as a standard to measure the accelerating expansion of the universe, which also is evidence for the existence of dark energy, are actually made up of two different types.

The authors conclude that some of the reported acceleration of the universe can be explained by color differences between the two groups of supernovae, leaving less acceleration than initially reported. This would, in turn, require less dark energy than currently assumed. “We’re proposing that our data suggest there might be less dark energy than textbook knowledge, but we can’t put a number on it,” Milne said. “Until our paper, the two populations of supernovae were treated as the same population. To get that final answer, you need to do all that work again, separately for the red and for the blue population.”

The authors pointed out that more data have to be collected before scientists can understand the impact on current measures of dark energy.

It has always bothered me that the evidence for dark energy was based entirely on measurements of Type Ia supernovae from extremely far away and billions of years ago. Not only was that a different time in the universe’s history when conditions could be different, our actual understanding of those supernovae themselves is very tenuous. We really do not have a full understanding of what causes them, or how they even happen. To then assume that these distant explosions are all so similar that their brightness can be used as a “standard” seems untrustworthy. From my perspective, the conclusions, though interesting, are being pushed based on extremely weak data.

The research at the link illustrates just how weak that data was.

NASA has now agreed to contribute equipment and researchers to a European dark energy mission.

The check is in the mail: NASA has now agreed to contribute equipment and researchers to a European dark energy mission.

And why should Europe have any expectation that NASA will follow through? Europe’s ExoMars project was screwed badly when NASA pulled out last year. Nor was that the first time the U.S. government reneged on a deal with Europe.

Considering the fragile nature of the U.S. federal budget, I wouldn’t depend on anything from NASA or any U.S. government agency for the foreseeable future. And this includes the various private space companies such as SpaceX and Orbital Sciences that are using NASA subsidies to build their spaceships. Get those things built, and quickly! The government money could disappear very soon.

Using data from the Spitzer Space Telescope astronomers have narrowed the universe’s rate of expansion to about 74.3 kilometers per second per megaparsec.

The uncertainty of science: Using data from the Spitzer Space Telescope astronomers have narrowed the universe’s rate of expansion to about 74.3 kilometers per second per megaparsec.

The importance of this number, also called the Hubble Constant, is that it allows astronomers to extrapolate more precisely backward to when they believe the Big Bang occurred, about 13.7 billion years ago. It also is a crucial data point in their effort to understand dark energy, the theorized force causing this expansion rate to actually accelerate on vast scales.

Back in 1995 a team led by Wendy Freedman, the same scientist leading the work above, announced that they had used the Hubble Space Telescope to determine the expansion rate as 80 kilometers per second per megaparsec. Then, the margin of error was plus or minus 17 kilometers. Now the margin of error has been narrowed to plus or minus 2.1 kilometers.
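
A quick comparison of those two error bars (the numbers come from the paragraph above) shows what actually changed:

```python
# The two measurements and their quoted margins of error:
h0_1995, err_1995 = 80.0, 17.0   # Hubble Space Telescope, 1995
h0_2012, err_2012 = 74.3, 2.1    # Spitzer recalibration, 2012

low, high = h0_1995 - err_1995, h0_1995 + err_1995
print(f"1995 allowed range: {low:.0f}-{high:.0f} km/s/Mpc")
print(f"2012 value inside it? {low <= h0_2012 <= high}")
print(f"relative precision: {err_1995 / h0_1995:.1%} -> "
      f"{err_2012 / h0_2012:.1%}")
# The new value sits comfortably inside the old error bars; the
# estimate did not so much change as sharpen, from +/-21% to +/-2.8%.
```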

Do I believe these new numbers? No, not really. Science has nothing to do with belief. I do think this is good science, however, and that this new estimate of the Hubble constant is probably the best yet. I would also not be surprised if in the future new data eventually proves this estimate wrong.

In a paper published today in Science, astronomers show that Type Ia supernovae, the kind used to measure the expansion rate of the universe, can be caused in more than one way, something not previously expected.

The uncertainty of science: In a paper published today in Science, astronomers show that Type Ia supernovae, the kind used to measure the expansion rate of the universe, can be caused in more than one way, something not previously expected.

Andy Howell, second author on the study, said: “It is a total surprise to find that thermonuclear supernovae, which all seem so similar, come from different kinds of stars. It is like discovering that some humans evolved from ape-like ancestors, and others came from giraffes. How could they look so similar if they had such different origins?” Howell is the leader of the supernova group at LCOGT, and is an adjunct faculty member in physics at UCSB.

Recently, some studies have found that Type Ia supernovae are not perfect standard candles — their brightness depends on the type of galaxy in which they were discovered. The reason is a mystery, but the finding that some Type Ia supernovae come from different progenitors would seem to suggest that the supernova’s ultimate brightness may be affected by whether or not it comes from a nova or a white dwarf merger.

“We don’t think this calls the presence of dark energy into question,” said Dilday. “But it does show that if we want to make progress understanding it, we need to understand supernovae better.”

Astronomers now believe that Type Ia supernovae — used to discover dark energy — can be produced in two different ways.

The uncertainty of science: Astronomers now believe that Type Ia supernovae — used to discover dark energy — can be produced in two different ways.

Type Ia supernovae are known to originate from white dwarfs – the dense cores of dead stars. White dwarfs are also called degenerate stars because they’re supported by quantum degeneracy pressure. In the single-degenerate model for a supernova, a white dwarf gathers material from a companion star until it reaches a tipping point where a runaway nuclear reaction begins and the star explodes. In the double-degenerate model, two white dwarfs merge and explode. Single-degenerate systems should have gas from the companion star around the supernova, while the double-degenerate systems will lack that gas.

For astronomers, this possibility raises several conflicting questions. If two different causes produce Type Ia supernovae, could their measurement of dark energy be suspect? And if not, why do these two different causes produce supernova explosions that look so much alike?

The 2011 Nobel Prize for Physics has been awarded

The 2011 Nobel Prize for Physics has been awarded to the astronomers who discovered dark energy.

Saul Perlmutter from the Lawrence Berkeley National Laboratory and University of California, Berkeley, has been awarded half of this year’s prize for his work on the Supernova Cosmology Project, with the other half awarded to Brian P. Schmidt from the Australian National University and Adam G. Riess from the Johns Hopkins University and Space Telescope Science Institute, Baltimore, for their work on the High-z Supernova Search Team.