Tag Archives: the uncertainty of science

Hubble finds new figure for universe expansion rate

The uncertainty of science: Using data from the Hubble Space Telescope, astronomers have found evidence that the universe’s expansion rate is faster than previous measurements had estimated.

The new findings show that eight Cepheid variables in our Milky Way galaxy are up to 10 times farther away than any previously analyzed star of this kind. Those Cepheids are more challenging to measure than others because they reside between 6,000 and 12,000 light-years from Earth. To handle that distance, the researchers developed a new scanning technique that allowed the Hubble Space Telescope to periodically measure a star’s position at a rate of 1,000 times per minute, thus improving the accuracy of the measurements of the stars’ true brightness and distance, according to the statement.

The researchers compared their findings to earlier data from the European Space Agency’s (ESA) Planck satellite. During its four-year mission, the Planck satellite mapped leftover radiation from the Big Bang, also known as the cosmic microwave background. The Planck data revealed a Hubble constant between 67 and 69 kilometers per second per megaparsec. (A megaparsec is roughly 3.26 million light-years.)

However, the Planck data gives a constant about 9 percent lower than that of the new Hubble measurements, which estimate that the universe is expanding at 73 kilometers per second per megaparsec, therefore suggesting that galaxies are moving faster than expected, according to the statement.

“Both results have been tested multiple ways, so barring a series of unrelated mistakes, it is increasingly likely that this is not a bug but a feature of the universe,” said study leader Adam Riess. [emphasis mine]

I should point out that one of the first big results from Hubble, in 1995 (which also happened to be the subject of one of my early published stories), was an estimate for the Hubble constant of 80 kilometers per second per megaparsec. At the time, the astronomers who did that research were very certain they had it right. Others had theorized that the number could be as low as 30 kilometers per second per megaparsec.

What is important about this number is that it determines how long ago the Big Bang is thought to have occurred. Lower numbers mean it took place farther in the past; higher numbers mean the universe is younger.
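To get a feel for what these numbers imply, note that to first order the age of the universe scales as the inverse of the Hubble constant. A minimal Python sketch (the function name and the choice of sample values are mine; the true age also depends on the universe’s matter and energy content, so this naive “Hubble time” is only a rough guide):

```python
# Naive "Hubble time": the age scale implied by a given Hubble constant.
# The true age also depends on the universe's matter and energy content,
# so treat these numbers as rough guides only.
MPC_IN_KM = 3.0857e19        # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

def hubble_time_gyr(h0_km_s_mpc):
    """Return 1/H0 in billions of years, for H0 in km/s/Mpc."""
    seconds = MPC_IN_KM / h0_km_s_mpc
    return seconds / SECONDS_PER_YEAR / 1e9

# The roughly 9 percent discrepancy between Planck and the new value:
print(round(73 / 67 - 1, 3))   # ~0.09

# The age scale implied by the various published values:
for h0 in (67, 73, 80):
    print(h0, round(hubble_time_gyr(h0), 1))
```

Run this and the spread becomes concrete: the Planck value implies an age scale near 14.6 billion years, the new Hubble value about 13.4, and the 1995 estimate of 80 only about 12.2.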

That scientists keep getting different results only suggests to me that they simply do not yet have enough data to lock the number down firmly.


Chinese space probe detects possible dark matter signal

The uncertainty of science: A Chinese space probe designed to measure cosmic rays has detected a pattern that could be evidence of the existence of dark matter.

Researchers launched the spacecraft from the Jiuquan Satellite Launch Center in the Gobi Desert, about 1600 kilometers west of Beijing, in December 2015. Its primary instrument—a stack of thin, crisscrossed detector strips—is tuned to observe the incoming direction, energy, and electric charge of the particles that make up cosmic rays, particularly electrons and positrons, the antimatter counterparts of electrons. Cosmic rays emanate from conventional astrophysical objects, like exploding supernovae in the galaxy. But if dark matter consists of weakly interacting massive particles (WIMPs), these would occasionally annihilate each other and create electron-positron pairs, which might be detected as an excess over the expected abundance of particles from conventional objects.

In its first 530 days of scientific observations, the Dark Matter Particle Explorer (DAMPE) detected 1.5 million cosmic ray electrons and positrons above a certain energy threshold. When researchers plot the number of particles against their energy, they expect to see a smooth curve. But previous experiments have hinted at an anomalous break in the curve. Now, DAMPE has confirmed that deviation. “It may be evidence of dark matter,” but the break in the curve “may be from some other cosmic ray source,” says astrophysicist Chang Jin, who leads the collaboration at the Chinese Academy of Science’s (CAS’s) Purple Mountain Observatory (PMO) in Nanjing. [emphasis mine]

I must emphasize the large uncertainty here. They have not detected dark matter. Not even close. What they have detected is a pattern in the spacecraft’s cosmic ray data that the existence of dark matter predicts. That pattern, however, could have other causes, and the consistent failure of other efforts to detect dark matter directly strengthens the possibility that something else produced this break.


A spot on Mars, as seen by different orbiters over the past half century

Mars as seen over the past half century

The science team of Mars Reconnaissance Orbiter (MRO) has assembled a collection of images of the same location on Mars that were taken by different Martian orbiters, beginning with the first fly-by by Mariner 4 in 1965 and ending with MRO’s HiRISE camera. The image on the right, reduced in resolution to post here, shows these images superimposed on that location, with resolutions ranging from 1.25 kilometers per pixel (Mariner 4) down to 50 meters per pixel (MRO).

This mosaic essentially captures the technological history of the first half century of space exploration in a single image. Mariner 4 was only able to take 22 fuzzy pictures during its fly-by. Today’s orbiters take thousands and thousands, with resolutions so sharp they can often identify small rocks and boulders.

The mosaic also illustrates well the uncertainty of science. When Mariner 4 took the first pictures, some scientists thought that there might be artificially built canals on Mars. Instead, the probe showed a dead cratered world much like the Moon. Later images proved that conclusion to be wrong as well, with today’s images showing Mars to be a very complex and active world, with a geological history both baffling and dynamic. Even now, after a half century of improved observations, we still are unsure whether life there once existed, or even if it exists today.


Dark energy might not exist

The uncertainty of science: A new model for the universe that omits dark energy produces a better fit to what is known than previous theories that included it.

The new theory, dubbed timescape cosmology, includes the known lumpiness of the universe, while the older traditional models that require dark energy do not.

Timescape cosmology has no dark energy. Instead, it includes variations in the effects of gravity caused by the lumpiness in the structure of the universe. Clocks carried by observers in galaxies differ from the clock that best describes average expansion once variations within the universe (known as “inhomogeneity” in the trade) become significant. Whether or not one infers accelerating expansion then depends crucially on the clock used. “Timescape cosmology gives a slightly better fit to the largest supernova data catalogue than Lambda Cold Dark Matter cosmology,” says David Wiltshire, who led the study.

He admits the statistical evidence is not yet strong enough to definitively rule in favour of one model over the other, and adds that future missions such as the European Space Agency’s Euclid spacecraft will have the power to distinguish between differing cosmology models.

Both models rely on a very weak data set, based on assumptions about Type Ia supernovae that are likely wrong. It is thus likely that neither explains anything, as neither really has a good picture of the actual universe.


“One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The uncertainty of science: New research suggests that astronomers have little understanding of the supernovae that they use to estimate the distance to most galaxies, estimates they then used to discover dark energy as well as measure the universe’s expansion rate.

The exploding stars known as type Ia supernovae are so consistently bright that astronomers refer to them as standard candles — beacons that are used to measure vast cosmological distances. But these cosmic mileposts may not be so uniform. A new study finds evidence that the supernovae can arise by two different processes, adding to lingering suspicions that standard candles aren’t so standard after all.

The findings, which have been posted on the arXiv preprint server and accepted for publication in the Astrophysical Journal, could help astronomers to calibrate measurements of the Universe’s expansion. Tracking type Ia supernovae showed that the Universe is expanding at an ever-increasing rate, and helped to prove the existence of dark energy — advances that secured the 2011 Nobel Prize in Physics.

The fact that scientists don’t fully understand these cosmological tools is embarrassing, says the latest study’s lead author, Griffin Hosseinzadeh, an astronomer at the University of California, Santa Barbara. “One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The key to understanding this situation is to maintain a healthy skepticism about any cosmological theory or discovery, no matter how enthusiastically touted by the press and astronomers. The good astronomers do not push these theories with great enthusiasm as they know the feet of clay on which they stand. The bad ones try to use the ignorant mainstream press to garner attention, and thus funding.

For the past two decades the good astronomers have been diligently checking and rechecking the data and the supernovae used to discover dark energy. Up to now this checking seems to still suggest the universe’s expansion is accelerating on large scales. At the same time, our knowledge of supernovae remains sketchy, and thus no one should assume we understand the universe’s expansion rate with any confidence.


Climate scientists increasingly show no warming in peer review papers

The uncertainty of science: Climate scientists are increasingly publishing papers that show no clear global temperature trend.

Last year there were at least 60 peer-reviewed papers published in scientific journals demonstrating that Today’s Warming Isn’t Global, Unprecedented, Or Remarkable.

Just within the last 5 months, 58 more papers and 80 new graphs have been published that continue to undermine the popularized conception of a slowly cooling Earth temperature history followed by a dramatic hockey-stick-shaped uptick, or an especially unusual global-scale warming during modern times.
Yes, some regions of the Earth have been warming in recent decades or at some point in the last 100 years. Some regions have been cooling for decades at a time. And many regions have shown no significant net changes or trends in either direction relative to the last few hundred to thousands of years.
Succinctly, then, scientists publishing in peer-reviewed journals have increasingly affirmed that there is nothing historically unprecedented or remarkable about today’s climate when viewed in the context of long-term natural variability. [emphasis in original]

At the link are 80 graphs from the most recent papers. Take a look. If you are convinced that the climate is warming, then you must come up with an explanation for this data. Or you can put your fingers in your ears, cover your eyes, and chant “La-la-la-la-la-la-la-la-la-la-la-la-la-la-la-la!” as loud as you can so that you don’t have to deal with it.


The oldest fossils ever?

Scientists think they have found, in Canada, the oldest fossils ever.

Scientists say they have found the world’s oldest fossils, thought to have formed between 3.77bn and 4.28bn years ago. Composed of tiny tubes and filaments made of an iron oxide known as haematite, the microfossils are believed to be the remains of bacteria that once thrived underwater around hydrothermal vents, relying on chemical reactions involving iron for their energy.

If correct, these fossils offer the oldest direct evidence for life on the planet. And that, the study’s authors say, offers insights into the origins of life on Earth. “If these rocks do indeed turn out to be 4.28 [bn years old] then we are talking about the origins of life developing very soon after the oceans formed 4.4bn years ago,” said Matthew Dodd, the first author of the research from University College, London.

This discovery reminds me of the claimed Mars fossils announced in 1996. There were enormous uncertainties with that discovery, all of which eventually caused most scientists in the field to reject the result. The same thing could be the case here.

Still in Dallas. I hope to get caught up tomorrow.


MRI software bug invalidates 40,000 research papers

The uncertainty of science: A bug just discovered in the computer software used to analyze fMRI brain scans could invalidate 15 years of research and 40,000 science papers.

They tested the three most popular software packages for fMRI analysis – SPM, FSL, and AFNI – and while they shouldn’t have found much difference across the groups, the software resulted in false-positive rates of up to 70 percent. And that’s a problem, because as Kate Lunau at Motherboard points out, not only did the team expect to see an average false positive rate of just 5 percent, it also suggests that some results were so inaccurate, they could be indicating brain activity where there was none.

“These results question the validity of some 40,000 fMRI studies and may have a large impact on the interpretation of neuroimaging results,” the team writes in PNAS. The bad news here is that one of the bugs the team identified has been in the system for the past 15 years, which explains why so many papers could now be affected. [emphasis mine]
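To see how a nominal error rate can balloon like this, consider the arithmetic of uncorrected multiple comparisons. The actual bug involved faulty statistical assumptions behind cluster-based inference, not this exact calculation, so the sketch below (function name mine) is only an illustrative analogue:

```python
# Toy illustration only: the actual fMRI bug was in the statistical
# assumptions behind cluster-based inference, not this calculation.
# This shows the related arithmetic of uncorrected multiple comparisons:
# the chance of at least one false positive across n independent tests,
# each run at significance level alpha with no correction.
def familywise_rate(alpha, n_tests):
    return 1 - (1 - alpha) ** n_tests

print(round(familywise_rate(0.05, 1), 2))    # a single test: 0.05, as advertised
print(round(familywise_rate(0.05, 24), 2))   # about two dozen tests: ~0.71
```

At a 5 percent threshold with no correction, roughly two dozen independent looks at the data are enough to push the odds of at least one false positive to about 70 percent, the same ballpark as the rates the study found.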

The research the article describes is focused entirely on the problems the software causes for past research. It makes no mention of the problems this software bug might cause for actual medical diagnosis. Was the treatment of any patients affected by this bug? It does not say.


New data challenges consensus on galaxy formation

The uncertainty of science: A new study has found that the accepted consensus for the formation of large elliptical galaxies does not work, and that, rather than forming from the merger of smaller spiral galaxies, ellipticals formed in place from the material at hand.

From the press release [pdf].

“We started from the data, available in complete form only for the closer galaxies and in incomplete form for the more distant ones, and we filled the ‘gaps’ by interpreting and extending the data based on a scenario we devised” comments Mancuso. The analysis also took into account the phenomenon of gravitational lensing, which allows us to observe very distant galaxies belonging to ancient cosmic epochs.

In this “direct” manner (i.e., model-independent) the SISSA group obtained an image of the evolution of galaxies even in very ancient epochs (close, in a cosmic timescale, to the epoch of reionization). This reconstruction demonstrates that elliptical galaxies cannot have formed through the merging of other galaxies, “simply because there wasn’t enough time to accumulate the large quantity of stars seen in these galaxies through these processes”, comments Mancuso. “This means that the formation of elliptical galaxies occurs through internal, in situ processes of star formation.”

The important take-away of this result is that it shows that the present theory of galaxy formation, where smaller spiral galaxies merge to form larger elliptical galaxies, does not fit the data. And if a theory does not fit the data, it must be abandoned.