Tag Archives: the uncertainty of science

New estimate for Hubble constant differs from previous and also conflicting results

The uncertainty of science: Using microlensing effects scientists have measured a new estimate for the Hubble constant, the rate at which the universe is expanding, and have come up with a number that differs from previous results.

Using adaptive optics technology on the W.M. Keck telescopes in Hawaii, they arrived at an estimate of 76.8 kilometers per second per megaparsec. As a parsec is a bit over 30 trillion kilometers and a megaparsec is a million parsecs, that is an excruciatingly precise measurement. In 2017, the H0LICOW team published an estimate of 71.9, using the same method and data from the Hubble Space Telescope.

The new SHARP/H0LICOW estimate is comparable to that obtained by a team led by Adam Riess of Johns Hopkins University, 74.03, using measurements of a class of variable stars called Cepheids. But it is quite different from estimates of the Hubble constant derived from an entirely different technique based on the cosmic microwave background. That method, based on the afterglow of the Big Bang, gives a Hubble constant of 67.4, assuming the standard cosmological model of the universe is correct.

An estimate by Wendy Freedman and colleagues at the University of Chicago comes close to bridging the gap, with a Hubble constant of 69.8 based on the luminosity of distant red giant stars and supernovae.

So five different teams have come up with five different numbers, ranging from 67.4 to 76.8 kilometers per second per megaparsec. Based on the present understanding of cosmology, the range should be far narrower; by now physicists had expected these different results to converge. The differences suggest that either their theories are wrong or their methods of measurement are flawed.
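To put the size of the disagreement in perspective, here is a quick back-of-the-envelope sketch in Python, using the central values quoted above (the labels and the percentage arithmetic are mine):

```python
# Published central values for the Hubble constant, in km/s per megaparsec,
# as quoted in the articles above.
estimates = {
    "SHARP/H0LICOW (Keck adaptive optics)": 76.8,
    "H0LICOW 2017 (Hubble lensing data)": 71.9,
    "Riess et al. (Cepheid variables)": 74.03,
    "Cosmic microwave background": 67.4,
    "Freedman et al. (red giants + supernovae)": 69.8,
}

low, high = min(estimates.values()), max(estimates.values())
spread = 100 * (high - low) / low
print(f"Range: {low}-{high} km/s/Mpc ({spread:.1f}% spread)")
# Roughly a 14 percent spread -- far larger than the quoted measurement
# uncertainties, which is the whole puzzle.
```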

The most likely explanation is that we presently know too little about the early universe to form any solid theories. These measurements rest on a very small amount of data and require a great many assumptions.

New data cuts neutrino mass in half

The uncertainty of science: New data now suggests that the highest mass possible for the neutrino is about half the previous estimates.

At the 2019 Topics in Astroparticle and Underground Physics conference in Toyama, Japan, leaders from the KATRIN experiment reported Sept. 13 that the estimated range for the rest mass of the neutrino is no larger than about 1 electron volt, or eV. These inaugural results obtained earlier this year by the Karlsruhe Tritium Neutrino experiment — or KATRIN — cut the mass range for the neutrino by more than half by lowering the upper limit of the neutrino’s mass from 2 eV to about 1 eV. The lower limit for the neutrino mass, 0.02 eV, was set by previous experiments by other groups.

These limits do not tell us what the neutrino actually weighs; they only narrow the range of possible masses.
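For a sense of how small these numbers are, here is a short sketch converting the quoted limits from electron volts to kilograms, using the standard conversion 1 eV/c² ≈ 1.8 × 10⁻³⁶ kg:

```python
# Convert the quoted neutrino mass limits from electron volts to kilograms.
EV_TO_KG = 1.602176634e-19 / (2.99792458e8) ** 2  # 1 eV/c^2 expressed in kg

upper_limit_ev = 1.0    # new KATRIN upper limit (approximate)
lower_limit_ev = 0.02   # lower limit set by earlier experiments
electron_mass_ev = 511_000  # electron rest mass, ~511 keV, for comparison

print(f"Upper limit: {upper_limit_ev * EV_TO_KG:.2e} kg")
print(f"Lower limit: {lower_limit_ev * EV_TO_KG:.2e} kg")
print(f"At most about 1/{electron_mass_ev / upper_limit_ev:,.0f} of an electron's mass")
```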

Two new science papers strongly question theory of man-made global warming

The uncertainty of science: Two new science papers, from researchers in Finland and Japan respectively, both strongly question the theory that human activity and the increase of carbon dioxide are causing global warming.

From the Finnish paper’s [pdf] conclusion:

We have proven that the [climate]-models used in IPCC report AR5 cannot compute correctly the natural component included in the observed global temperature. The reason is that the models fail to derive the influences of low cloud cover fraction on the global temperature. A too small natural component results in a too large portion for the contribution of the greenhouse gases like carbon dioxide. That is why IPCC represents the climate sensitivity more than one order of magnitude larger than our sensitivity 0.24°C. Because the anthropogenic portion in the increased CO2 is less than 10 %, we have practically no anthropogenic climate change. The low clouds control mainly the global temperature. [emphasis mine]

From the Japanese paper:

“The Intergovernmental Panel on Climate Change (IPCC) has discussed the impact of cloud cover on climate in their evaluations, but this phenomenon has never been considered in climate predictions due to the insufficient physical understanding of it,” comments Professor Hyodo. “This study provides an opportunity to rethink the impact of clouds on climate. When galactic cosmic rays increase, so do low clouds, and when cosmic rays decrease clouds do as well, so climate warming may be caused by an opposite-umbrella effect. The umbrella effect caused by galactic cosmic rays is important when thinking about current global warming as well as the warm period of the medieval era.”

Essentially, both criticize the climate models for not considering changes in cloud cover and how those affect the global climate. The first paper looks back at the known climate data, compares it with known changes in cloud cover, and finds that cloud cover is a major factor in temperature changes.

The second paper looks at the causes for some of the changes in cloud cover, noting how the increase in galactic cosmic rays during the solar minimum can be tied to an increase in cloud cover, and thus colder temperatures.

Do these papers disprove man-made global warming caused by the increase in carbon dioxide in the atmosphere? Of course not. They just demonstrate again that the science here is very unsettled, that there are many large gaps in our knowledge, and that it would be foolish now to abandon western civilization and replace it with socialist totalitarian rule in order to prevent a disaster that either might not be happening or, if it is, that we may have no power to control.

I want to also point out that this post talks about scientists challenging the theory of man-made global warming. Attention must be paid to their conclusions. As for the ignorant opinions of politicians on this subject, who cares?

Nearly 400 medical procedures found to be ineffective

The uncertainty of science: A new review of the science literature has found almost 400 studies showing that the medical procedure, device, or medicine being tested was ineffective.

The findings are based on more than 15 years of randomised controlled trials, a type of research that aims to reduce bias when testing new treatments. Across 3,000 articles in three leading medical journals from the UK and the US, the authors found 396 reversals.

While these were found in every medical discipline, cardiovascular disease was by far the most commonly represented category, at 20 percent; it was followed by preventative medicine and critical care. Taken together, it appears that medication was the most common reversal at 33 percent; procedures came in second at 20 percent, and vitamins and supplements came in third at 13 percent.

A reversal means that the study found the procedure, device, or medicine to be ineffective.

If you have medical issues it is worth reviewing the research itself. You might find that some of the medical treatment you are getting is irrelevant, and could be discontinued.

New analysis throws wrench in formation theory of spirals in galaxies

The uncertainty of science: A new analysis of over 6000 galaxies suggests that a long-held model for the formation of spirals in galaxies is wrong.

[Edwin] Hubble’s model soon became the authoritative method of classifying spiral galaxies, and is still used widely in astronomy textbooks to this day. His key observation was that galaxies with larger bulges tended to have more tightly wound spiral arms, lending vital support to the ‘density wave’ model of spiral arm formation.

Now though, in contradiction to Hubble’s model, the new work finds no significant correlation between the sizes of the galaxy bulges and how tightly wound the spirals are. This suggests that most spirals are not static density waves after all.

Essentially, we still have no idea why spirals form in galaxies.

Three exocomets found circling Beta Pictoris

The uncertainty of science: By analyzing data from the new space telescope TESS, astronomers think they have identified three exocomets orbiting the nearby star Beta Pictoris.

Why do I label this uncertain? Let the scientists themselves illustrate my doubt:

Sebastian Zieba, Master’s student in the team of Konstanze Zwintz at the Institute of Astro- and Particle Physics at the University of Innsbruck, discovered the signal of the exocomets when he investigated the TESS light curve of Beta Pictoris in March this year. “The data showed a significant decrease in the intensity of the light of the observed star. These variations due to darkening by an object in the star’s orbit can clearly be related to a comet,” Sebastian Zieba and Konstanze Zwintz explain the sensational discovery.

The press release provides no other information about why they think this darkening is because of comets rather than exoplanets or some other phenomenon. Based on this alone, I find this report very doubtful and highly speculative.

In related news, astronomers now claim they have detected eighteen more Earth-sized exoplanets in the data produced by Kepler, and they have done so by applying new algorithms to the data.

Large planets tend to produce deep and clear brightness variations of their host stars so that the subtle center-to-limb brightness variation on the star hardly plays a role in their discovery. Small planets, however, present scientists with immense challenges. Their effect on the stellar brightness is so small that it is extremely hard to distinguish from the natural brightness fluctuations of the star and from the noise that necessarily comes with any kind of observation. René Heller’s team has now been able to show that the sensitivity of the transit method can be significantly improved, if a more realistic light curve is assumed in the search algorithm. “Our new algorithm helps to draw a more realistic picture of the exoplanet population in space,” summarizes Michael Hippke of Sonneberg Observatory. “This method constitutes a significant step forward, especially in the search for Earth-like planets.”

This makes sense, but it must be understood that these are only candidate exoplanets, unconfirmed as yet. I would not be surprised if a majority are found to be false positives.
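To get a feel for why Earth-sized candidates are so hard to confirm, here is a minimal sketch of the basic transit arithmetic: the dip in starlight scales as the square of the planet-to-star radius ratio. This ignores the limb-darkening refinement described in the quote above and assumes a Sun-sized star:

```python
# Transit depth ~ (planet radius / star radius)^2, ignoring limb darkening.
R_SUN = 695_700.0     # km
R_EARTH = 6_371.0     # km
R_JUPITER = 69_911.0  # km

def transit_depth(r_planet_km, r_star_km=R_SUN):
    """Fractional dimming when the planet crosses the star's disk."""
    return (r_planet_km / r_star_km) ** 2

print(f"Jupiter-size planet: {transit_depth(R_JUPITER) * 100:.2f}% dip")
print(f"Earth-size planet:   {transit_depth(R_EARTH) * 1e6:.0f} parts per million")
# An Earth-sized planet dims a Sun-like star by less than 0.01 percent,
# which is why a more realistic light-curve model matters so much.
```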

New Horizons data suggests the Kuiper Belt is emptier than previously believed

The uncertainty of science: An analysis of data from New Horizons now suggests a paucity of small objects in the Kuiper Belt.

Using New Horizons data from the Pluto-Charon flyby in 2015, a Southwest Research Institute-led team of scientists have indirectly discovered a distinct and surprising lack of very small objects in the Kuiper Belt. The evidence for the paucity of small Kuiper Belt objects (KBOs) comes from New Horizons imaging that revealed a dearth of small craters on Pluto’s largest satellite, Charon, indicating that impactors from 300 feet to 1 mile (91 meters to 1.6 km) in diameter must also be rare.

I therefore wonder how the objects we do find there formed. The volume of space in the Kuiper Belt is gigantic, and if the larger bodies found so far are the bulk of the objects there, what did they coalesce from? Moreover, it seems unlikely that the few large objects we have found there would have been able to clear the region of small objects.

Overall, this is a fundamental mystery tied directly to how the solar system formed, and illustrates how little we know about that process.

Most popular theorized particle for explaining dark matter now eliminated

The uncertainty of science: The WIMP particle (Weakly Interacting Massive Particle), the most popular theorized particle to explain dark matter, has now been eliminated by experiments.

These experiments have now been ongoing for decades, and have seen no dark matter [WIMPs].

…Theorists can always tweak their models, and have done so many times, pushing the anticipated cross-section down and down as null result after null result rolls in. That’s the worst kind of science you can do, however: simply shifting the goalposts for no physical reason other than your experimental constraints have become more severe. There is no longer any motivation, other than preferring a conclusion that the data rules out, in doing so.

Other theorized but less favored particles could still prove to be dark matter, but the problem, as presently framed, is getting harder and harder to solve.

Dark matter has always been an invention created to explain the too-fast orbital velocities of stars in the outer regions of galaxies. It could very well be, however, that the problem comes not from new physics and a newly contrived particle we can’t see, but from a deficiency in our overall observations of galaxies and what is there, within the constraints of the physics we know now.
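The “too-fast” velocities are easy to illustrate. If a galaxy’s visible matter were all there is, orbital speeds in its outskirts should fall off roughly as the inverse square root of the radius; observed rotation curves instead stay roughly flat. A minimal sketch with made-up but representative numbers (the mass and radii below are illustrative assumptions, not measurements of any particular galaxy):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # kg
KPC = 3.086e19    # meters per kiloparsec

# Assume, for illustration only, that all of a galaxy's visible mass
# (~1e11 solar masses) sits within the inner 5 kiloparsecs.
visible_mass = 1e11 * M_SUN

def keplerian_speed(radius_kpc):
    """Circular orbital speed (km/s) expected from the visible mass alone."""
    r_meters = radius_kpc * KPC
    return math.sqrt(G * visible_mass / r_meters) / 1000.0

for r in (5, 10, 20, 40):
    print(f"r = {r:2d} kpc: expected ~{keplerian_speed(r):.0f} km/s")
# The expected speeds drop steadily with radius, but measured rotation
# curves stay roughly flat out to large radii -- the discrepancy that
# dark matter was invented to explain.
```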

Hat tip Mike Buford.

New sky survey uncovers hundreds of thousands of previously unknown galaxies

Galaxies without end: A new radio telescope sky survey has discovered hundreds of thousands of previously unknown galaxies.

This discovery is part of a major release of papers outlining a number of discoveries made by this new sky survey.

I could of course also have subheaded this post “The uncertainty of science.” Wanna bet that even with this discovery we have only seen the tip of the iceberg of the number of galaxies out there?

The unfinished search for the Hubble constant

The uncertainty of science: Scientists continue to struggle in their still-unfinished effort to determine the precise expansion rate of the universe, dubbed the Hubble constant in honor of Edwin Hubble, who discovered that expansion.

The problem is, the values obtained from [two different] methods do not agree—a discrepancy cosmologists call “tension.” Calculations from redshift place the figure at about 73 (in units of kilometers per second per megaparsec); the CMB estimates are closer to 68. Most researchers first thought this divergence could be due to errors in measurements (known among astrophysicists as “systematics”). But despite years of investigation, scientists can find no source of error large enough to explain the gap.

I am especially amused by these numbers. Back in 1995 NASA held a heavily touted press conference to announce that new data from the Hubble Space Telescope had finally determined the exact value of the Hubble constant, 80 (in the units above). The press went hog wild over this now “certain” conclusion, even though other astronomers disputed it and offered lower numbers ranging from 30 to 65. Astronomer Allan Sandage of the Carnegie Observatories was especially critical of NASA’s certainty, and was duly ignored by most of the press.

In writing my own article about this result, I was especially struck during my phone interview with Wendy Freedman, the lead scientist for Hubble’s results, by her own certainty. When I noted that her data was very slim, being measurements of only a few stars from one galaxy, she pooh-poohed the point. Her result had settled the question!

I didn’t buy her certainty then, and in my article for The Sciences, most appropriately entitled “The Hubble Inconstant,” I made it a point to note Sandage’s doubts. In the end it turned out that Sandage’s proposed range of 53 to 65 was the better prediction.

Still, the science behind the final number remains unsettled, with the two methods coming up with numbers that differ by a little less than ten percent, and no clear explanation for that difference. Isn’t science wonderful?

No Planet X needed

The uncertainty of science: New computer models now suggest that the orbits of the known Kuiper Belt objects can be explained without the need for the theorized large Planet X.

The weirdly clustered orbits of some far-flung bodies in our solar system can be explained without invoking a big, undiscovered “Planet Nine,” a new study suggests.

The shepherding gravitational pull could come from many fellow trans-Neptunian objects (TNOs) rather than a single massive world, according to the research.

“If you remove Planet Nine from the model, and instead allow for lots of small objects scattered across a wide area, collective attractions between those objects could just as easily account for the eccentric orbits we see in some TNOs,” study lead author Antranik Sefilian, a doctoral student in the Department of Applied Mathematics and Theoretical Physics at Cambridge University in England, said in a statement.

When you think about it, having many, many scattered small objects in the Kuiper Belt makes much more sense than a few giant planets. Out there, it would be difficult for large objects to coalesce from the solar system’s initial accretion disk. The density of material would be too low. However, you might get a lot of small objects from that disk, which once formed would be too far apart to accrete into larger planets.

The use of the term “Planet Nine” by these scientists, however, is somewhat annoying, and that has less to do with Pluto than with how the general understanding of what it means to be a planet has evolved over the past two decades. There are clearly more than eight planets known in the solar system now. The large moons of the gas giants, as well as the larger dwarf planets such as Ceres, have been shown to have all the complex features of planets. And fundamentally, they are large enough to be spheres, not misshapen asteroids.

Four more gravitational wave detections

The uncertainty of science: The scientists running the LIGO gravitational wave detector have announced the detection of four more gravitational waves, bringing to eleven the total number so far observed.

During the first observing run O1, from September 12, 2015 to January 19, 2016, gravitational waves from three BBH mergers were detected. The second observing run, which lasted from November 30, 2016, to August 25, 2017, yielded a binary neutron star merger and seven additional binary black hole mergers, including the four new gravitational wave events being reported now. The new events are known as GW170729, GW170809, GW170818 and GW170823 based on the dates on which they were detected. With the detection of four additional BBH mergers the scientists learn more about the population of these binary systems in the universe and about the event rate for these types of coalescences.

The observed BBHs span a wide range of component masses, from 7.6 to 50.6 solar masses. The new event GW170729 is the most massive and distant gravitational-wave source ever observed. In this coalescence, which happened roughly 5 billion years ago, an equivalent energy of almost five solar masses was converted into gravitational radiation.

In two BBHs (GW151226 and GW170729) it is very likely that at least one of the merging black holes is spinning. One of the new events, GW170818, detected by the LIGO and Virgo observatories, was very precisely pinpointed in the sky. It is the best localized BBH to date: its position has been identified with a precision of 39 square degrees (195 times the apparent size of the full moon) in the northern celestial hemisphere. [emphasis mine]

The highlighted quote above illustrates the amount of uncertainty here. Though these appear to be gravitational waves, and have been confirmed in multiple ways, the data is very coarse, providing only a limited amount of basic information about each event. This limited information is still very valuable, and certainly advances our understanding of black holes and their formation, but it is important to recognize the limitations of that data.
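Two of the quoted numbers are easy to check for scale: the energy equivalent of five solar masses (E = mc²), and the 39-square-degree localization compared with the apparent size of the full moon, which is about half a degree across. A rough sketch:

```python
import math

C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

# Energy radiated as gravitational waves: ~5 solar masses, per the quote.
energy_joules = 5 * M_SUN * C ** 2
print(f"~5 solar masses -> {energy_joules:.1e} joules")

# Sky localization: 39 square degrees vs. the full moon's apparent area.
moon_diameter_deg = 0.5  # approximate angular diameter of the full moon
moon_area = math.pi * (moon_diameter_deg / 2) ** 2
print(f"Full moon covers ~{moon_area:.2f} sq. deg.; "
      f"39 sq. deg. is ~{39 / moon_area:.0f} moons")
# Close to the ~195 full moons quoted above; the exact figure depends on
# the moon's apparent size, which varies slightly along its orbit.
```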

Danish astronomers question gravitational wave detection

The uncertainty of science: A team of Danish astronomers has questioned the gravitational wave detections achieved in the past few years by the LIGO gravitational wave detectors.

The details are complex and very much in dispute, and the position of these Danish astronomers is very much in the minority, but their doubts have not been dismissed, and illustrate well the best aspects of science. The article also outlines how the physics community and the LIGO scientists have welcomed the skepticism, even as they have doubts about the claims of the Danish astronomers. This is the hallmark of good science, and lends weight to the work at LIGO.

Astronomers retract prediction of star merger

The uncertainty of science: A review of the data has caused astronomers to retract a prediction that the two stars in a binary system were going to merge in 2022.

It appears that the mistaken prediction occurred because of a typo in the dataset the astronomers were using. The new analysis pinpointed the error; once it was corrected, the data showed that no star merger is going to take place.

Conflict in Hubble constant increases with new data from Hubble and Gaia

The uncertainty of science: New data from the Hubble Space Telescope and Gaia continue to yield a Hubble constant for the expansion rate of the universe that differs from the value derived from Planck space telescope data.

Using Hubble and newly released data from Gaia, Riess’ team measured the present rate of expansion to be 73.5 kilometers (45.6 miles) per second per megaparsec. This means that for every 3.3 million light-years farther away a galaxy is from us, it appears to be moving 73.5 kilometers per second faster. However, the Planck results predict the universe should be expanding today at only 67.0 kilometers (41.6 miles) per second per megaparsec. As the teams’ measurements have become more and more precise, the chasm between them has continued to widen, and is now about 4 times the size of their combined uncertainty.
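To see what “about 4 times the size of their combined uncertainty” means, here is a minimal sketch of how such a statement is computed. The error bars below are assumed, illustrative values rather than the published ones; the point is only that independent uncertainties combine in quadrature:

```python
import math

# Central values from the quote; the uncertainties are assumed for illustration.
h0_local, sigma_local = 73.5, 1.6  # Hubble + Gaia (assumed error bar)
h0_cmb, sigma_cmb = 67.0, 0.5      # Planck (assumed error bar)

combined_sigma = math.sqrt(sigma_local ** 2 + sigma_cmb ** 2)
tension = abs(h0_local - h0_cmb) / combined_sigma
print(f"Combined uncertainty: {combined_sigma:.2f} km/s/Mpc")
print(f"Gap: {abs(h0_local - h0_cmb):.1f} km/s/Mpc, "
      f"or {tension:.1f} times the combined uncertainty")
# With these assumed error bars the gap comes out to roughly 4 times the
# combined uncertainty, which is the kind of statement the quote is making.
```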

The problem really is very simple: We haven’t the faintest idea what is going on. We have some data, but we also have enormous gaps in our knowledge of the cosmos. Moreover, most of our cosmological data is reliant on too many assumptions that could be wrong, or simply in error. And the errors can be tiny and still throw the results off by large amounts.

The one thing that good science and skepticism teaches is humbleness. Do not be too sure of your conclusions. The universe is a large and complex place. It likes to throw curve balls at us, and if we swing too soon we will certainly miss.

Astronomers dispute existence of galaxy without dark matter

The uncertainty of science: A new analysis by astronomers disputes the conclusion of different astronomers earlier this year that they had found a galaxy that lacked any dark matter.

The original paper from March based its stunning claim of a dark-matter-free galaxy on the way clusters of stars moved through the thin, diffuse galaxy called NGC1052–DF2: They appeared to move at exactly the speed Einstein’s equations of general relativity would predict based on the visible matter (so, slower than they would if the galaxy held dark matter).

This new paper on arXiv suggested otherwise: First, the authors pointed out that NGC1052–DF2 was already discovered way back in 1976 and has previously been referred to by three different names: KKSG04, PGC3097693 and [KKS2000]04.

Then, using those names and then finding all the available data on the galaxy, the researchers argued that the researchers from the March paper simply mismeasured the distance between that galaxy and Earth. This means the galaxy is probably much closer to us than the original researchers thought.

Astronomers calculate the mass of galaxies based on the objects’ brightness and distance. If the galaxy examined in the paper is closer to Earth than previously thought, then its dimness means it’s also much less massive than researchers thought. And at the newly calculated, lighter mass, all the other features of the galaxy make a lot more sense, the researchers in the new paper said. Its globular clusters aren’t moving slowly because they’re in some strange dark matter-desert; instead, they’re moving at the regular speed for a very lightweight galaxy, the arXiv authors said.
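The distance dependence is the crux of that argument: for a fixed measured brightness, the inferred luminosity, and with it the inferred stellar mass, scales with the square of the assumed distance. A quick sketch (the two distances below are illustrative, not the precise values argued over in the papers):

```python
import math

# Inferred luminosity scales as distance squared for a fixed measured flux.
def inferred_luminosity(measured_flux, assumed_distance_mpc):
    """Relative luminosity implied by a measured flux at an assumed distance."""
    return measured_flux * 4 * math.pi * assumed_distance_mpc ** 2

flux = 1.0              # arbitrary measured brightness
far, near = 20.0, 13.0  # assumed distances in megaparsecs, illustrative only
ratio = inferred_luminosity(flux, far) / inferred_luminosity(flux, near)
print(f"Moving the galaxy from {far:.0f} to {near:.0f} Mpc cuts the inferred "
      f"luminosity by a factor of {ratio:.1f}")
# A galaxy with ~2.4 times less light has ~2.4 times less stellar mass,
# which changes how fast its globular clusters "should" be moving.
```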

To put it bluntly, the astronomers don’t have enough solid data to decide this issue one way or the other. Moreover, the dispute indicates once again that the whole dark matter theory itself is based on very limited data with large margins of error. It might be the best theory we’ve got to explain the data we have, but no good scientist takes it too seriously. We just don’t know enough yet.

Hubble finds new figure for universe expansion rate

The uncertainty of science: Using data from the Hubble Space Telescope, astronomers have found evidence that the universe’s expansion rate is faster than estimated in previous measurements.

The new findings show that eight Cepheid variables in our Milky Way galaxy are up to 10 times farther away than any previously analyzed star of this kind. Those Cepheids are more challenging to measure than others because they reside between 6,000 and 12,000 light-years from Earth. To handle that distance, the researchers developed a new scanning technique that allowed the Hubble Space Telescope to periodically measure a star’s position at a rate of 1,000 times per minute, thus increasing the accuracy of the stars’ true brightness and distance, according to the statement.

The researchers compared their findings to earlier data from the European Space Agency’s (ESA) Planck satellite. During its four-year mission, the Planck satellite mapped leftover radiation from the Big Bang, also known as the cosmic microwave background. The Planck data revealed a Hubble constant between 67 and 69 kilometers per second per megaparsec. (A megaparsec is roughly 3 million light-years.)

However, the Planck data gives a constant about 9 percent lower than that of the new Hubble measurements, which estimate that the universe is expanding at 73 kilometers per second per megaparsec, therefore suggesting that galaxies are moving faster than expected, according to the statement.

“Both results have been tested multiple ways, so barring a series of unrelated mistakes, it is increasingly likely that this is not a bug but a feature of the universe,” Riess said. [emphasis mine]

I should point out that in one of the first big results from Hubble in 1995 (which also happened to be the subject of one of my early published stories), the estimate for the Hubble constant was 80 kilometers per second per megaparsec. At the time, the astronomers who did the research were very certain they had it right. Others have theorized that the number could be as low as 30 kilometers per second per megaparsec.

What is important about this number is that it determines how long ago the Big Bang is thought to have occurred. Lower numbers mean it took place farther in the past; higher numbers mean the universe is younger.
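The connection comes from the rough rule of thumb that the age of the universe is of order 1/H0, the “Hubble time” (the true age also depends on which cosmological model is assumed). A sketch of the unit conversion for the values mentioned here:

```python
KM_PER_MPC = 3.086e19       # kilometers in a megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc):
    """Rough age scale 1/H0, expressed in billions of years."""
    h0_per_second = h0_km_s_mpc / KM_PER_MPC
    return 1.0 / h0_per_second / SECONDS_PER_GYR

for h0 in (30, 67, 73, 80):
    print(f"H0 = {h0:3d} km/s/Mpc -> Hubble time ~ {hubble_time_gyr(h0):.1f} billion years")
# Higher values of the Hubble constant imply a younger universe, lower
# values an older one -- which is why the 30-to-80 range mattered so much.
```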

That scientists keep getting different results only suggests to me that they simply do not yet have enough data to lock the number down firmly.

Chinese space probe detects possible dark matter signal

The uncertainty of science: A Chinese space probe designed to measure cosmic rays has detected a pattern that could be evidence of the existence of dark matter.

Researchers launched the spacecraft from the Jiuquan Satellite Launch Center in the Gobi Desert, about 1600 kilometers west of Beijing, in December 2015. Its primary instrument—a stack of thin, crisscrossed detector strips—is tuned to observe the incoming direction, energy, and electric charge of the particles that make up cosmic rays, particularly electrons and positrons, the antimatter counterparts of electrons. Cosmic rays emanate from conventional astrophysical objects, like exploding supernovae in the galaxy. But if dark matter consists of WIMPs, these would occasionally annihilate each other and create electron-positron pairs, which might be detected as an excess over the expected abundance of particles from conventional objects.

In its first 530 days of scientific observations, DAMPE detected 1.5 million cosmic ray electrons and positrons above a certain energy threshold. When researchers plot the number of particles against their energy, they’d expect to see a smooth curve. But previous experiments have hinted at an anomalous break in the curve. Now, DAMPE has confirmed that deviation. “It may be evidence of dark matter,” but the break in the curve “may be from some other cosmic ray source,” says astrophysicist Chang Jin, who leads the collaboration at the Chinese Academy of Science’s (CAS’s) Purple Mountain Observatory (PMO) in Nanjing. [emphasis mine]

I must emphasize the large uncertainty here. They have not detected dark matter. Not even close. What they have detected is a pattern in how the spacecraft registers cosmic rays, a pattern that was predicted by the existence of dark matter. That pattern, however, could have other causes, and the consistent failure of other efforts to directly detect dark matter strengthens the possibility that something else explains this break.
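For what a “break in the curve” looks like in practice: the electron-positron spectrum is usually modeled as a power law whose slope steepens above some energy. A toy sketch, with made-up parameter values rather than DAMPE’s actual fit:

```python
import numpy as np

def broken_power_law(e_tev, norm=1.0, slope_low=3.1, slope_high=3.9, e_break=0.9):
    """Toy cosmic-ray electron spectrum (relative units): a power law whose
    slope steepens above e_break. All parameter values are illustrative."""
    below = norm * e_tev ** (-slope_low)
    above = norm * e_break ** (-slope_low) * (e_tev / e_break) ** (-slope_high)
    return np.where(e_tev < e_break, below, above)

energies = np.array([0.1, 0.3, 0.9, 1.5, 3.0])  # TeV
for e, flux in zip(energies, broken_power_law(energies)):
    print(f"{e:4.1f} TeV -> relative flux {flux:.3g}")
# A smooth break like this could come from ordinary astrophysical sources;
# a dark matter signal would have to appear as extra structure on top of it.
```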

A spot on Mars, as seen by different orbiters over the past half century

Mars as seen over the past half century

The science team of Mars Reconnaissance Orbiter (MRO) has assembled a collection of images of the same location on Mars taken by different Martian orbiters, beginning with the first flyby, by Mariner 4 in 1965, and ending with MRO’s HiRISE camera. The image on the right, reduced in resolution to post here, shows these images superimposed on that location, with resolutions ranging from 1.25 kilometers per pixel (Mariner 4) down to 50 meters per pixel (MRO).

This mosaic essentially captures the technological history of the first half century of space exploration in a single image. Mariner 4 was only able to take 22 fuzzy pictures during its fly-by. Today’s orbiters take thousands and thousands, with resolutions so sharp they can often identify small rocks and boulders.

The mosaic also illustrates well the uncertainty of science. When Mariner 4 took the first pictures some scientists thought that there might be artificially built canals on Mars. Instead, the probe showed a dead cratered world much like the Moon. Later images proved that conclusion to be wrong as well, with today’s images showing Mars to be a very complex and active world, with a geological history both baffling and dynamic. Even now, after a half century of improved observations, we still are unsure whether life there once existed, or even if it exists today.

Dark energy might not exist

The uncertainty of science: A new model for the universe that omits dark energy produces a better fit to what is known than previous theories that included it.

The new theory, dubbed timescape cosmology, includes the known lumpiness of the universe, while the older traditional models that require dark energy do not.

Timescape cosmology has no dark energy. Instead, it includes variations in the effects of gravity caused by the lumpiness in the structure in the universe. Clocks carried by observers in galaxies differ from the clock that best describes average expansion once variations within the universe (known as “inhomogeneity” in the trade) becomes significant. Whether or not one infers accelerating expansion then depends crucially on the clock used. “Timescape cosmology gives a slightly better fit to the largest supernova data catalogue than Lambda Cold Dark Matter cosmology,” says Wiltshire.

He admits the statistical evidence is not yet strong enough to definitively rule in favour of one model over the other, and adds that future missions such as the European Space Agency’s Euclid spacecraft will have the power to distinguish between differing cosmology models.

Both models rely on a very weak data set, based on assumptions about Type Ia supernovae that are likely wrong. It is thus likely that neither explains anything, as neither really has a good picture of the actual universe.

“One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”

The uncertainty of science: New research suggests that astronomers have little understanding of the supernovae that they use to estimate the distance to most galaxies, estimates they then used to discover dark energy as well as measure the universe’s expansion rate.

The exploding stars known as type Ia supernovae are so consistently bright that astronomers refer to them as standard candles — beacons that are used to measure vast cosmological distances. But these cosmic mileposts may not be so uniform. A new study finds evidence that the supernovae can arise by two different processes, adding to lingering suspicions that standard candles aren’t so standard after all.

The findings, which have been posted on the arXiv preprint server and accepted for publication in the Astrophysical Journal, could help astronomers to calibrate measurements of the Universe’s expansion. Tracking type Ia supernovae showed that the Universe is expanding at an ever-increasing rate, and helped to prove the existence of dark energy — advances that secured the 2011 Nobel Prize in Physics.

The fact that scientists don’t fully understand these cosmological tools is embarrassing, says the latest study’s lead author, Griffin Hosseinzadeh, an astronomer at the University of California, Santa Barbara. “One of the greatest discoveries of the century is based on these things and we don’t even know what they are, really.”
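For context on how a “standard candle” is used: if every type Ia supernova peaks at roughly the same absolute magnitude (about −19.3 is a commonly used figure; treat it here as an assumption), then the measured apparent magnitude gives the distance directly through the distance modulus. A minimal sketch:

```python
# Distance modulus: m - M = 5 * log10(d / 10 parsecs)
ASSUMED_PEAK_ABSOLUTE_MAG = -19.3  # assumed peak brightness of a type Ia supernova

def distance_mpc(apparent_mag, absolute_mag=ASSUMED_PEAK_ABSOLUTE_MAG):
    """Distance in megaparsecs implied by the distance modulus."""
    d_parsecs = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_parsecs / 1e6

for m in (14.0, 19.0, 24.0):
    print(f"peak apparent magnitude {m} -> ~{distance_mpc(m):.0f} Mpc")
# If the "standard" peak brightness actually varies with how the white dwarf
# explodes (the two channels discussed above), every one of these distances
# shifts -- which is why this matters for dark energy.
```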

The key to understanding this situation is to maintain a healthy skepticism about any cosmological theory or discovery, no matter how enthusiastically touted by the press and astronomers. The good astronomers do not push these theories with great enthusiasm as they know the feet of clay on which they stand. The bad ones try to use the ignorant mainstream press to garner attention, and thus funding.

For the past two decades the good astronomers have been diligently checking and rechecking the data and the supernovae used to discover dark energy. Up to now this checking seems to still suggest the universe’s expansion is accelerating on large scales. At the same time, our knowledge of supernovae remains sketchy, and thus no one should assume we understand the universe’s expansion rate with any confidence.

Climate scientists increasingly show no warming in peer-reviewed papers

The uncertainty of science: Climate scientists are increasingly publishing papers that show no clear global temperature trend.

Last year there were at least 60 peer-reviewed papers published in scientific journals demonstrating that Today’s Warming Isn’t Global, Unprecedented, Or Remarkable.

Just within the last 5 months, 58 more papers and 80 new graphs have been published that continue to undermine the popularized conception of a slowly cooling Earth temperature history followed by a dramatic hockey-stick-shaped uptick, or an especially unusual global-scale warming during modern times.

Yes, some regions of the Earth have been warming in recent decades or at some point in the last 100 years. Some regions have been cooling for decades at a time. And many regions have shown no significant net changes or trends in either direction relative to the last few hundred to thousands of years.

Succinctly, then, scientists publishing in peer-reviewed journals have increasingly affirmed that there is nothing historically unprecedented or remarkable about today’s climate when viewed in the context of long-term natural variability. [emphasis in original]

At the link are 80 graphs from the most recent papers. Take a look. If you are convinced that the climate is warming then you must come up with an explanation for this data. Or you can put your fingers in your ears, cover your eyes, and chant “La-la-la-la-la-la-la-la-la-la-la-la-la-la-la-la!” as loud as you can so that you don’t have to deal with it.

The oldest fossils ever?

Scientists think they have found the oldest fossils ever in Canada.

Scientists say they have found the world’s oldest fossils, thought to have formed between 3.77bn and 4.28bn years ago. Comprised of tiny tubes and filaments made of an iron oxide known as haematite, the microfossils are believed to be the remains of bacteria that once thrived underwater around hydrothermal vents, relying on chemical reactions involving iron for their energy.

If correct, these fossils offer the oldest direct evidence for life on the planet. And that, the study’s authors say, offers insights into the origins of life on Earth. “If these rocks do indeed turn out to be 4.28 [bn years old] then we are talking about the origins of life developing very soon after the oceans formed 4.4bn years ago,” said Matthew Dodd, the first author of the research from University College, London.

This discovery reminds me of the Mars fossils discovered in the late 1990s. There were enormous uncertainties with that discovery, all of which eventually caused most scientists in the field to reject the result. The same thing could be the case here.

Still in Dallas. I hope to get caught up tomorrow.

MRI software bug invalidates 40,000 research papers

The uncertainty of science: A bug just discovered in the computer software used in fMRI scans to measure brain activity could invalidate 15 years of research and 40,000 science papers.

They tested the three most popular fMRI software packages for fMRI analysis – SPM, FSL, and AFNI – and while they shouldn’t have found much difference across the groups, the software resulted in false-positive rates of up to 70 percent. And that’s a problem, because as Kate Lunau at Motherboard points out, not only did the team expect to see an average false positive rate of just 5 percent, it also suggests that some results were so inaccurate, they could be indicating brain activity where there was none.

“These results question the validity of some 40,000 fMRI studies and may have a large impact on the interpretation of neuroimaging results,” the team writes in PNAS. The bad news here is that one of the bugs the team identified has been in the system for the past 15 years, which explains why so many papers could now be affected. [emphasis mine]

The research the article describes is focused entirely on the problems the software causes for past research. It makes no mention of the problems this software bug might cause for actual medical diagnosis. Was the treatment of any patients affected by this bug? It does not say.
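For readers wondering what the expected “5 percent” false-positive rate refers to, here is a generic illustration of that statistical baseline. It is not the fMRI cluster analysis itself, just a simulation of pure noise tested at the conventional significance threshold:

```python
import random

random.seed(0)

def null_test():
    """One measurement of pure noise, declared 'significant' if |z| > 1.96."""
    return abs(random.gauss(0, 1)) > 1.96

trials = 100_000
false_positives = sum(null_test() for _ in range(trials))
print(f"{100 * false_positives / trials:.1f}% of pure-noise tests came up positive")
# Roughly 5 percent, as the threshold is designed to allow. The bug meant
# the analogous rate for fMRI cluster inference could reach 70 percent.
```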

New data challenges consensus on galaxy formation

The uncertainty of science: A new study has found that the accepted consensus for the formation of large elliptical galaxies does not work, and that, rather than forming from the merger of smaller spiral galaxies, ellipticals formed in place from the material at hand.

From the press release [pdf]:

“We started from the data, available in complete form only for the closer galaxies and in incomplete form for the more distant ones, and we filled the ‘gaps’ by interpreting and extending the data based on a scenario we devised” comments Mancuso. The analysis also took into account the phenomenon of gravitational lensing, which allows us to observe very distant galaxies belonging to ancient cosmic epochs.

In this “direct” manner (i.e., model-independent) the SISSA group obtained an image of the evolution of galaxies even in very ancient epochs (close, in a cosmic timescale, to the epoch of reionization). This reconstruction demonstrates that elliptical galaxies cannot have formed through the merging of other galaxies, “simply because there wasn’t enough time to accumulate the large quantity of stars seen in these galaxies through these processes”, comments Mancuso. “This means that the formation of elliptical galaxies occurs through internal, in situ processes of star formation.”

The important take-away of this result is that it shows that the present theory of galaxy formation, where smaller spiral galaxies merge to form larger elliptical galaxies, does not fit the data. And if a theory does not fit the data, it must be abandoned.
