Conflict in Hubble constant continues to confound astronomers

The uncertainty of science: In reviewing their measurements of the Hubble constant, made using a variety of proxy distance tools such as distant supernovae, astronomers recently announced that their numbers must be right, even though those numbers do not match the Hubble constant measured using completely different tools.

Most measurements of the current expansion rate of the universe (called the Hubble constant, or H0) based on stars and other objects relatively close to Earth give a rate of 73 km/s/Mpc. These are referred to as “late-time” measurements [the same value confirmed by the astronomers in the above report]. On the other hand, early-time measurements, which are based on the cosmic microwave background emitted just 380,000 years after the Big Bang, give a smaller rate of 68 km/s/Mpc.
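
To get a feel for what these numbers mean, the Hubble constant can be inverted into a crude age estimate for the universe, the so-called Hubble time. Below is a minimal sketch of that arithmetic (my own illustration, not from the report); it assumes a constant expansion rate, so it is only a rough guide:

```python
# Convert a Hubble constant in km/s/Mpc into a naive age estimate (the
# "Hubble time"), assuming the expansion rate never changed.
KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one billion years

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    h0_per_sec = h0_km_s_mpc / KM_PER_MPC  # H0 in units of 1/seconds
    return 1.0 / h0_per_sec / SEC_PER_GYR

print(f"{hubble_time_gyr(73.0):.1f} Gyr")  # ~13.4, from the late-time value
print(f"{hubble_time_gyr(68.0):.1f} Gyr")  # ~14.4, from the early-time value
```

A five km/s/Mpc disagreement thus translates into roughly a billion years of naive universe age, which is why the discrepancy matters.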

They can’t both be right. Either something is wrong with the standard cosmological model for our universe’s evolution, upon which the early-time measurements rest, or something is wrong with the way scientists are working with late-time observations.

The astronomers are now claiming that their late-time observations must be right, which really means either that something about the present theories of the Big Bang is fundamentally wrong and our understanding of early cosmology is very incomplete, or that everyone’s measurements are faulty.

Based on the number of assumptions used in both measurements, it is not surprising the results don’t match. Some of those assumptions are certainly wrong, but correcting the error will require a lot more data, which will only become available when astronomers have much bigger telescopes of all kinds, both on the ground and in space. Their present tools are insufficient for untangling this mystery.

Astronomers discover galaxy with no dark matter

The uncertainty of science: Astronomers have detected a galaxy about 250 million light years away that shows no evidence of any dark matter, a phenomenon that defies the accepted theories about dark matter.

The galaxy in question, AGC 114905, is about 250 million light-years away. It is classified as an ultra-diffuse dwarf galaxy, with the name ‘dwarf galaxy’ referring to its luminosity and not to its size. The galaxy is about the size of our own Milky Way but contains a thousand times fewer stars. The prevailing idea is that all galaxies, and certainly ultra-diffuse dwarf galaxies, can only exist if they are held together by dark matter.
Galaxy AGC 114905

The researchers collected data on the rotation of gas in AGC 114905 for 40 hours between July and October 2020 using the VLA telescope. Subsequently, they made a graph showing the distance of the gas from the center of the galaxy on the x-axis and the rotation speed of the gas on the y-axis. This is a standard way to reveal the presence of dark matter. The graph shows that the motions of the gas in AGC 114905 can be completely explained by just normal matter.

“This is, of course, what we thought and hoped for because it confirms our previous measurements,” says Pavel Mancera Piña. “But now the problem remains that the theory predicts that there must be dark matter in AGC 114905, but our observations say there isn’t. In fact, the difference between theory and observation is only getting bigger.”

The evidence for dark matter in almost all galaxies is the motion of gas and stars at their outer perimeters. Routinely they move faster than expected based merely on the visible ordinary matter. To account for the faster speeds, astronomers beginning in the late 1950s invented dark matter, an invisible material with enough mass to increase the speeds of objects and gas in the outer regions of galaxies.
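
That expectation comes from simple Newtonian gravity: the circular speed at radius r enclosing visible mass M should be v = sqrt(GM/r). Here is a minimal sketch of the comparison, with made-up numbers chosen purely for illustration:

```python
import math

G = 4.30091e-6  # gravitational constant in kpc * (km/s)^2 / Msun

def expected_speed(visible_mass_msun: float, radius_kpc: float) -> float:
    """Circular speed predicted from visible matter alone: v = sqrt(G*M/r)."""
    return math.sqrt(G * visible_mass_msun / radius_kpc)

# Hypothetical galaxy: if gas at 10 kpc orbits much faster than this,
# unseen mass (dark matter) is invoked; if not, none is needed.
print(f"{expected_speed(1.4e9, 10.0):.1f} km/s")  # ~24.5 km/s
```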

That astronomers are increasingly finding galaxies with no evidence of dark matter, based on rotation speeds, only makes this mystery all the more baffling.

Galaxies in the early universe don’t fit the theories

The uncertainty of science: New data from both the ALMA telescope in Chile and the Hubble Space Telescope about six massive galaxies in the early universe suggest that there are problems and gaps in the presently accepted theories about the universe’s formation.

Early massive galaxies—those that formed in the three billion years following the Big Bang—should have contained large amounts of cold hydrogen gas, the fuel required to make stars. But scientists observing the early Universe with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Hubble Space Telescope have spotted something strange: half a dozen early massive galaxies that ran out of fuel. The results of the research are published today in Nature.

Known as “quenched” galaxies—or galaxies that have shut down star formation—the six galaxies selected for observation from the REsolving QUIEscent Magnified galaxies at high redshift, or the REQUIEM survey, are inconsistent with what astronomers expect of the early Universe.

It was expected that the early universe would have had lots of that cold hydrogen for making stars. For some galaxies to lack that gas is inexplicable, and raises questions about the assumptions inherent in the theory of the Big Bang. It doesn’t disprove the theory; it simply makes it harder to fit the facts to it, suggesting — as is always the case — that reality is far more complicated than the theories of scientists.

Scientists: Clay, not liquid water, explains radar data under Martian south icecap

The uncertainty of science: In a new paper scientists claim that clay materials, not liquid water, better explain the radar data obtained by orbiting satellites that was initially hypothesized to indicate liquid water lakes under Mars’ south polar icecap.

Sub-glacial lakes were first reported in 2018 and caused a big stir because of the potential for habitability on Mars. Astrobiologists and non-scientists were equally attracted to the exciting news. Now, the solution to this question, with great import to the planetary science community, may be much more mundane than bodies of water on Mars.

The strength of this new study is the diversity of techniques employed. “Our study combined theoretical modeling with laboratory measurements and remote sensing observations from The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) instrument on NASA’s Mars Reconnaissance Orbiter. All three agreed that smectites can make the reflections and that smectites are present at the south pole of Mars. It’s the trifecta: measure the material properties, show that the material properties can explain the observation, and demonstrate that the materials are present at the site of the observation,” Smith said.

This paper is only one of several recently that have popped the balloon on the liquid lake theory. Nothing is actually proven, but the weight of evidence is definitely moving away from liquid water under the south polar icecap.

An astrophysicist explains cosmology’s theoretical failures

Link here. The astrophysicist, Paul Sutter, does a very nice job of outlining the conundrum that has been causing astrophysicists to tear their hair out for the past decade-plus.

In the two decades since astronomers discovered dark energy, we’ve come upon a little hitch: Measurements of the expansion rate of the universe (and so its age) from both the CMB [cosmic microwave background] and supernovas have gotten ever more precise, but they’re starting to disagree. We’re not talking much; the two methods are separated by only 10 million or 20 million years in estimating the 13.77-billion-year history of the universe. But we’re operating at such a level of precision that it’s worth talking about.

If anything, this disagreement between two measurements based on data spanning billions of light-years — billions in both time and space — is a perfect illustration of the uncertainty of science. Astrophysicists are trying to come up with answers based on data that is quite thin, has many gaps in knowledge, and carries with it many assumptions. It is therefore actually surprising that these two numbers agree as well as they do.

Sutter, being in the CMB camp, puts most of the blame for this failure on the uncertainty of what we know about supernovae. He could very well be right. The assumptions about supernovae used to measure the expansion rate of the universe are many. There are also a lot of gaps in our knowledge, including a full understanding of the process that produces supernovae.

Sutter, however, I think puts too much faith in the theoretical conclusions of the astrophysics community that determined the age of the universe based on the CMB. The uncertainties here are just as great. Good scientists should remain skeptical of this as well. Our knowledge of physics is still incomplete. Physicists really don’t know all the answers, yet.

In the end, however, Sutter does pin down the biggest problem in cosmology:

The “crisis” is a good excuse to keep writing papers, because we’ve been stumped by dark energy for over two decades, with a lot of work and not much understanding. In a sense, many cosmologists want to keep the crisis going, because as long as it exists, they have something to talk about other than counting down the years to the next big mission.

In other words, the discussion now is sometimes less about science, theories, and cosmology than about funding and career promotion. What a shock!

Scientists successfully predict resumption of bursts from magnetar

The uncertainty of science: Though they have no real idea why it happens, scientists have now successfully predicted the resumption of energetic bursts coming from a magnetar, right on schedule.

The researchers — Grossan and theoretical physicist and cosmologist Eric Linder from SSL and the Berkeley Center for Cosmological Physics and postdoctoral fellow Mikhail Denissenya from Nazarbayev University in Kazakhstan — discovered the pattern last year in bursts from a soft gamma repeater, SGR1935+2154, that is a magnetar, a prolific source of soft or lower energy gamma ray bursts and the only known source of fast radio bursts within our Milky Way galaxy. They found that the object emits bursts randomly, but only within regular four-month windows of time, each active window separated by three months of inactivity.

On March 19, the team uploaded a preprint claiming “periodic windowed behavior” in soft gamma bursts from SGR1935+2154 and predicted that these bursts would start up again after June 1 — following a three month hiatus — and could occur throughout a four-month window ending Oct. 7.

On June 24, three weeks into the window of activity, the first new burst from SGR1935+2154 was observed after the predicted three month gap, and nearly a dozen more bursts have been observed since, including one on July 6.

They made this prediction based on data going back to 2014 that showed the three-month-off/four-month-on pattern.
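
The underlying test is simple to sketch: fold each burst time onto a trial cycle and ask whether all the bursts land inside the same fraction of that cycle. The code below is my own toy version of the idea, not the team's pipeline; the cycle length and burst times are illustrative only:

```python
CYCLE_DAYS = 213.0        # ~3 months off plus 4 months on, per the article
ACTIVE_FRACTION = 4 / 7   # active window: 4 months out of every 7

def in_active_window(burst_day: float, cycle_start: float = 0.0) -> bool:
    """True if a burst time (days since some epoch) falls in the on-window."""
    phase = ((burst_day - cycle_start) % CYCLE_DAYS) / CYCLE_DAYS
    return phase < ACTIVE_FRACTION

# Invented burst times; a real analysis scans over trial periods, window
# widths, and start epochs, and assesses the statistical significance.
bursts = [12.0, 55.5, 101.3, 225.0, 310.7]
print([in_active_window(t) for t in bursts])
```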

As to why this pattern exists, they presently have no idea. Theories have been proposed, such as starquakes activated by the magnetar’s fast rotation or blocking clouds of gas, but none is really very convincing or backed by enough data.

Gravitational wave detectors see two different black holes as they swallowed a neutron star

Astronomers using three different gravitational wave detectors have seen the gravity ripples caused when two different black holes each swallowed a nearby neutron star.

The two gravitational-wave events, dubbed GW200105 and GW200115, rippled through detectors only 10 days apart, on January 5, 2020, and January 15, 2020, respectively.

Each merger involved a fairly small black hole (less than 10 Suns in heft) paired with an object between 1½ and 2 solar masses — right in the expected range for neutron stars. Observers caught no glow from the collisions, but given that both crashes happened roughly 900 million light-years away, spotting a flash was improbable, even if one happened — and it likely didn’t: The black holes are large enough that they would have gobbled the neutron stars whole instead of ripping them into bite-size pieces.

Note the time between the detections, in early 2020, and their announcement now, in mid-2021. The data is very complex and filled with a lot of noise, requiring many months of analysis to determine if a detection was made. For example, in a third case one detector was thought to have seen another such merger, but scientists remain unsure. It might simply be noise in the system. I point this out to emphasize that though they are much more confident in these new detections, there remains some uncertainty.

New data confirms lack of dark matter in one galaxy

The uncertainty of science: Astronomers have strengthened their evidence that one particular nearby galaxy is completely devoid of dark matter, a situation that challenges the existing theories about dark matter which suggest it comprises the bulk of all matter in the universe.

The astronomers had made their first claim that this galaxy, NGC 1052-DF2, lacked dark matter back in 2018, a claim that was strongly disputed by others.

The claim however would only hold up if the galaxy’s distance from Earth was as far away as they then estimated, 65 million light-years (not the 42 million light-years estimated by others). If it were closer, as other scientists insisted, then NGC 1052-DF2 likely did have dark matter, and the theorists could sleep at night knowing that their theory about dark matter was right.

To test their claim, the astronomers used the Hubble Space Telescope to get a better, more tightly constrained estimate of the distance, and discovered the galaxy was even farther away than previously believed.

Team member Zili Shen, from Yale University, says that the new Hubble observations help them confirm that DF2 is not only farther from Earth than some astronomers suggest, but also slightly more distant than the team’s original estimates.

The new estimate places DF2 at 72 million light-years, as opposed to the 42 million light-years reported by other independent teams. This also puts the galaxy farther away than the original 2018 Hubble estimate of 65 million light-years.

So, does this discovery invalidate the theories about dark matter? Yes and no. The theories now have to account for the existence of galaxies with no dark matter. Previously it was assumed that dark matter was to be found as blobs at the locations of all galaxies. Apparently it is not.

However, the lack of dark matter at this one galaxy does not prove that dark matter is not real. As noted by the lead astronomer in this research,

“In our 2018 paper, we suggested that if you have a galaxy without dark matter, and other similar galaxies seem to have it, that means that dark matter is actually real and it exists,” van Dokkum said. “It’s not a mirage.”

Ah, the uncertainty of science. Isn’t it wonderful?

New data suggests muon is more magnetic than predicted

The uncertainty of science: New data now suggests that the subatomic particle called the muon is slightly more magnetic than predicted by the standard model of particle physics, a result that, if confirmed, will require a major rethinking of that standard model.

In 2001, researchers with the Muon g-2 experiment, then at Brookhaven, reported that the muon was a touch more magnetic than the standard model predicts. The discrepancy was only about 2.5 times the combined theoretical and experimental uncertainties. That’s nowhere near physicists’ standard for claiming a discovery: 5 times the total uncertainty. But it was a tantalizing hint of new particles just beyond their grasp.

So in 2013, researchers hauled the experiment to Fermi National Accelerator Laboratory (Fermilab) in Illinois, where they could get purer beams of muons. By the time the revamped experiment started to take data in 2018, the standard model predictions of the muon’s magnetism had improved and the difference between the experimental results and theory had risen to 3.7 times the total uncertainty.

Now, the g-2 team has released the first result from the revamped experiment, using 1 year’s worth of data. And the new result agrees almost exactly with the old one, the team announced today at a symposium at Fermilab. The concordance shows the old result was neither a statistical fluke nor the product of some undetected flaw in the experiment, says Chris Polly, a Fermilab physicist and co-spokesperson for the g-2 team. “Because I was a graduate student on the Brookhaven experiment, it was certainly an overwhelming sense of relief for me,” he says.

Together, the new and old results widen the disagreement with the standard model prediction to 4.2 times the experimental and theoretical errors.

That result is still not five times the combined uncertainty — the faux standard physicists apparently use to separate a simple margin of error from a true discovery — but it is almost that high, has been found consistently in repeated tests, and appears to be an unexplained discrepancy.
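
For those unfamiliar with the jargon: the “times the uncertainty” figure is simply the gap between experiment and theory divided by their combined error bars. A minimal sketch, with made-up numbers rather than the actual g-2 values:

```python
import math

def significance(experiment: float, theory: float,
                 sigma_exp: float, sigma_th: float) -> float:
    """Discrepancy in 'sigma': gap divided by combined uncertainty."""
    combined = math.sqrt(sigma_exp**2 + sigma_th**2)
    return abs(experiment - theory) / combined

# Illustrative values only: a gap of 3 units with 0.5-unit errors on each
# side works out to about 4.2 sigma, short of the 5-sigma discovery bar.
print(f"{significance(8.0, 5.0, 0.5, 0.5):.1f} sigma")
```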

Not that I take any of this too seriously. If you read the entire article, you will understand. There are so many areas of uncertainty, both in the data and in the theories this research is founded on, that the wise course is to treat it all with a great deal of skepticism. For example, the anomaly reported involves only 2.5 parts in 1 billion. This data is definitely telling us something, but it is so close to the edge of the infinitesimal that one shouldn’t trust it deeply.

Scientists: Mars is losing water seasonally through its atmosphere

The uncertainty of science: Two new studies using data from Europe’s Trace Gas Orbiter and Mars Express orbiters have found that Mars is losing water seasonally through its atmosphere.

The studies also found that global dust storms accelerate the process.

Anna and colleagues found that water vapour remained confined to below 60 km when Mars was far from the Sun but extended up to 90 km in altitude when Mars was closest to the Sun. Across a full orbit, the distance between the Sun and the Red Planet ranges from 207 million to 249 million km.

Near the Sun, the warmer temperatures and more intensive circulation in the atmosphere prevented water from freezing out at a certain altitude. “Then, the upper atmosphere becomes moistened and saturated with water, explaining why water escape rates speed up during this season – water is carried higher, aiding its escape to space,” adds Anna.

In years when Mars experienced a global dust storm the upper atmosphere became even wetter, accumulating water in excess at altitudes of over 80 km.

But wait, didn’t planetary scientists just announce that Mars hasn’t lost its water through the atmosphere, but instead lost it when it became chemically trapped in the planet’s soil? Yup, they did, but that was a model based on new ground data. This new result is based on atmospheric data.

Or to put it another way, the model was incomplete. While it could be true that a large bulk of Mars’ water is trapped chemically in the ground, that is not proven, only hypothesized. What has been proven, and is now confirmed by these two studies, is that, depending on weather and season, the water of Mars does leak into its upper atmosphere where it can escape into space, never to return.

What remains unknown is how much water escaped into space, and when. Moreover, the ground-based model could still be right, even if it is true that Mars is losing water through its atmosphere. At the moment the data is too incomplete to answer these questions with any certainty.

Meanwhile, this press release once again gives the false impression that the only water left on Mars is at its poles (and in this case, only the south pole). This is not accurate, based on numerous studies finding evidence of buried ice and glaciers everywhere on the planet down to 30 degrees latitude, in both the northern and southern hemispheres. Mars might have far less water now than it did billions of years ago, but it still has plenty, and that water is not found only at the poles.

New analysis: It wasn’t even phosphine detected at Venus

The uncertainty of science: A new analysis of the data used by scientists who claimed in September that they had detected phosphine in the atmosphere of Venus has concluded that it wasn’t phosphine at all but sulfur dioxide, a chemical compound long known to be prevalent there.

The UW-led team shows that sulfur dioxide, at levels plausible for Venus, can not only explain the observations but is also more consistent with what astronomers know of the planet’s atmosphere and its punishing chemical environment, which includes clouds of sulfuric acid. In addition, the researchers show that the initial signal originated not in the planet’s cloud layer, but far above it, in an upper layer of Venus’ atmosphere where phosphine molecules would be destroyed within seconds. This lends more support to the hypothesis that sulfur dioxide produced the signal.

When the first announcement was made, it was also noted as an aside that phosphine on Earth is only found in connection with life processes, thus wildly suggesting that it might signal the existence of life on Venus.

That claim was always unjustified, especially because we know so little about Venus’s atmosphere and its alien composition. Even if there was phosphine there, to assume it came from life is a leap well beyond reasonable scientific theorizing.

It now appears that the phosphine detection itself was questionable, which is not surprising since the detection amounted to about 20 molecules out of a billion. And while this new analysis might be correct, what it really does is illustrate how tentative our knowledge of Venus remains. It might be right, but it could also be wrong and the original results correct. There are simply too many uncertainties and gaps in our knowledge to come to any firm and confident conclusions.

None of that mattered with our modern press corps, which ran like mad to tout the discovery of life on Venus. As I wrote quite correctly in September in my original post about the first results,

The worst part of this is that we can expect our brainless media to run with these claims, without the slightest effort of incredulity.

We live in a world of make-believe and made-up science. Data is no longer important, only the leaps of fantasy we can jump to based on the slimmest of facts. It was this desire to push theories rather than knowledge that locked humanity into a dark age for centuries during the Middle Ages. It is doing it again, now, and the proof is all around you: people acting like zombies and sheep, wearing masks based not on any proven science but on pure emotions.

Sunspot update: December sunspot activity once again higher than predicted

The uncertainty of science: It is time to once again take a look at the state of the Sun’s on-going sunspot cycle. Below is NOAA’s January 1, 2021 monthly graph, documenting the Sun’s monthly sunspot activity and annotated by me to show previous solar cycle predictions.

The ramp up to solar maximum continued in December. Though there was a drop from the very high activity seen in November, the number of sunspots in December still far exceeded the prediction as indicated by the red curve.


New data makes past nova too bright, but not bright enough to be a supernova

The uncertainty of science: Astronomers, using new data from the Gemini North ground-based telescope, have found that a star that brightened in 1670 and was labeled a nova is much farther away than previously thought, which means the 1670 eruption was far too powerful for a nova, but not powerful enough to be a supernova.

By measuring both the speed of the nebula’s expansion and how much the outermost wisps had moved during the last ten years, and accounting for the tilt of the nebula on the night sky, which had been estimated earlier by others, the team determined that CK Vulpeculae lies approximately 10,000 light-years distant from the Sun — about five times as far away as previously thought. That implies that the 1670 explosion was far brighter, releasing roughly 25 times more energy than previously estimated. This much larger estimate of the amount of energy released means that whatever event caused the sudden appearance of CK Vulpeculae in 1670 was far more violent than a simple nova.

“In terms of energy released, our finding places CK Vulpeculae roughly midway between a nova and a supernova,” commented Evans. “[T]he cause — or causes — of the outbursts of this intermediate class of objects remain unknown. I think we all know what CK Vulpeculae isn’t, but no one knows what it is.”
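
The jump from five times farther to roughly 25 times more energy is just the inverse-square law: the observed brightness fixes the flux at Earth, and the luminosity required to produce that flux grows with the square of the distance. A quick check with rounded numbers implied by the article:

```python
# For a fixed observed flux F, luminosity scales as distance squared
# (L = 4*pi*d^2*F). Distances are rounded, for illustration only.
old_distance_ly = 2_000.0    # roughly the earlier estimate implied
new_distance_ly = 10_000.0   # the new Gemini North result
print((new_distance_ly / old_distance_ly) ** 2)  # 25.0
```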

Recent research has also suggested that the cause of the eruption was not the interaction of a binary system containing one normal star and a white dwarf, as believed for decades, but possibly a binary system with a brown dwarf, a red giant star, or two normal stars. All remain possible; none, however, has been confirmed.

Hubble sees too much infrared energy from gamma ray burst

The uncertainty of science: During a short gamma ray burst (GRB) observed in a distant galaxy in May, astronomers were baffled when measurements from the Hubble Space Telescope detected ten times more near infrared energy than predicted for this type of GRB.

GRBs fall into two classes. First there are the long bursts, which are thought to form from the collapse of a massive star into a black hole, resulting in a powerful supernova and GRB. Second there are the short bursts, which scientists think occur when two neutron stars merge.

The problem with this GRB is that though it was short and somewhat similar to other short GRBs across most wavelengths, in the near infrared Hubble detected far too much energy.

“These observations do not fit traditional explanations for short gamma-ray bursts,” said study leader Wen-fai Fong of Northwestern University in Evanston, Illinois.

…Fong and her team have discussed several possibilities to explain the unusual brightness that Hubble saw. While most short gamma-ray bursts probably result in a black hole, the two neutron stars that merged in this case may have combined to form a magnetar, a supermassive neutron star with a very powerful magnetic field. “You basically have these magnetic field lines that are anchored to the star that are whipping around at about a thousand times a second, and this produces a magnetized wind,” explained Laskar. “These spinning field lines extract the rotational energy of the neutron star formed in the merger, and deposit that energy into the ejecta from the blast, causing the material to glow even brighter.”

What is intriguing about their theory is that this merger of two neutron stars simply resulted in a larger neutron star, not a black hole. This new neutron star was also a magnetar and pulsar, but unlike a black hole, it was a still-visible physical object. And yet its creation in this GRB produced far more energy than expected.

When GRBs were first discovered, I was always puzzled why so many astronomers seemed to insist there must be a single explanation for them. With time, when two classes of GRBs were discovered, this assumption was then replaced with the equally puzzling insistence that only two types of events explained them.

It seemed to me that such explosions had too many potential variables, and could easily have a wide range of causes, though all related to the destruction or merger of massive stars. As the data continues to accumulate, this appears increasingly to be the case.

Midnight repost: A scientist’s ten commandments

The tenth anniversary retrospective of Behind the Black continues: The post below, from September 27, 2010, reports on one of the simplest but most profound scientific papers I have ever read. Its advice is doubly needed today, especially commandment #3.

————————–
A scientist’s ten commandments

Published today on the astro-ph website, this preprint by Ignacio Ferrín of the Center for Fundamental Physics at the University of the Andes, Merida, Venezuela, is probably the shortest paper I have ever seen. I think that Dr. Ferrín will forgive me if I reprint it here in its entirety:

1. Go to your laboratory or your instrument without any pre-conceived ideas. Just register what you saw faithfully.

2. Report promptly and scientifically. Check your numbers twice before submitting.

3. Forget about predictions. They are maybe wrong.

4. Do not try to conform or find agreement with others. You may be the first to be observing a new phenomenon and you may risk missing credit for the discovery.

5. Criticism must be scientific, respectful, constructive, positive, and unbiased. Otherwise it must be done privately.

6. If you want to be respected, respect others first. Do not use insulting or humiliating words when referring to others. It is not in accord with scientific ethics.

7. Do not cheat. Cheating in science is silly. When others repeat your experiment or observation, they will find that you were wrong.

8. If you do not know or have made a mistake, admit it immediately. You may say, “I do not know but I will find out.” or “I will correct it immediately.” No scientist knows the answer to everything. By admitting it you are being honest about your knowledge and your abilities.

9. Do not appropriate or ignore other people’s work or results. Always give credit to others, however small their contribution may have been. Do not do unto others what you would not like to be done unto you.

10. Do not stray from scientific ethics.

It seems that some scientists in the climate field (Phil Jones of East Anglia University and Michael Mann of Pennsylvania State University are two that come to mind immediately) would benefit by reading and following these rules.

Midnight repost: The absolute uncertainty of climate science

The tenth anniversary retrospective of Behind the Black continues: Tonight’s repost adds more weight to yesterday’s about the uncertainty of any model predicting global warming. Rather than look at the giant gaps in our knowledge, this essay, posted on January 28, 2019, looked at the data tampering that government scientists are doing to their global temperature databases in order to make the past appear cooler and the present appear warmer.

——————————-
The absolute uncertainty of climate science

Even as the United States is being plunged right now into an epic cold spell (something that has been happening repeatedly for almost all the winters of the past decade), and politicians continue to rant about the coming doom due to global warming, none of the data allows anyone the right to make any claims about the future global climate, in any direction.

Why do I feel so certain I can make this claim of uncertainty? Because the data simply isn’t there. And where we do have it, it has been tampered with so badly it is no longer very trustworthy. This very well documented post by Tony Heller proves this reality, quite thoroughly.

First, until the late 20th century, we simply do not have good reliable climate data for the southern hemisphere. Any statement by anyone claiming to know with certainty what the global temperature was prior to 1978 (when the first Nimbus climate satellite was launched) should be treated with some skepticism. Take a look at all the graphs Heller posts, all from reputable science sources, all confirming my own essay on this subject from 2015. The only regions where temperatures were thoroughly measured prior to satellite data was in the United States, Europe, and Japan. There are scattered data points elsewhere, but not many, with none in the southern oceans. And while we do have a great deal of proxy data that provides some guidance as to the global temperature prior to the space age, strongly suggesting there was a global warm period around the year 1000 AD, and a global cold period around 1600 AD, this data also has a lot of uncertainty, so it is entirely reasonable to express some skepticism about it.

Second, the data in those well-covered regions have been tampered with extensively, and always in a manner that reinforces the theory of global warming. Actual temperature readings have been adjusted everywhere, always to cool the past and warm the present, as Heller documents.

Midnight repost: The uncertainty of climate science

The tenth anniversary retrospective of Behind the Black continues: Tonight’s repost, from 2015, can be considered a follow-up to yesterday’s. While many global warming activists are absolutely certain the climate is warming — to the point of considering murder of their opponents a reasonable option — the actual available data is so far from certain as to be almost ludicrous.

——————————-
The uncertainty of climate science

For the past five years, I have been noting on this webpage the large uncertainties that still exist in the field of climate science. Though we have solid evidence of an increase of carbon dioxide in the atmosphere, we also have no idea what the consequences of that increase are going to be. It might cause the atmosphere to warm, or it might not. It might harm the environment, or it might instead spur plant life growth that will invigorate it. The data remains inconclusive. We really don’t even know if the climate is truly warming, and even if it is, whether CO2 is causing that warming.

While government scientists at NASA and NOAA are firmly in the camp that claims increasing carbon dioxide will cause worldwide disastrous global warming, their own data, when looked at coldly, reveals that they themselves don’t have sufficient information to make that claim. In fact, they don’t even have sufficient information to claim they know whether the climate is warming or cooling! My proof? Look at the graph below, produced by NOAA’s own National Centers for Environmental Information.

Astronomers claim discovery of six exomoons

The uncertainty of science: Astronomers are now claiming they have detected evidence of the existence of six exomoons, each orbiting a transiting exoplanet around a different star.

“These exomoon candidates are so small that they can’t be seen from their own transits. Rather, their presence is given away by their gravitational influence on their parent planet,” Wiegert said.

If an exoplanet orbits its star undisturbed, the transits it produces occur precisely at fixed intervals.

But for some exoplanets, the timing of the transits is variable, sometimes occurring several minutes early or late. Such transit timing variations – known as TTVs – indicate the gravity of another body. That could mean an exomoon or another planet in the system is affecting the transiting planet.
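
To make the method concrete, here is a minimal sketch, with invented numbers, of how TTVs are extracted: fit a fixed-interval ephemeris to the observed transit times, then inspect the leftover offsets:

```python
# Invented transit times in days; real data would carry error bars.
observed = [0.0, 10.004, 19.998, 29.995, 40.002]

# Fit the simplest fixed-interval ephemeris: the mean period.
period = (observed[-1] - observed[0]) / (len(observed) - 1)

# Residuals in minutes: how early or late each transit arrived.
residuals = [(t - i * period) * 24 * 60 for i, t in enumerate(observed)]
print([f"{r:+.1f} min" for r in residuals])
# A moon or extra planet would make these offsets swing systematically
# early and late, rather than scatter randomly around zero.
```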

What they have basically done is apply the technique used to identify exoplanet candidates when the planet does NOT transit the star (the wobble caused by gravity, indicated by spectral changes), and look for the same kind of variations in these transiting exoplanets.

This is fun stuff, but it is so uncertain as to be almost laughable. If you read the press release closely, you will discover that their work has been submitted for publication, but has not yet even been peer reviewed.

Their concept is good, but I would not pay much attention to these “results.”

Astronomers discover giant arc spanning a third of the night sky

Astronomers have discovered a giant arc of hydrogen gas near the Big Dipper that spans a third of the night sky and is thought to be the leftover shockwave from a supernova.

Ultraviolet and narrowband photography have captured the thin and extremely faint trace of hydrogen gas arcing across 30°. The arc, presented at the recent virtual meeting of the American Astronomical Society, is probably the pristine shockwave expanding from a supernova that occurred some 100,000 years ago, and it’s a record-holder for its sheer size on the sky.

Andrea Bracco (University of Paris) and colleagues came upon the Ursa Major Arc serendipitously when looking through the ultraviolet images archived by NASA’s Galaxy Evolution Explorer (GALEX). They were looking for signs of a straight, 2° filament that had been observed two decades ago — but they found out that that length of gas was less straight than they thought, forming instead a small piece of a much larger whole.

This is a great illustration of the uncertainty of science. Earlier observations spotted only 2 degrees of this arc, and astronomers thus thought it was a straight filament. Newer, more sophisticated observations show that this first conclusion was in error: the structure is much bigger, and curved.

I wonder what even more and better observations would reveal.

Rethinking the theories that explain some supernovae

The uncertainty of science: New data now suggests that the previous consensus among astronomers that type Ia supernovae were caused by the interaction of a large red giant star with a white dwarf might be wrong, and that instead the explosion might be triggered by two white dwarfs.

If this new origin theory turns out to be correct, then it might also throw a big wrench into the theory of dark energy.

The evidence that twin white dwarfs drive most, if not all, type Ia supernovae, which account for about 20% of the supernova blasts in the Milky Way, “is more and more overwhelming,” says Dan Maoz, director of Tel Aviv University’s Wise Observatory, which tracks fast-changing phenomena such as supernovae. He says the classic scenario of a white dwarf paired with a large star such as a red giant “doesn’t happen in nature, or quite rarely.”

Which picture prevails has impacts across astronomy: Type Ia supernovae play a vital role in cosmic chemical manufacturing, forging in their fireballs most of the iron and other metals that pervade the universe. The explosions also serve as “standard candles,” assumed to shine with a predictable brightness. Their brightness as seen from Earth provides a cosmic yardstick, used among other things to discover “dark energy,” the unknown force that is accelerating the expansion of the universe. If type Ia supernovae originate as paired white dwarfs, their brightness might not be as consistent as was thought—and they might be less reliable as standard candles.

If type Ia supernovae are not reliable standard candles, then the Nobel Prize-winning results that discovered dark energy in the late 1990s are junk, the evidence used to discover it simply unreliable. Dark energy might simply not exist.

What galls me about this possibility is that it was always on the table. The certainty in the 1990s about using type Ia supernovae as standard candles to determine distance was entirely unjustified. Even now astronomers do not really know what causes these explosions. To assume they always exhibit the same energy release was just not reasonable.

And yet astronomers in the 1990s did, and thus they foisted the theory of dark energy upon us — that the universe’s expansion was accelerating over vast distances — while winning Nobel Prizes. They still might be right, and dark energy might exist, but it was never very certain, and still is not.

Much of the fault in this does not lie with the astronomers, but with the press, which always likes to sell new theories as certainties, scoffing at the doubts and areas of ignorance that make those theories questionable. This is just one more example, of which I can cite many, the worst of all being the reporting about global warming.

Universe’s expansion rate found to differ in different directions

The uncertainty of science: Using data from two space telescopes, astronomers have found that the universe’s expansion rate appears to differ depending on the direction you look.

This latest test uses a powerful, novel and independent technique. It capitalizes on the relationship between the temperature of the hot gas pervading a galaxy cluster and the amount of X-rays it produces, known as the cluster’s X-ray luminosity. The higher the temperature of the gas in a cluster, the higher the X-ray luminosity is. Once the temperature of the cluster gas is measured, the X-ray luminosity can be estimated. This method is independent of cosmological quantities, including the expansion speed of the universe.

Once they estimated the X-ray luminosities of their clusters using this technique, scientists then calculated luminosities using a different method that does depend on cosmological quantities, including the universe’s expansion speed. The results gave the researchers apparent expansion speeds across the whole sky — revealing that the universe appears to be moving away from us faster in some directions than others.

The team also compared this work with studies from other groups that have found indications of a lack of isotropy using different techniques. They found good agreement on the direction of the lowest expansion rate.

More information here.
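
The logic of the method can be sketched in a few lines: the gas temperature predicts a luminosity with no cosmology involved, while converting the measured flux into a luminosity requires a distance, and hence a value of H0. Requiring the two to match yields an H0 for each patch of sky. The code below is my own toy illustration, with an invented scaling relation and invented cluster values, not the researchers' pipeline:

```python
import math

C_KM_S = 299_792.458     # speed of light in km/s
CM_PER_MPC = 3.0857e24   # centimeters per megaparsec

def luminosity_from_temperature(t_kev: float) -> float:
    """Toy L-T scaling relation (invented normalization), cosmology-free."""
    return 1.0e44 * (t_kev / 5.0) ** 3  # erg/s

def h0_from_cluster(t_kev: float, flux: float, z: float) -> float:
    """Find the H0 that makes flux + distance match the L-T prediction."""
    l_pred = luminosity_from_temperature(t_kev)
    d_cm = math.sqrt(l_pred / (4 * math.pi * flux))  # from L = 4*pi*d^2*F
    d_mpc = d_cm / CM_PER_MPC
    return C_KM_S * z / d_mpc  # low-redshift approximation: d = c*z/H0

# Invented cluster: 5 keV gas at z = 0.05 with flux in erg/s/cm^2.
print(f"{h0_from_cluster(5.0, 1.8e-11, 0.05):.1f} km/s/Mpc")  # ~70
```

Repeating that exercise cluster by cluster across the sky is what revealed the directional differences.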

The other research mentioned in the last paragraph in the quote above describes results posted here in December. For some reason that research did not get the publicity of today’s research, possibly because it had not yet been confirmed by others. It now has.

What this research tells us, most of all, is that dark energy, the mysterious force that is theorized to cause the universe’s expansion rate to accelerate — not slow down as you would expect — might not exist.

Update: I’ve decided to embed, below the fold, the very clear explanatory video made by one of the scientists doing that other research. Very helpful in explaining this very knotty science.

Animal life thriving in Fukushima radioactive zone

The uncertainty of science: Despite fears that the radioactivity released from the Fukushima nuclear accident would make life difficult if not impossible within the 80-mile radius exclusion zone surrounding the reactor, animals are thriving there, in large unexpected numbers.

Now, nearly a decade after the nuclear accident, the wildlife populations appear to be thriving. Animals are most abundant in areas still devoid of humans, with more than 20 species captured in the UGA’s camera study.

Particular species that often find themselves in conflict with humans, especially Fukushima’s wild boar, were most often photographed in human-evacuated areas. Without the threat of humankind, wildlife is flourishing. In the years since the nuclear accident, Japan’s wild boar seems to have taken over abandoned farmland — even moving into abandoned homes. The government hired boar hunters to cull the population prior to re-opening parts of the original exclusion zone in 2017.

This phenomenon has happened before. Life inside the Chernobyl exclusion zone in Ukraine became an accidental wildlife preserve after humans left following the nuclear disaster there in April 1986. [emphasis mine]

This story, and that of Chernobyl, does not prove that radioactivity is harmless. Not at all. What it shows is that we know diddly-squat about its effects on life. For example, one study has shown that one species of monkey at Fukushima has shrunk in weight and size, yet its population has flourished nonetheless.

New evidence: dark energy might not exist

The uncertainty of science: New evidence once again suggests that the assumptions that resulted in the invention of dark energy in the late 1990s might have been in error, and that dark energy simply might not exist.

New observations and analysis made by a team of astronomers at Yonsei University (Seoul, South Korea), together with their collaborators at Lyon University and KASI, show, however, that this key assumption is most likely in error. The team has performed very high-quality (signal-to-noise ratio ~175) spectroscopic observations to cover most of the reported nearby early-type host galaxies of SN Ia, from which they obtained the most direct and reliable measurements of population ages for these host galaxies. They find a significant correlation between SN luminosity and stellar population age at a 99.5% confidence level. As such, this is the most direct and stringent test ever made for the luminosity evolution of SN Ia. Since SN progenitors in host galaxies are getting younger with redshift (look-back time), this result inevitably indicates a serious systematic bias with redshift in SN cosmology. Taken at face values, the luminosity evolution of SN is significant enough to question the very existence of dark energy. When the luminosity evolution of SN is properly taken into account, the team found that the evidence for the existence of dark energy simply goes away.

…Other cosmological probes, such as CMB (Cosmic Microwave Background) and BAO (Baryonic Acoustic Oscillations), are also known to provide some indirect and “circumstantial” evidence for dark energy, but it was recently suggested that CMB from Planck mission no longer supports the concordance cosmological model which may require new physics. Some investigators have also shown that BAO and other low-redshift cosmological probes can be consistent with a non-accelerating universe without dark energy. In this respect, the present result showing the luminosity evolution mimicking dark energy in SN cosmology is crucial and is very timely.

There was also this story from early December, also raising questions about the existence of dark energy.

Bottom line: The data that suggested dark energy’s existence was always shallow with many assumptions and large margins of uncertainty. This research only underlines that fact, a fact that many cosmologists have frequently tried to sweep under the rug.

Dark energy still might exist, but it behooves scientists to look coldly at the data and always recognize its weaknesses. It appears that in terms of dark energy the cosmological community is finally beginning to do so.

New analysis suggests dark energy might not be necessary

The uncertainty of science: A new peer-reviewed paper in a major astronomy science journal suggests that dark energy might not actually exist, and that the evidence for it might simply be because the original data was biased by the Milky Way’s own movement.

What [the scientists in this new paper] found is that the best fit to the data is that the redshift of supernovae is not the same in all directions, but that it depends on the direction. This direction is aligned with the direction in which we move through the cosmic microwave background. And – most importantly – you do not need further redshift to explain the observations.

If what they say is correct, then it is unnecessary to postulate dark energy which means that the expansion of the universe might not speed up after all.

Why didn’t Perlmutter and Riess [the discoverers of dark energy] come to this conclusion? They could not, because the supernovae that they looked [at] were skewed in direction. The ones with low redshift were in the direction of the CMB dipole; and high redshift ones away from it. With a skewed sample like this, you can’t tell if the effect you see is the same in all directions.
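
The geometry at issue is easy to sketch: our motion through the cosmic microwave background adds a small, direction-dependent Doppler shift to every supernova's redshift, so a sample that is lopsided on the sky can mimic a cosmological effect. A toy model of my own, not the paper's analysis:

```python
import math

C_KM_S = 299_792.458  # speed of light in km/s

def observed_redshift(z_cosmological: float, v_km_s: float,
                      theta_rad: float) -> float:
    """Cosmological redshift modified by the observer's motion, where theta
    is the angle between the supernova and the direction of that motion."""
    z_peculiar = (v_km_s / C_KM_S) * math.cos(theta_rad)
    return (1 + z_cosmological) * (1 + z_peculiar) - 1

V_CMB = 370.0  # km/s, roughly our speed relative to the CMB
print(f"{observed_redshift(0.05, V_CMB, 0.0):.5f}")      # toward the dipole
print(f"{observed_redshift(0.05, V_CMB, math.pi):.5f}")  # away from it
```

Small as that shift is, it is not negligible at low redshift, which is where the skew in the original supernova sample mattered most.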

The link is to a blog post by a physicist in the field, commenting on the new paper. Below the fold I have embedded a video from that same physicist that does a nice job of illustrating what she wrote.

This paper does not disprove dark energy. It instead illustrates the large uncertainties involved, as well as showing solid evidence that the present consensus favoring the existence of dark energy should be questioned.

But then, that’s how real science works. When the data is sketchy or thin, with many assumptions, it is essential that everyone, especially the scientists in the field, question the results. We shall see now if the physics community will do this.

Hat tip to reader Mike Nelson.


Astronomers find 19 more galaxies showing lack of dark matter

The uncertainty of science: Astronomers have discovered 19 more dwarf galaxies, now totaling 23, that appear to have significant deficits of dark matter.

Of 324 dwarf galaxies analyzed, 19 appear to be missing similarly large stores of dark matter. Those 19 are all within about 500 million light-years of Earth, and five are in or near other groups of galaxies. In those cases, the researchers note, perhaps their galactic neighbors have somehow siphoned off their dark matter. But the remaining 14 are far from other galaxies. Either these oddballs were born different, or some internal machinations such as exploding stars have upset their balance of dark matter and everyday matter, or baryons.

It may not be a case of missing dark matter, says James Bullock, an astrophysicist at the University of California, Irvine. Instead, maybe these dwarf galaxies have clung to their normal matter — or even stolen some — and so “have too many baryons.” Either way, he says, “this is telling us something about the diversity of galaxy formation…. Exactly what that’s telling us, that’s the trick.”

Since we do not know what dark matter is to begin with, finding galaxies lacking it only makes it more difficult to create a theory to explain it. Something causes most galaxies to rotate faster than they should, based on their visible mass. What that is remains unknown.

New estimate for Hubble constant differs from previous, already conflicting, results

The uncertainty of science: Using gravitational lensing effects, scientists have measured a new estimate for the Hubble constant, the rate at which the universe is expanding, and have come up with a number that differs from previous results.

Using adaptive optics technology on the W.M. Keck telescopes in Hawaii, they arrived at an estimate of 76.8 kilometers per second per megaparsec. As a parsec is a bit over 30 trillion kilometers and a megaparsec is a million parsecs, that is an excruciatingly precise measurement. In 2017, the H0LICOW team published an estimate of 71.9, using the same method and data from the Hubble Space Telescope.

The new SHARP/H0LICOW estimates are comparable to that by a team led by Adam Riess of Johns Hopkins University, 74.03, using measurements of a set of variable stars called the Cepheids. But it’s quite a lot different from estimates of the Hubble constant from an entirely different technique based on the cosmic microwave background. That method, based on the afterglow of the Big Bang, gives a Hubble constant of 67.4, assuming the standard cosmological model of the universe is correct.

An estimate by Wendy Freedman and colleagues at the University of Chicago comes close to bridging the gap, with a Hubble constant of 69.8 based on the luminosity of distant red giant stars and supernovae.

So five different teams have come up with five different numbers, ranging from 67.4 to 76.8 kilometers per second per megaparsec. Based on the present understanding of cosmology, however, the range should have been far smaller. By now physicists had expected these different results to converge. The differences suggest that either their theories are wrong, or their methods of measurement are flawed.

The most likely explanation is that we presently have too little knowledge about the early universe to form any solid theories. These measurements are based on a very tiny amount of data and require a lot of assumptions.

New data cuts neutrino mass in half

The uncertainty of science: New data now suggests that the highest mass possible for the neutrino is about half the previous estimates.

At the 2019 Topics in Astroparticle and Underground Physics conference in Toyama, Japan, leaders from the KATRIN experiment reported Sept. 13 that the estimated range for the rest mass of the neutrino is no larger than about 1 electron volt, or eV. These inaugural results obtained earlier this year by the Karlsruhe Tritium Neutrino experiment — or KATRIN — cut the mass range for the neutrino by more than half by lowering the upper limit of the neutrino’s mass from 2 eV to about 1 eV. The lower limit for the neutrino mass, 0.02 eV, was set by previous experiments by other groups.

This new upper limit does not tell us what the neutrino actually weighs; it only narrows the range of possible masses.

Two new science papers strongly question theory of man-made global warming

The uncertainty of science: Two new science papers, from researchers in Finland and Japan respectively, both strongly question the theory that human activity and the increase of carbon dioxide are causing global warming.

From the Finnish paper’s [pdf] conclusion:

We have proven that the [climate]-models used in IPCC report AR5 cannot compute correctly the natural component included in the observed global temperature. The reason is that the models fail to derive the influences of low cloud cover fraction on the global temperature. A too small natural component results in a too large portion for the contribution of the greenhouse gases like carbon dioxide. That is why IPCC represents the climate sensitivity more than one order of magnitude larger than our sensitivity 0.24°C. Because the anthropogenic portion in the increased CO2 is less than 10 %, we have practically no anthropogenic climate change. The low clouds control mainly the global temperature. [emphasis mine]

From the Japanese paper:

“The Intergovernmental Panel on Climate Change (IPCC) has discussed the impact of cloud cover on climate in their evaluations, but this phenomenon has never been considered in climate predictions due to the insufficient physical understanding of it,” comments Professor Hyodo. “This study provides an opportunity to rethink the impact of clouds on climate. When galactic cosmic rays increase, so do low clouds, and when cosmic rays decrease clouds do as well, so climate warming may be caused by an opposite-umbrella effect. The umbrella effect caused by galactic cosmic rays is important when thinking about current global warming as well as the warm period of the medieval era.”

Essentially, both criticize the climate models for not considering changes in cloud cover and how those changes affect the global climate. The first paper looks back at the known climate data, compares it with known changes in cloud cover, and finds that cloud cover is a major factor in temperature changes.

The second paper looks at the causes for some of the changes in cloud cover, noting how the increase in galactic cosmic rays during the solar minimum can be tied to an increase in cloud cover, and thus colder temperatures.

Do these papers disprove man-made global warming caused by the increase in carbon dioxide in the atmosphere? Of course not. They just demonstrate again that the science here is very unsettled, that there are many large gaps in our knowledge, and that it would be foolish now to abandon western civilization and replace it with socialist totalitarian rule in order to prevent a disaster that either might not be happening, or if it is we may have no power to control.

I want to also point out that this post talks about scientists challenging the theory of man-made global warming. Attention must be paid to their conclusions. As for the ignorant opinions of politicians on this subject, who cares?

Nearly 400 medical procedures found to be ineffective

The uncertainty of science: A new review of the science literature has found almost 400 studies showing that the medical procedure, device, or medication being tested was ineffective.

The findings are based on more than 15 years of randomised controlled trials, a type of research that aims to reduce bias when testing new treatments. Across 3,000 articles in three leading medical journals from the UK and the US, the authors found 396 reversals.

While these were found in every medical discipline, cardiovascular disease was by far the most commonly represented category, at 20 percent; it was followed by preventative medicine and critical care. Taken together, it appears that medication was the most common reversal at 33 percent; procedures came in second at 20 percent, and vitamins and supplements came in third at 13 percent.

A reversal means that the study found the procedure, device, or medicine to be ineffective.

If you have medical issues it is worth reviewing the research itself. You might find that some of the medical treatment you are getting is irrelevant, and could be discontinued.

New analysis throws wrench in formation theory of spirals in galaxies

The uncertainty of science: A new analysis of over 6000 galaxies suggests that a long-held model for the formation of spirals in galaxies is wrong.

[Edwin] Hubble’s model soon became the authoritative method of classifying spiral galaxies, and is still used widely in astronomy textbooks to this day. His key observation was that galaxies with larger bulges tended to have more tightly wound spiral arms, lending vital support to the ‘density wave’ model of spiral arm formation.

Now though, in contradiction to Hubble’s model, the new work finds no significant correlation between the sizes of the galaxy bulges and how tightly wound the spirals are. This suggests that most spirals are not static density waves after all.

Essentially, we still have no idea why spirals form in galaxies.
