Sunspot update: December sunspot activity once again higher than predicted

The uncertainty of science: It is time to once again take a look at the state of the Sun’s ongoing sunspot cycle. Below is NOAA’s January 1, 2021 monthly graph, documenting the Sun’s monthly sunspot activity and annotated by me to show previous solar cycle predictions.

The ramp up to solar maximum continued in December. Though there was a drop from the very high activity seen in November, the number of sunspots in December still far exceeded the prediction as indicated by the red curve.

» Read more

New data makes past nova too bright, but not bright enough to be supernova

The uncertainty of science: Astronomers, using new data from the Gemini North ground-based telescope, have found that a star that brightened in 1670 and was labeled a nova is much farther away than previously thought, which means the 1670 eruption was far too powerful for a nova, yet not powerful enough to have been a supernova.

By measuring both the speed of the nebula’s expansion and how much the outermost wisps had moved during the last ten years, and accounting for the tilt of the nebula on the night sky, which had been estimated earlier by others, the team determined that CK Vulpeculae lies approximately 10,000 light-years distant from the Sun — about five times as far away as previously thought. That implies that the 1670 explosion was far brighter, releasing roughly 25 times more energy than previously estimated [4]. This much larger estimate of the amount of energy released means that whatever event caused the sudden appearance of CK Vulpeculae in 1670 was far more violent than a simple nova.

“In terms of energy released, our finding places CK Vulpeculae roughly midway between a nova and a supernova,” commented Evans. “[T]he cause — or causes — of the outbursts of this intermediate class of objects remain unknown. I think we all know what CK Vulpeculae isn’t, but no one knows what it is.”

Recent research has also suggested that the eruption was caused not by the interaction of a binary system containing a normal star and a white dwarf, as believed for decades, but possibly by a binary containing a brown dwarf, a red giant, or two normal stars. All remain possible; none, however, has been confirmed.
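The geometry behind this kind of distance estimate is simple enough to sketch. Below is a rough Python illustration with invented numbers (the press release does not give the raw values): spectra yield the ejecta’s expansion speed, a decade of imaging yields the angular rate at which the wisps crawl across the sky, and after a correction for the nebula’s tilt the ratio of the two is a distance.

```python
import math

# Toy expansion-parallax estimate. All numbers are invented for illustration;
# they are not the team's actual measurements.
v_ejecta_km_s = 210.0     # hypothetical expansion speed from Doppler shifts
tilt_deg = 25.0           # assumed tilt of the nebula relative to the sky plane
mu_mas_per_yr = 14.0      # hypothetical angular motion of the outer wisps

# Speed projected onto the plane of the sky, converted to km per year.
v_sky_km_yr = v_ejecta_km_s * math.cos(math.radians(tilt_deg)) * 3.156e7
# Angular rate converted from milliarcseconds per year to radians per year.
mu_rad_per_yr = mu_mas_per_yr * math.pi / (180 * 3600 * 1000)

distance_km = v_sky_km_yr / mu_rad_per_yr
print(f"distance ~ {distance_km / 9.461e12:,.0f} light-years")
# The knock-on effect: the observed flux is fixed, so the inferred luminosity
# scales as distance squared. Five times farther means twenty-five times more
# energy released, exactly the revision described above.
```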

Hubble sees too much infrared energy from gamma ray burst

The uncertainty of science: During a short gamma ray burst (GRB) observed in a distant galaxy in May, astronomers were baffled when measurements from the Hubble Space Telescope detected ten times more near-infrared energy than predicted for this type of GRB.

GRBs fall into two classes. First there are the long bursts, which are thought to form from the collapse of a massive star into a black hole, resulting in a powerful supernova and GRB. Second there are the short bursts, which scientists think occur when two neutron stars merge.

The problem with this GRB is that though it was short and somewhat similar to other short GRBs across most wavelengths, in the near infrared Hubble detected far too much energy.

“These observations do not fit traditional explanations for short gamma-ray bursts,” said study leader Wen-fai Fong of Northwestern University in Evanston, Illinois.

…Fong and her team have discussed several possibilities to explain the unusual brightness that Hubble saw. While most short gamma-ray bursts probably result in a black hole, the two neutron stars that merged in this case may have combined to form a magnetar, a supermassive neutron star with a very powerful magnetic field. “You basically have these magnetic field lines that are anchored to the star that are whipping around at about a thousand times a second, and this produces a magnetized wind,” explained Laskar. “These spinning field lines extract the rotational energy of the neutron star formed in the merger, and deposit that energy into the ejecta from the blast, causing the material to glow even brighter.”

What is intriguing about their theory is that this merger of two neutron stars simply resulted in a larger neutron star, not a black hole. This new neutron star was also a magnetar and pulsar, but unlike a black hole, it was a still-visible physical object. And yet its creation in this GRB produced far more near-infrared energy than expected.

When GRBs were first discovered, I was always puzzled why so many astronomers seemed to insist there must be a single explanation for them. With time, when two classes of GRBs were discovered, this assumption was then replaced with the equally puzzling insistence that only two types of events explained them.

It seemed to me that such explosions had too many potential variables, and could easily have a wide range of causes, though all related to the destruction or merger of massive stars. As the data continues to accumulate, this appears increasingly to be the case.

Midnight repost: A scientist’s ten commandments

The tenth anniversary retrospective of Behind the Black continues: The post below, from September 27, 2010, reports on one of the simplest but most profound scientific papers I have ever read. Its advice is doubly needed today, especially commandment #3.

————————–
A scientist’s ten commandments

Published today on the astro-ph website, this preprint by Ignacio Ferrín of the Center for Fundamental Physics at the University of the Andes, Merida, Venezuela, is probably the shortest paper I have ever seen. I think that Dr. Ferrín will forgive me if I reprint it here in its entirety:

1. Go to your laboratory or your instrument without any pre-conceived ideas. Just register what you saw faithfully.

2. Report promptly and scientifically. Check your numbers twice before submitting.

3. Forget about predictions. They are maybe wrong.

4. Do not try to conform or find agreement with others. You may be the first to be observing a new phenomenon and you may risk missing credit for the discovery.

5. Criticism must be scientific, respectful, constructive, positive, and unbiased. Otherwise it must be done privately.

6. If you want to be respected, respect others first. Do not use insulting or humiliating words when referring to others. It is not in accord with scientific ethics.

7. Do not cheat. Cheating in science is silly. When others repeat your experiment or observation, they will find that you were wrong.

8. If you do not know or have made a mistake, admit it immediately. You may say, “I do not know but I will find out.” or “I will correct it immediately.” No scientist knows the answer to everything. By admitting it you are being honest about your knowledge and your abilities.

9. Do not appropriate or ignore other people’s work or results. Always give credit to others, however small their contribution may have been. Do not do unto others what you would not like to be done unto you.

10. Do not stray from scientific ethics.

It seems that some scientists in the climate field (Phil Jones of East Anglia University and Michael Mann of Pennsylvania State University are two that come to mind immediately) would benefit by reading and following these rules.

Midnight repost: The absolute uncertainty of climate science

The tenth anniversary retrospective of Behind the Black continues: Tonight’s repost adds more weight to yesterday’s about the uncertainty of any model predicting global warming. Rather than look at the giant gaps in our knowledge, this essay, posted on January 28, 2019, looked at the data tampering that government scientists are doing to their global temperature databases in order to make the past appear cooler and the present appear warmer.

——————————-
The absolute uncertainty of climate science

Even as the United States is being plunged right now into an epic cold spell (something that has been happening repeatedly for almost all the winters of the past decade), and politicians continue to rant about the coming doom due to global warming, none of the data gives anyone the right to make any claims about the future global climate, in any direction.

Why do I feel so certain I can make this claim of uncertainty? Because the data simply isn’t there. And where we do have it, it has been tampered with so badly it is no longer very trustworthy. This very well documented post by Tony Heller proves this reality, quite thoroughly.

First, prior to the late 20th century, we simply have no good, reliable climate data for the southern hemisphere. Any statement by anyone claiming to know with certainty what the global temperature was prior to 1978 (when the first Nimbus climate satellite was launched) should be treated with some skepticism. Take a look at all the graphs Heller posts, all from reputable science sources, all confirming my own essay on this subject from 2015. The only regions where temperatures were thoroughly measured prior to satellite data were the United States, Europe, and Japan. There are scattered data points elsewhere, but not many, with none in the southern oceans. And while we do have a great deal of proxy data that provides some guidance as to the global temperature prior to the space age, strongly suggesting there was a global warm period around the year 1000 AD, and a global cold period around 1600 AD, this data also has a lot of uncertainty, so it is entirely reasonable to express some skepticism about it.

Second, the data in those well-covered regions have been tampered with extensively, and always in a manner that reinforces the theory of global warming. Actual temperature readings have been adjusted everywhere, always to cool the past and warm the present. As Heller notes,
» Read more

Midnight repost: The uncertainty of climate science

The tenth anniversary retrospective of Behind the Black continues: Tonight’s repost, from 2015, can be considered a follow-up to yesterday’s. While many global warming activists are absolutely certain the climate is warming — to the point of considering murder of their opponents a reasonable option — the actual available data is so far from certain as to be almost ludicrous.

——————————-
The uncertainty of climate science

For the past five years, I have been noting on this webpage the large uncertainties that still exist in the field of climate science. Though we have solid evidence of an increase of carbon dioxide in the atmosphere, we also have no idea what the consequences of that increase are going to be. It might cause the atmosphere to warm, or it might not. It might harm the environment, or it might instead spur plant life growth that will invigorate it. The data remains inconclusive. We really don’t even know if the climate is truly warming, and even if it is, whether CO2 is causing that warming.

While government scientists at NASA and NOAA are firmly in the camp that claims increasing carbon dioxide will cause worldwide disastrous global warming, their own data, when looked at coldly, reveals that they themselves don’t have sufficient information to make that claim. In fact, they don’t even have sufficient information to claim they know whether the climate is warming or cooling! My proof? Look at the graph below, produced by NOAA’s own National Centers for Environmental Information.
» Read more

Astronomers claim discovery of six exomoons

The uncertainty of science: Astronomers are now claiming they have detected evidence of the existence of six exomoons orbiting different stars with transiting exoplanets.

“These exomoon candidates are so small that they can’t be seen from their own transits. Rather, their presence is given away by their gravitational influence on their parent planet,” Wiegert said.

If an exoplanet orbits its star undisturbed, the transits it produces occur precisely at fixed intervals.

But for some exoplanets, the timing of the transits is variable, sometimes occurring several minutes early or late. Such transit timing variations – known as TTVs – indicate the gravity of another body. That could mean an exomoon or another planet in the system is affecting the transiting planet.

What they have basically done is apply the technique used to identify exoplanet candidates when the planet does NOT transit the star (the wobble caused by gravity and indicated by spectral changes), and look to see whether the same kind of gravitationally induced variations show up in the transit timings of these exoplanets.
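To make the timing idea concrete, here is a minimal sketch of a TTV search, with invented numbers: fit a straight-line ephemeris to the observed mid-transit times and look for coherent structure in the residuals.

```python
import numpy as np

# Toy transit-timing-variation (TTV) search with invented numbers. An
# undisturbed planet transits at t_n = t0 + n * P exactly; a moon (or another
# planet) makes the observed times wander around that line by minutes.
epochs = np.arange(8)
period_days = 3.5
t0 = 2455000.0   # arbitrary reference epoch (Julian date)

# Fake observed mid-transit times: a linear ephemeris plus a small wobble.
observed = t0 + epochs * period_days + 0.003 * np.sin(2 * np.pi * epochs / 5.0)

# Fit the best straight-line ephemeris and inspect the residuals.
slope, intercept = np.polyfit(epochs, observed, 1)
residuals_minutes = (observed - (intercept + slope * epochs)) * 24 * 60
print("O-C residuals (minutes):", np.round(residuals_minutes, 2))
# Coherent, minutes-scale structure in these residuals is the TTV signal.
```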

This is fun stuff, but it is so uncertain as to be almost laughable. If you read the press release closely, you will discover that their work has been submitted for publication, but has not yet even been peer reviewed.

Their concept is good, but I would not pay much attention to these “results.”

Astronomers discover giant arc spanning a third of the night sky

Astronomers have discovered a giant arc of hydrogen gas near the Big Dipper that spans a third of the night sky and is thought to be the leftover shockwave from a supernova.

Ultraviolet and narrowband photography have captured the thin and extremely faint trace of hydrogen gas arcing across 30°. The arc, presented at the recent virtual meeting of the American Astronomical Society, is probably the pristine shockwave expanding from a supernova that occurred some 100,000 years ago, and it’s a record-holder for its sheer size on the sky.

Andrea Bracco (University of Paris) and colleagues came upon the Ursa Major Arc serendipitously when looking through the ultraviolet images archived by NASA’s Galaxy Evolution Explorer (GALEX). They were looking for signs of a straight, 2° filament that had been observed two decades ago — but they found out that that length of gas was less straight than they thought, forming instead a small piece of a much larger whole.

This is a great illustration of the uncertainty of science. Earlier observations spotted only 2 degrees of this arc, and thus astronomers thought it was a straight filament. Newer, more sophisticated observations show that this first conclusion was in error: the structure is much bigger, and curved.

I wonder what even more and better observations would reveal.

Rethinking the theories that explain some supernovae

The uncertainty of science: New data now suggests that the previous consensus among astronomers that type Ia supernovae were caused by the interaction of a large red giant star with a white dwarf might be wrong, and that instead the explosion might be triggered by two white dwarfs.

If this new origin theory turns out to be correct, then it might also throw a big wrench into the theory of dark energy.

The evidence that twin white dwarfs drive most, if not all, type Ia supernovae, which account for about 20% of the supernova blasts in the Milky Way, “is more and more overwhelming,” says Dan Maoz, director of Tel Aviv University’s Wise Observatory, which tracks fast-changing phenomena such as supernovae. He says the classic scenario of a white dwarf paired with a large star such as a red giant “doesn’t happen in nature, or quite rarely.”

Which picture prevails has impacts across astronomy: Type Ia supernovae play a vital role in cosmic chemical manufacturing, forging in their fireballs most of the iron and other metals that pervade the universe. The explosions also serve as “standard candles,” assumed to shine with a predictable brightness. Their brightness as seen from Earth provides a cosmic yardstick, used among other things to discover “dark energy,” the unknown force that is accelerating the expansion of the universe. If type Ia supernovae originate as paired white dwarfs, their brightness might not be as consistent as was thought—and they might be less reliable as standard candles.

If type Ia supernovae are not reliable standard candles, then the entire Nobel Prize results that discovered dark energy in the late 1990s are junk, the evidence used to discover it simply unreliable. Dark energy might simply not exist.

What galls me about this possibility is that it was always there. The certainty in the 1990s about using type Ia supernovae as standard candles to determine distance was entirely unjustified. Even now astronomers do not really know what causes these explosions. To assume they always exhibit the same energy release was simply not reasonable.

And yet astronomers in the 1990s did, and thus they foisted the theory of dark energy upon us — that the universe’s expansion was accelerating over vast distances — while winning Nobel Prizes. They still might be right, and dark energy might exist, but it was never very certain, and still is not.

Much of the fault in this does not lie with the astronomers, but with the press, which always likes to sell new theories as a certainty, scoffing at the doubts and areas of ignorance that make the theories questionable. This is just one more example, of which I can cite many, the worst of all being the reporting about global warming.

Universe’s expansion rate found to differ in different directions

The uncertainty of science: Using data from two space telescopes, astronomers have found that the universe’s expansion rate appears to differ depending on the direction you look.

This latest test uses a powerful, novel and independent technique. It capitalizes on the relationship between the temperature of the hot gas pervading a galaxy cluster and the amount of X-rays it produces, known as the cluster’s X-ray luminosity. The higher the temperature of the gas in a cluster, the higher the X-ray luminosity is. Once the temperature of the cluster gas is measured, the X-ray luminosity can be estimated. This method is independent of cosmological quantities, including the expansion speed of the universe.

Once they estimated the X-ray luminosities of their clusters using this technique, scientists then calculated luminosities using a different method that does depend on cosmological quantities, including the universe’s expansion speed. The results gave the researchers apparent expansion speeds across the whole sky — revealing that the universe appears to be moving away from us faster in some directions than others.

The team also compared this work with studies from other groups that have found indications of a lack of isotropy using different techniques. They found good agreement on the direction of the lowest expansion rate.

More information here.
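To make that two-step comparison concrete, here is a rough Python sketch, with invented cluster numbers and an invented normalization for the scaling relation (the real analysis is far more careful):

```python
import math

# Sketch of the cluster-based test. The temperature-based luminosity contains
# no cosmology; the flux-based one needs a distance, here the low-redshift
# approximation d = c*z/H0. Forcing the two to agree yields an effective H0
# per cluster, which can then be binned by direction on the sky.
C_SCALING = 1.0e37   # hypothetical normalization for L_X = C * T^3 (watts)
C_KM_S = 299792.458  # speed of light, km/s

def effective_H0(temperature_keV, flux_W_m2, redshift):
    L_from_T = C_SCALING * temperature_keV ** 3            # cosmology-free
    d_m = math.sqrt(L_from_T / (4 * math.pi * flux_W_m2))  # from L = 4*pi*d^2*F
    d_Mpc = d_m / 3.086e22
    return C_KM_S * redshift / d_Mpc                       # km/s/Mpc

# Two made-up clusters, same temperature and redshift, in different directions:
print(effective_H0(5.0, 2.3e-12, 0.05))  # ~70 km/s/Mpc
print(effective_H0(5.0, 2.0e-12, 0.05))  # ~66 km/s/Mpc
# A systematic offset between patches of sky is the claimed anisotropy signal.
```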

The other research mentioned in the last paragraph of the quote above describes results posted here in December. For some reason that research did not get the publicity of today’s research, possibly because it had not yet been confirmed by others. It now has.

What this research tells us, most of all, is that dark energy, the mysterious force that is theorized to cause the universe’s expansion rate to accelerate — not slow down as you would expect — might not exist.

Update: I’ve decided to embed, below the fold, the very clear explanatory video made by one of the scientists doing that other research. Very helpful in explaining this very knotty science.

Animal life thriving in Fukushima radioactive zone

The uncertainty of science: Despite fears that the radioactivity released from the Fukushima nuclear accident would make life difficult if not impossible within the 80-mile radius exclusion zone surrounding the reactor, animals are thriving there, in large unexpected numbers.

Now, nearly a decade after the nuclear accident, the wildlife populations appear to be thriving. Animals are most abundant in areas still devoid of humans, with more than 20 species captured in the UGA’s camera study.

Particular species that often find themselves in conflict with humans, especially Fukushima’s wild boar, were most often photographed in human-evacuated areas. Without the threat of humankind, wildlife is flourishing. In the years since the nuclear accident, Japan’s wild boar seems to have taken over abandoned farmland — even moving into abandoned homes. The government hired boar hunters to cull the population prior to re-opening parts of the original exclusion zone in 2017.

This phenomenon has happened before. Life inside the Chernobyl exclusion zone in Ukraine became an accidental wildlife preserve after humans left following the nuclear disaster there in April 1986. [emphasis mine]

This story, and that of Chernobyl, does not prove that radioactivity is harmless. Not at all. What it shows is that we know diddly-squat about its effects on life. For example, one study has shown that one species of monkey at Fukushima has shrunk in weight and size, yet has flourished nonetheless.

New evidence: dark energy might not exist

The uncertainty of science: New evidence once again suggests that the assumptions that resulted in the invention of dark energy in the late 1990s might have been in error, and that dark energy simply might not exist.

New observations and analysis made by a team of astronomers at Yonsei University (Seoul, South Korea), together with their collaborators at Lyon University and KASI, show, however, that this key assumption is most likely in error. The team has performed very high-quality (signal-to-noise ratio ~175) spectroscopic observations to cover most of the reported nearby early-type host galaxies of SN Ia, from which they obtained the most direct and reliable measurements of population ages for these host galaxies. They find a significant correlation between SN luminosity and stellar population age at a 99.5% confidence level. As such, this is the most direct and stringent test ever made for the luminosity evolution of SN Ia. Since SN progenitors in host galaxies are getting younger with redshift (look-back time), this result inevitably indicates a serious systematic bias with redshift in SN cosmology. Taken at face values, the luminosity evolution of SN is significant enough to question the very existence of dark energy. When the luminosity evolution of SN is properly taken into account, the team found that the evidence for the existence of dark energy simply goes away.

…Other cosmological probes, such as CMB (Cosmic Microwave Background) and BAO (Baryonic Acoustic Oscillations), are also known to provide some indirect and “circumstantial” evidence for dark energy, but it was recently suggested that CMB from Planck mission no longer supports the concordance cosmological model which may require new physics. Some investigators have also shown that BAO and other low-redshift cosmological probes can be consistent with a non-accelerating universe without dark energy. In this respect, the present result showing the luminosity evolution mimicking dark energy in SN cosmology is crucial and is very timely.

There was also this story from early December, raising similar questions about the existence of dark energy.

Bottom line: The data that suggested dark energy’s existence was always shallow with many assumptions and large margins of uncertainty. This research only underlines that fact, a fact that many cosmologists have frequently tried to sweep under the rug.

Dark energy still might exist, but it behooves scientists to look coldly at the data and always recognize its weaknesses. It appears that, in the case of dark energy, the cosmological community is finally beginning to do so.

New analysis suggests dark energy might not be necessary

The uncertainty of science: A new peer-reviewed paper in a major astronomy science journal suggests that dark energy might not actually exist, and that the evidence for it might simply be because the original data was biased by the Milky Way’s own movement.

What [the scientists in this new paper] found is that the best fit to the data is that the redshift of supernovae is not the same in all directions, but that it depends on the direction. This direction is aligned with the direction in which we move through the cosmic microwave background. And – most importantly – you do not need further redshift to explain the observations.

If what they say is correct, then it is unnecessary to postulate dark energy which means that the expansion of the universe might not speed up after all.

Why didn’t Perlmutter and Riess [the discoverers of dark energy] come to this conclusion? They could not, because the supernovae that they looked at were skewed in direction. The ones with low redshift were in the direction of the CMB dipole; the high redshift ones away from it. With a skewed sample like this, you can’t tell if the effect you see is the same in all directions.

The link is to a blog post by a physicist in the field, commenting on the new paper. Below the fold I have embedded a video from that same physicist that does a nice job of illustrating what she wrote.

This paper does not disprove dark energy. It instead illustrates the large uncertainties involved, as well as showing solid evidence that the present consensus favoring the existence of dark energy should be questioned.

But then, that’s how real science works. When the data is sketchy or thin, with many assumptions, it is essential that everyone, especially the scientists in the field, question the results. We shall see now if the physics community will do this.

Hat tip to reader Mike Nelson.

» Read more

Astronomers find 19 more galaxies showing lack of dark matter

The uncertainty of science: Astronomers have discovered 19 more dwarf galaxies, now totaling 23, that appear to have significant deficits of dark matter.

Of 324 dwarf galaxies analyzed, 19 appear to be missing similarly large stores of dark matter. Those 19 are all within about 500 million light-years of Earth, and five are in or near other groups of galaxies. In those cases, the researchers note, perhaps their galactic neighbors have somehow siphoned off their dark matter. But the remaining 14 are far from other galaxies. Either these oddballs were born different, or some internal machinations such as exploding stars have upset their balance of dark matter and everyday matter, or baryons.

It may not be a case of missing dark matter, says James Bullock, an astrophysicist at the University of California, Irvine. Instead, maybe these dwarf galaxies have clung to their normal matter — or even stolen some — and so “have too many baryons.” Either way, he says, “this is telling us something about the diversity of galaxy formation…. Exactly what that’s telling us, that’s the trick.”

Since we do not know what dark matter is to begin with, finding galaxies lacking it only makes it more difficult to create a theory to explain it. Something causes most galaxies to rotate faster than they should, based on their visible mass. What that is remains an unknown.
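For what it is worth, the inference that created the dark matter problem in the first place fits in a few lines. A sketch with round, illustrative numbers, not tied to any particular galaxy:

```python
# A star on a circular orbit obeys v^2 = G * M(<r) / r, so the rotation speed
# at radius r reveals the total mass enclosed within that radius.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # kg
LY_M = 9.461e15   # meters per light-year

r = 50_000 * LY_M  # 50,000 light-years from the galactic center
v = 220_000.0      # observed rotation speed: 220 km/s, in m/s

M_dynamical = v ** 2 * r / G
print(f"enclosed mass ~ {M_dynamical / M_SUN:.1e} solar masses")
# If the starlight within that radius adds up to far less mass than this,
# either something unseen supplies the difference, or our observations (or
# physics) are missing something.
```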

New estimate for Hubble constant differs from previous and also conflicting results

The uncertainty of science: Using gravitational lensing effects, scientists have measured a new estimate for the Hubble constant, the rate at which the universe is expanding, and have come up with a number that is different from previous results.

Using adaptive optics technology on the W.M. Keck telescopes in Hawaii, they arrived at an estimate of 76.8 kilometers per second per megaparsec. As a parsec is a bit over 30 trillion kilometers and a megaparsec is a million parsecs, that is an excruciatingly precise measurement. In 2017, the H0LICOW team published an estimate of 71.9, using the same method and data from the Hubble Space Telescope.

The new SHARP/H0LICOW estimates are comparable to that by a team led by Adam Riess of Johns Hopkins University, 74.03, using measurements of a set of variable stars called the Cepheids. But it’s quite a lot different from estimates of the Hubble constant from an entirely different technique based on the cosmic microwave background. That method, based on the afterglow of the Big Bang, gives a Hubble constant of 67.4, assuming the standard cosmological model of the universe is correct.

An estimate by Wendy Freedman and colleagues at the University of Chicago comes close to bridging the gap, with a Hubble constant of 69.8 based on the luminosity of distant red giant stars and supernovae.

So five different teams have come up with five different numbers, ranging from 67.4 to 76.8 kilometers per second per megaparsec. Based on the present understanding of cosmology, however, the range should have been far less. By now the physicists had expected these different results to be close to the same. The differences suggest that either their theories are wrong, or their methods of measurement are incorrect.
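To put those units in perspective, here is a quick worked conversion. The inverse of the Hubble constant, the “Hubble time,” is only a rough timescale for the expansion, but it shows what the spread in these numbers means:

```python
# One megaparsec is about 3.086e19 km, so H0 in km/s/Mpc divided by that is a
# rate in 1/s, and its inverse is a timescale in seconds.
KM_PER_MPC = 3.086e19
SECONDS_PER_GYR = 3.156e16

for H0 in (67.4, 69.8, 71.9, 74.03, 76.8):  # km/s/Mpc, the five estimates above
    hubble_time_gyr = KM_PER_MPC / H0 / SECONDS_PER_GYR
    print(f"H0 = {H0:5.2f} -> Hubble time ~ {hubble_time_gyr:.1f} billion years")
# The ~14% spread in H0 maps into a spread of almost two billion years in the
# characteristic timescale of the universe's expansion.
```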

The most likely explanation is that we presently have too little knowledge about the early universe to form any solid theories. These measurements are based on a very tiny amount of data that also require a lot of assumptions.

New data cuts neutrino mass in half

The uncertainty of science: New data now suggests that the highest mass possible for the neutrino is about half the previous estimates.

At the 2019 Topics in Astroparticle and Underground Physics conference in Toyama, Japan, leaders from the KATRIN experiment reported Sept. 13 that the estimated range for the rest mass of the neutrino is no larger than about 1 electron volt, or eV. These inaugural results obtained earlier this year by the Karlsruhe Tritium Neutrino experiment — or KATRIN — cut the mass range for the neutrino by more than half by lowering the upper limit of the neutrino’s mass from 2 eV to about 1 eV. The lower limit for the neutrino mass, 0.02 eV, was set by previous experiments by other groups.

This result does not tell us what the neutrino actually weighs; it only narrows the range of possible masses.

Two new science papers strongly question theory of man-made global warming

The uncertainty of science: Two new science papers, from researchers in Finland and Japan respectively, both strongly question the theory that human activity and the increase of carbon dioxide are causing global warming.

From the Finnish paper’s [pdf] conclusion:

We have proven that the [climate]-models used in IPCC report AR5 cannot compute correctly the natural component included in the observed global temperature. The reason is that the models fail to derive the influences of low cloud cover fraction on the global temperature. A too small natural component results in a too large portion for the contribution of the greenhouse gases like carbon dioxide. That is why IPCC represents the climate sensitivity more than one order of magnitude larger than our sensitivity 0.24°C. Because the anthropogenic portion in the increased CO2 is less than 10 %, we have practically no anthropogenic climate change. The low clouds control mainly the global temperature. [emphasis mine]

From the Japanese paper:

“The Intergovernmental Panel on Climate Change (IPCC) has discussed the impact of cloud cover on climate in their evaluations, but this phenomenon has never been considered in climate predictions due to the insufficient physical understanding of it,” comments Professor Hyodo. “This study provides an opportunity to rethink the impact of clouds on climate. When galactic cosmic rays increase, so do low clouds, and when cosmic rays decrease clouds do as well, so climate warming may be caused by an opposite-umbrella effect. The umbrella effect caused by galactic cosmic rays is important when thinking about current global warming as well as the warm period of the medieval era.”

Essentially, both criticize the climate models for not considering changes in cloud cover and how those affect the global climate. The first paper looks back at the known climate data and compares it with known changes in cloud cover, and finds that cloud cover is a major factor in temperature changes.

The second paper looks at the causes for some of the changes in cloud cover, noting how the increase in galactic cosmic rays during the solar minimum can be tied to an increase in cloud cover, and thus colder temperatures.

Do these papers disprove man-made global warming caused by the increase in carbon dioxide in the atmosphere? Of course not. They just demonstrate again that the science here is very unsettled, that there are many large gaps in our knowledge, and that it would be foolish now to abandon western civilization and replace it with socialist totalitarian rule in order to prevent a disaster that either might not be happening, or, if it is, that we may have no power to control.

I want to also point out that this post talks about scientists challenging the theory of man-made global warming. Attention must be paid to their conclusions. As for the ignorant opinions of politicians on this subject, who cares?

Nearly 400 medical procedures found to be ineffective

The uncertainty of science: A new review of the medical literature has found almost 400 studies showing that the procedure, device, or medication being tested was ineffective.

The findings are based on more than 15 years of randomised controlled trials, a type of research that aims to reduce bias when testing new treatments. Across 3,000 articles in three leading medical journals from the UK and the US, the authors found 396 reversals.

While these were found in every medical discipline, cardiovascular disease was by far the most commonly represented category, at 20 percent; it was followed by preventative medicine and critical care. Taken together, it appears that medication was the most common reversal at 33 percent; procedures came in second at 20 percent, and vitamins and supplements came in third at 13 percent.

A reversal means that the study found the procedure, device, or medicine to be ineffective.

If you have medical issues it is worth reviewing the research itself. You might find that some of the medical treatment you are getting is irrelevant, and could be discontinued.

New analysis throws wrench in formation theory of spirals in galaxies

The uncertainty of science: A new analysis of over 6000 galaxies suggests that a long-held model for the formation of spirals in galaxies is wrong.

[Edwin] Hubble’s model soon became the authoritative method of classifying spiral galaxies, and is still used widely in astronomy textbooks to this day. His key observation was that galaxies with larger bulges tended to have more tightly wound spiral arms, lending vital support to the ‘density wave’ model of spiral arm formation.

Now though, in contradiction to Hubble’s model, the new work finds no significant correlation between the sizes of the galaxy bulges and how tightly wound the spirals are. This suggests that most spirals are not static density waves after all.

Essentially, we still have no idea why spirals form in galaxies.

Three exocomets found circling Beta Pictoris

The uncertainty of science: By analyzing data from the new space telescope TESS, astronomers think they have identified three exocomets orbiting the nearby star Beta Pictoris.

Why do I label this uncertain? Let the scientists themselves illustrate my doubt:

Sebastian Zieba, Master’s student in the team of Konstanze Zwintz at the Institute of Astro- and Particle Physics at the University of Innsbruck, discovered the signal of the exocomets when he investigated the TESS light curve of Beta Pictoris in March this year. “The data showed a significant decrease in the intensity of the light of the observed star. These variations due to darkening by an object in the star’s orbit can clearly be related to a comet,” Sebastian Zieba and Konstanze Zwintz explain the sensational discovery.

The press release provides no other information about why they think this darkening is because of comets rather than exoplanets or some other phenomenon. Based on this alone, I find this report very doubtful and highly speculative.

In related news, astronomers now claim they have detected eighteen more Earth-sized exoplanets in the data produced by Kepler, and they have done so by applying new algorithms to the data.

Large planets tend to produce deep and clear brightness variations of their host stars so that the subtle center-to-limb brightness variation on the star hardly plays a role in their discovery. Small planets, however, present scientists with immense challenges. Their effect on the stellar brightness is so small that it is extremely hard to distinguish from the natural brightness fluctuations of the star and from the noise that necessarily comes with any kind of observation. René Heller’s team has now been able to show that the sensitivity of the transit method can be significantly improved, if a more realistic light curve is assumed in the search algorithm. “Our new algorithm helps to draw a more realistic picture of the exoplanet population in space,” summarizes Michael Hippke of Sonneberg Observatory. “This method constitutes a significant step forward, especially in the search for Earth-like planets.”

This makes sense, but it must be understood that these are only candidate exoplanets, unconfirmed as yet. I would not be surprised if a majority are found to be false positives.
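The idea behind the improved algorithm is easy to demonstrate with a toy matched filter. The sketch below uses invented numbers, and is not the team’s actual code: it injects a shallow, rounded dip into noise and compares a naive box-shaped search template against one matching the true shape.

```python
import numpy as np

# A real transit is not box-shaped: stellar limb darkening rounds its edges.
# A matched filter built from the better shape responds more strongly to the
# same shallow dip buried in noise.
rng = np.random.default_rng(42)
flux = rng.normal(0.0, 5e-4, 400)                  # noisy, mean-zero light curve

shape = np.sin(np.linspace(0, np.pi, 40)) ** 1.5   # rounded, limb-darkened-ish dip
flux[180:220] -= 1e-3 * shape                      # inject a shallow transit

def peak_response(data, dip_shape):
    t = dip_shape / np.linalg.norm(dip_shape)      # unit-norm template
    return np.correlate(data, -t, mode="valid").max()

box = np.ones(40)                                  # naive box-shaped dip
print("box template response:    ", peak_response(flux, box))
print("rounded template response:", peak_response(flux, shape))
# The template that matches the true shape gives the larger response, which is
# why a more realistic light-curve model digs smaller planets out of the noise.
```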

New Horizons data suggests the Kuiper Belt is emptier than previously believed

The uncertainty of science: An analysis of data from New Horizons now suggests a paucity of small objects in the Kuiper Belt.

Using New Horizons data from the Pluto-Charon flyby in 2015, a Southwest Research Institute-led team of scientists have indirectly discovered a distinct and surprising lack of very small objects in the Kuiper Belt. The evidence for the paucity of small Kuiper Belt objects (KBOs) comes from New Horizons imaging that revealed a dearth of small craters on Pluto’s largest satellite, Charon, indicating that impactors from 300 feet to 1 mile (91 meters to 1.6 km) in diameter must also be rare.

I therefore wonder how the objects we do find there formed. The volume of space in the Kuiper Belt is gigantic, and if the larger bodies found so far are the bulk of the objects there, what did they coalesce from? Moreover, it seems unlikely that the few large objects we have found there would have been able to clear the region of small objects.

Overall, this is a fundamental mystery tied directly to how the solar system formed, and illustrates how little we know about that process.

Most popular theorized particle for explaining dark matter now eliminated

The uncertainty of science: The WIMP particle (Weakly Interacting Massive Particle), the most popular theorized particle to explain dark matter, has now been eliminated by experiments.

These experiments have now been ongoing for decades, and have seen no dark matter [WIMPs].

…Theorists can always tweak their models, and have done so many times, pushing the anticipated cross-section down and down as null result after null result rolls in. That’s the worst kind of science you can do, however: simply shifting the goalposts for no physical reason other than your experimental constraints have become more severe. There is no longer any motivation, other than preferring a conclusion that the data rules out, in doing so.

Other theorized but less favored particles could still prove to be dark matter, but the problem, as presently framed, is getting harder and harder to solve.

Dark matter has always been an invention created to explain the too-fast orbital velocities of stars in the outer regions of galaxies. It could very well be, however, that the problem comes not from new physics and a newly contrived particle we can’t see, but from a deficiency in our overall observations of galaxies and what is there, within the constraints of the physics we know now.

Hat tip Mike Buford.

New sky survey uncovers hundreds of thousands of previously unknown galaxies

Galaxies without end: A new radio telescope sky survey has discovered hundreds of thousands of previously unknown galaxies.

This discovery is part of a major release of papers outlining a number of discoveries made by this new sky survey.

I could of course have also subheaded this post “The uncertainty of science.” Wanna bet that even with this discovery we have only seen the tip of the iceberg of the number of galaxies out there?

The unfinished search for the Hubble constant

The uncertainty of science: Scientists continue to struggle in their still-unfinished search to determine the precise expansion rate of the universe, dubbed the Hubble constant in honor of Edwin Hubble, who discovered that expansion.

The problem is, the values obtained from [two different] methods do not agree—a discrepancy cosmologists call “tension.” Calculations from redshift place the figure at about 73 (in units of kilometers per second per megaparsec); the CMB estimates are closer to 68. Most researchers first thought this divergence could be due to errors in measurements (known among astrophysicists as “systematics”). But despite years of investigation, scientists can find no source of error large enough to explain the gap.

I am especially amused by these numbers. Back in 1995 NASA held a much-touted press conference to announce that new data from the Hubble Space Telescope had finally determined the exact number for the Hubble constant, 80 (using the standard above). The press went hog wild over this now “certain” conclusion, even though other astronomers disputed it, and offered lower numbers ranging from 30 to 65. Astronomer Allan Sandage of the Carnegie Observatories was especially critical of NASA’s certainty, and was duly ignored by most of the press.

In writing my own article about this result, I was especially struck during my phone interview with Wendy Freedman, the lead scientist for Hubble’s results, by her own certainty. When I noted that her data was very slim, the measurements of only a few stars from one galaxy, she pooh-poohed this point. Her result had settled the question!

I didn’t buy her certainty then, and in my article for The Sciences, most appropriately entitled “The Hubble Inconstant,” I made it a point to note Sandage’s doubts. In the end it turns out that Sandage’s proposed number then of between 53 and 65 was a better prediction.

Still, the science for the final number remains unsettled, with the two methods coming up with numbers that differ by a little less than ten percent, and no clear explanation for that difference. Isn’t science wonderful?

No Planet X needed

The uncertainty of science: New computer models now suggest that the orbits of the known Kuiper Belt objects can be explained without the need for the theorized large Planet X.

The weirdly clustered orbits of some far-flung bodies in our solar system can be explained without invoking a big, undiscovered “Planet Nine,” a new study suggests.

The shepherding gravitational pull could come from many fellow trans-Neptunian objects (TNOs) rather than a single massive world, according to the research.

“If you remove Planet Nine from the model, and instead allow for lots of small objects scattered across a wide area, collective attractions between those objects could just as easily account for the eccentric orbits we see in some TNOs,” study lead author Antranik Sefilian, a doctoral student in the Department of Applied Mathematics and Theoretical Physics at Cambridge University in England, said in a statement.

When you think about it, having many, many scattered small objects in the Kuiper Belt makes much more sense than a few giant planets. Out there, it would be difficult for large objects to coalesce from the solar system’s initial accretion disk. The density of material would be too low. However, you might get a lot of small objects from that disk, which once formed would be too far apart to accrete into larger planets.

The use of the term “Planet Nine” by these scientists, however, is somewhat annoying, and that has less to do with Pluto and more to do with how our general understanding of what it means to be a planet has been evolving in the past two decades. There are clearly more than eight planets known in the solar system now. The large moons of the gas giants as well as the larger dwarf planets, such as Ceres, have been shown to have all the complex features of planets. And fundamentally, they are large enough to be spheres, not misshapen asteroids.

Four more gravitational wave detections

The uncertainty of science: The scientists running the LIGO gravitational wave detector have announced the detection of four more gravitational waves, bringing to eleven the total number so far observed.

During the first observing run O1, from September 12, 2015 to January 19, 2016, gravitational waves from three BBH mergers were detected. The second observing run, which lasted from November 30, 2016, to August 25, 2017, yielded a binary neutron star merger and seven additional binary black hole mergers, including the four new gravitational wave events being reported now. The new events are known as GW170729, GW170809, GW170818 and GW170823 based on the dates on which they were detected. With the detection of four additional BBH mergers the scientists learn more about the population of these binary systems in the universe and about the event rate for these types of coalescences.

The observed BBHs span a wide range of component masses, from 7.6 to 50.6 solar masses. The new event GW170729 is the most massive and distant gravitational-wave source ever observed. In this coalescence, which happened roughly 5 billion years ago, an equivalent energy of almost five solar masses was converted into gravitational radiation.

In two BBHs (GW151226 and GW170729) it is very likely that at least one of the merging black holes is spinning. One of the new events, GW170818, detected by the LIGO and Virgo observatories, was very precisely pinpointed in the sky. It is the best localized BBH to date: its position has been identified with a precision of 39 square degrees (195 times the apparent size of the full moon) in the northern celestial hemisphere. [emphasis mine]

The highlighted quote above illustrates the amount of uncertainty here. Though these appear to be gravitational waves, and have been confirmed in multiple ways, the data is very coarse, providing only a limited amount of basic information about each event. This limited information is still very valuable, and certainly advances our understanding of black holes and their formation, but it is important to recognize the limitations of that data.
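As an aside, the energy figure in that quote is easy to check with mass-energy equivalence, E = mc²:

```python
# Five solar masses converted entirely into gravitational radiation:
M_SUN = 1.989e30   # kg
C = 2.998e8        # speed of light, m/s
E = 5 * M_SUN * C ** 2
print(f"E ~ {E:.1e} joules")   # ~9e47 joules, radiated in a fraction of a second
```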

Danish astronomers question gravitational wave detection

The uncertainty of science: A team of Danish astronomers have questioned the gravitational wave detection achieved in the past few years by the LIGO gravitational wave telescopes.

The details are complex and very much in dispute, and the position of these Danish astronomers is very much in the minority, but their doubts have not been dismissed, and illustrate well the best aspects of science. The article also outlines how the physics community and the LIGO scientists have welcomed the skepticism, even as they have doubts about the claims of the Danish astronomers. This is the hallmark of good science, and lends weight to the work at LIGO.

Conflict in Hubble constant increases with new data from Hubble and Gaia

The uncertainty of science: New data from the Hubble Space Telescope and Gaia continues to measure a different Hubble constant for the expansion rate of the universe, when compared with data from the Planck space telescope.

Using Hubble and newly released data from Gaia, Riess’ team measured the present rate of expansion to be 73.5 kilometers (45.6 miles) per second per megaparsec. This means that for every 3.3 million light-years farther away a galaxy is from us, it appears to be moving 73.5 kilometers per second faster. However, the Planck results predict the universe should be expanding today at only 67.0 kilometers (41.6 miles) per second per megaparsec. As the teams’ measurements have become more and more precise, the chasm between them has continued to widen, and is now about 4 times the size of their combined uncertainty.
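To unpack that last sentence: a discrepancy of “4 times the size of their combined uncertainty” is a roughly 4-sigma tension. A quick sketch, using the two central values from the quote and error bars that are my own assumptions for illustration:

```python
import math

h0_local, sigma_local = 73.5, 1.3  # km/s/Mpc, Hubble + Gaia ladder (sigma assumed)
h0_cmb, sigma_cmb = 67.0, 0.9      # km/s/Mpc, Planck prediction (sigma assumed)

combined = math.hypot(sigma_local, sigma_cmb)  # add uncertainties in quadrature
tension = (h0_local - h0_cmb) / combined
print(f"tension ~ {tension:.1f} sigma")        # ~4 sigma with these error bars
```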

The problem really is very simple: We haven’t the faintest idea what is going on. We have some data, but we also have enormous gaps in our knowledge of the cosmos. Moreover, most of our cosmological data is reliant on too many assumptions that could be wrong, or simply in error. And the errors can be tiny and still throw the results off by large amounts.

The one thing that good science and skepticism teaches is humbleness. Do not be too sure of your conclusions. The universe is a large and complex place. It likes to throw curve balls at us, and if we swing too soon we will certainly miss.

Astronomers dispute existence of galaxy without dark matter

The uncertainty of science: A new analysis by astronomers disputes the conclusion of different astronomers earlier this year that they had found a galaxy that lacked any dark matter.

The original paper from March based its stunning claim of a dark-matter-free galaxy on the way clusters of stars moved through the thin, diffuse galaxy called NGC1052–DF2: They appeared to move at exactly the speed Einstein’s equations of general relativity would predict based on the visible matter (so, slower than they would if the galaxy held dark matter).

This new paper on arXiv suggested otherwise: First, the authors pointed out that NGC1052–DF2 was already discovered way back in 1976 and has previously been referred to by three different names: KKSG04, PGC3097693 and [KKS2000]04.

Then, using those names and then finding all the available data on the galaxy, the researchers argued that the researchers from the March paper simply mismeasured the distance between that galaxy and Earth. This means the galaxy is probably much closer to us than the original researchers thought.

Astronomers calculate the mass of galaxies based on the objects’ brightness and distance. If the galaxy examined in the paper is closer to Earth than previously thought, then its dimness means it’s also much less massive than researchers thought. And at the newly calculated, lighter mass, all the other features of the galaxy make a lot more sense, the researchers in the new paper said. Its globular clusters aren’t moving slowly because they’re in some strange dark matter-desert; instead, they’re moving at the regular speed for a very lightweight galaxy, the arXiv authors said.
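The arithmetic behind that rescaling is worth seeing. The measured flux is fixed, and luminosity goes as distance squared, so a closer galaxy must be intrinsically dimmer and therefore less massive. The two distances below are approximate figures from this dispute, about 65 versus 42 million light-years; treat them as illustrative.

```python
# Luminosity inferred from a fixed measured flux scales as distance squared.
d_original = 65.0  # million light-years, roughly the March paper's distance
d_revised = 42.0   # million light-years, roughly the new analysis's distance

rescale = (d_revised / d_original) ** 2
print(f"inferred luminous mass drops to ~{rescale:.0%} of the original estimate")
# At ~42% of the original mass, the globular clusters' speeds no longer demand
# a dark-matter-free galaxy; they fit an ordinary lightweight one.
```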

To put it bluntly, the astronomers don’t have enough solid data to decide this issue one way or the other. Moreover, the dispute indicates once again that the whole dark matter theory itself is based on very limited data with large margins of error. It might be the best theory we’ve got to explain the data we have, but no good scientist takes it too seriously. We just don’t know enough yet.
