Computer models suggest there is no life in Europa’s underground ocean

The uncertainty of science: Several different computer simulations now suggest that the underground ocean inside the Jupiter moon Europa is inert and likely harbors no existing lifeforms.

Byrne and his colleagues constructed computer simulations of Europa’s seafloor, accounting for its gravity, the weight of the overlying ocean, and the pressure of water within the seafloor itself. From the simulations, the team computed the strength of the rocks about 1 kilometer below the seafloor, or the stress required to force faults in the seafloor to slide and expose fresh rock to seawater.

Compared with the stress applied to the seafloor by Jupiter’s gravity and by the convection of material in Europa’s underlying mantle, the rocks composing Europa’s seafloor are at least 10 times as strong, Byrne said. “The take-home message is that the seafloor is likely geologically inert.”

A second computer model also suggested that the moon’s deep magma is not capable of pushing upward into that sea, further reinforcing the first model’s conclusion that the sea is geologically inert, lacking the heat or energy required for life.

Though unconfirmed and uncertain, these results, looked at honestly, make sense. Europa is a very cold world. An underground ocean might exist due to tidal forces imposed by Jupiter, but that dark and sunless ocean is also likely to be very hostile to life. There is simply not enough energy to sustain it.

Like the water imagined to exist at the poles of the Moon, we go to Europa in the hope of finding life, even if that hope is very ephemeral.

The tangled view of astronomers

A protostar in formation
Click for original image.

The uncertainty of science: The picture to the right, cropped, reduced, and sharpened to post here, was taken by the Hubble Space Telescope as part of a survey of young stars surrounded by an edge-on dust disk. From the caption:

FS Tau is a multi-star system made up of FS Tau A, the bright star-like object near the middle of the image, and FS Tau B (Haro 6-5B), the bright object to the far right that is partially obscured by a dark, vertical lane of dust. The young objects are surrounded by softly illuminated gas and dust of this stellar nursery. The system is only about 2.8 million years old, very young for a star system. Our Sun, by contrast, is about 4.6 billion years old.

The blue lines on either side of that vertical dust lane are jets moving out from FS Tau B. The caption says their asymmetrical lengths are likely due to the fact that “mass is being expelled from the object at different rates,” but the difference could just as easily be caused by the angle at which we view this object, making the nearer jet seem longer than the one behind.

That astronomers cannot move around such an object and see it from many angles explains the headline of this post. We can only see astronomical objects from one angle, and when they are complex objects such as this one, a large part of the research problem is disentangling the shapes we see into a coherent picture. Spectroscopy helps a lot, as it provides information about the speed and direction of different parts of the object, but even this can be enormously complicated and difficult to interpret.

Remember these facts when you read news reports about astronomical research. No matter how certain the press release sounds, its certainty is always tempered by many unknowns, some very pedestrian but fundamental.

Hubble and Webb confirm decade-long conflict in universe’s expansion rate

The uncertainty of science: New data from both the Hubble and Webb space telescopes has confirmed Hubble’s previous measurement of the Hubble constant, the rate at which the universe is expanding. The problem is that this number still differs significantly from the expansion rate determined by observations of the cosmic microwave background by the Planck space telescope.

Hubble and Webb come up with an expansion rate of 73 km/s/Mpc, while Planck found 67 km/s/Mpc. Though this difference appears small, the scientists in both groups claim their margins of error are much smaller than that difference, which means both can’t be right.
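As a rough sketch of why these two numbers are irreconcilable, one can express the gap in standard deviations, dividing the difference by the two error bars combined in quadrature. The 1-sigma uncertainties below (1.0 and 0.5 km/s/Mpc) are assumed here for illustration; the published error bars are of this order:

```python
import math

# Two measurements of the Hubble constant (km/s/Mpc).
# The uncertainties are illustrative assumptions, roughly the
# size of the published 1-sigma error bars.
h0_late, err_late = 73.0, 1.0    # Hubble/Webb distance-ladder value
h0_early, err_early = 67.0, 0.5  # Planck CMB value

# Tension in standard deviations: difference over combined error.
tension = abs(h0_late - h0_early) / math.sqrt(err_late**2 + err_early**2)
print(round(tension, 1))  # about 5.4 sigma
```

Any tension above about 5 sigma is, by the physicists’ own conventions, far too large to dismiss as a statistical fluke, which is why both groups cannot be right.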

You can read the paper for these new results here.

The bottom line mystery remains: The data is clearly telling us one of two things: 1) the many assumptions that go into these numbers might be incorrect, explaining the difference, or 2) there is something fundamentally wrong about the Big Bang theory that cosmologists have been promoting for more than a half century as the only explanation for the formation of the universe.

The solution could also be a combination of both. Our data and our theories are wrong.

Sunspot update: Are we now in the next solar maximum?

Time for my monthly update on the Sun’s sunspot activity as it proceeds through its eleven-year sunspot cycle. NOAA has released its update of its monthly graph showing the number of sunspots on the Sun’s Earth-facing hemisphere, and I have posted it below, annotated with further details to provide a larger context.

In December sunspot activity increased slightly for the second month in a row. The number of sunspots for the month was still significantly below the highs seen in the summer, and continues to suggest that the Sun has already entered solar maximum (two years early), and that, like the previous two solar maximums in 2001 and 2013, this maximum will be double-peaked.
» Read more

The uncertainty of science as proven by the Webb Space Telescope

A long, detailed article was released today at Space.com, describing the many contradictions in the data coming back from the Webb Space Telescope that seriously challenge all the theories of cosmologists about the nature of the universe, as well as its beginning in a single Big Bang.

The article is definitely worth reading, but be warned that it treats science as a certainty that should never have such contradictions, as illustrated first by its very headline: “After 2 years in space, the James Webb Space Telescope has broken cosmology. Can it be fixed?”

“Science” isn’t broken in the slightest. All Webb has done is provide new data that does not fit the theories. As physicist Richard Feynman once stated bluntly in teaching students the scientific method,

“It doesn’t make a difference how beautiful your guess is, it doesn’t make a difference how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it’s wrong.”

Cosmologists for decades have been guessing in proposing their theories about the Big Bang, the expansion of the universe, and dark matter, based on only a tiny amount of data that had been obtained with enormous assumptions and uncertainties. It is therefore not surprising (nor was it ever surprising) that Webb has blown holes in their theories.

For example, the article spends a lot of time discussing the Hubble constant, describing how observations using different instruments (including Webb) have come up with two conflicting numbers for it — either 67 or 74 kilometers per second per megaparsec. No one can resolve this contradiction. No theory explains it.

To me the irony is that back in the 1990s, when Hubble made its first good measurements of the Hubble constant, these same scientists were certain that the number Hubble came up with, around 90 kilometers per second per megaparsec, was correct.

They didn’t really understand reality then, and they don’t yet understand it now.

What cosmologists must do is back away from their theories and recognize the vast areas of ignorance that exist. Once that is done, they might have a chance to resolve the conflict between the data obtained and the theories proposed, and come up with new theories that might work (with great emphasis on the word “might”). Complaining about the paradoxes will accomplish nothing.

Sunspot update: The Sun continues to prove that solar scientists understand nothing

With today’s monthly update from NOAA of its graph tracking the number of sunspots on the Sun’s Earth-facing hemisphere, we find that the Sun continues to confound the experts. As I do every month, I have posted this graph below, with additional details to provide the larger context.

In November the sunspot count rose slightly, but remained well below the highs that had occurred through most of the first half of 2023. Yet, despite that continuing reduction in the number of sunspots, the overall amount of activity remains above the prediction of some scientists, and below the prediction of other scientists.
» Read more

Sunspot update: October activity drops almost to predicted levels

NOAA today posted its updated monthly graph tracking the number of sunspots on the Sun’s Earth-facing hemisphere. As I do every month, I have posted this graph below, with several additional details to provide some larger context.

In October the sunspot count dropped so much from September’s activity that the total was, for the first time since the middle of 2021, actually very close to the predicted numbers first put forth by NOAA’s solar science panel in April 2020.

» Read more

The orbits of the nearest stars orbiting the Milky Way’s central black hole are impossible to predict

The uncertainty of science: Using a computer program developed in 2018 that can accurately predict the orbits of more than three interacting objects, scientists have found that the orbits of the 27 stars nearest the Milky Way’s central black hole, Sagittarius A* (pronounced A-star), are impossible to predict after only a very short time.

“Already after 462 years, we cannot predict the orbits with confidence. That is astonishingly short,” says astronomer Simon Portegies Zwart (Leiden University, the Netherlands). He compares it to our solar system, which is no longer predictable with confidence after 12 million years. “So, the vicinity of the black hole is 30,000 times more chaotic than ours, and we didn’t expect that at all. Of course, the solar system is about 20,000 times smaller, contains millions of times less mass, and has only eight relatively light objects instead of 27 massive ones, but, if you had asked me beforehand, that shouldn’t have mattered so much.”

According to the researchers, the chaos emerges each time in roughly the same way. There are always two or three stars that approach each other closely. This causes a mutual pushing and pulling among the stars. This in turn leads to slightly different stellar orbits. The black hole around which those stars orbit is then slightly pushed away, which in turn is felt by all the stars. In this way, a small interaction between two stars affects all 27 stars in the central cluster. [emphasis mine]

To my mind, the quote by the scientist above should be considered the most absurd statement by a scientist ever spoken, except that nowadays scientists make such idiotic statements all the time. To think that such different conditions wouldn’t produce different results suggests a hubris that is astonishing for a person supposedly trained in the scientific method.

Regardless, these results suggest that acquiring an understanding of the dynamics that created these stars is going to be very difficult, if not impossible. The conditions change so rapidly, and in an unpredictable manner, that any theory proposed will be simply guessing.

The New York Times suddenly allows two scientists to admit the Big Bang theory might be wrong

Modern science

The refusal by many in the scientific community to admit there is any uncertainty in science has been best illustrated for decades by the cosmologists who have put together the framework of the standard model for the creation of the universe, centered on the Big Bang, and by their pitchmen in the mainstream press. Since the 1960s any skepticism of this model has generally been treated as equivalent to believing in UFOs, aliens, and the Face on Mars.

Thus, astronomers and astrophysicists did what was necessary to protect their careers. Even if they had great doubts about the standard model and the Big Bang, they generally kept their mouths shut. Meanwhile, our increasingly corrupt press pushed this one explanation for the formation of the universe, treating the cosmologists who pushed it as gods whose every word was the equivalent of an oracle that must never be questioned.

This past weekend the New York Times suddenly admitted to the uncertainty surrounding the Big Bang, and for possibly the first time in decades allowed two scientists to write an op-ed that carefully outlined the problems with the standard model and the Big Bang theory, problems that have existed and been growing since the 1990s but have been pooh-poohed as inconsequential and easily solved. Data from the Webb Space Telescope, however, has made that pooh-poohing more and more difficult, as astrophysicists Adam Frank and Marcelo Gleiser make clear:
» Read more

Webb confirms galaxy as one of the earliest known in the universe

The uncertainty of science: Using the spectroscopic instrument on the Webb Space Telescope, scientists have confirmed that one of the first galaxies found by Webb, dubbed Maisie’s Galaxy after the daughter of one scientist, is one of the earliest known in the universe, existing only 390 million years after the moment cosmologists say the Big Bang happened.

The data also showed that another of these early galaxies spotted by Webb formed not 250 million years after the Big Bang, as first estimated, but one billion years after, a date that better fits the theories about the early universe, based on the nature of this galaxy.

It turns out that hot gas in CEERS-93316 was emitting so much light in a few narrow frequency bands associated with oxygen and hydrogen that it made the galaxy appear much bluer than it really was. That blue cast mimicked the signature Finkelstein and others expected to see in very early galaxies. This is due to a quirk of the photometric method that happens only for objects with redshifts of about 4.9. Finkelstein says this was a case of bad luck. “This was a kind of weird case,” Finkelstein said. “Of the many tens of high redshift candidates that have been observed spectroscopically, this is the only instance of the true redshift being much less than our initial guess.”

Not only does this galaxy appear unnaturally blue, it also is much brighter than our current models predict for galaxies that formed so early in the universe. “It would have been really challenging to explain how the universe could create such a massive galaxy so soon,” Finkelstein said. “So, I think this was probably always the most likely outcome, because it was so extreme, so bright, at such an apparent high redshift.”

This science team is presently using Webb’s spectroscope to study ten early galaxies in order to better determine their ages. Expect more results soon.

Scientists increasingly put politics over uncertainty in their research papers

The modern scientific method

The death of uncertainty in science: According to a paper published this week in the peer-reviewed journal Science, scientists in recent years are increasingly abandoning uncertainty in their research papers and are instead more willing to make claims of absolute certainty without hesitation or even proof.

If this trend holds across the scientific literature, it suggests a worrisome rise of unreliable, exaggerated claims, some observers say. Hedging and avoiding overconfidence “are vital to communicating what one’s data can actually say and what it merely implies,” says Melissa Wheeler, a social psychologist at the Swinburne University of Technology who was not involved in the study. “If academic writing becomes more about the rhetoric … it will become more difficult for readers to decipher what is groundbreaking and truly novel.”

The new analysis, one of the largest of its kind, examined more than 2600 research articles published from 1997 to 2021 in Science, which the team chose because it publishes articles from multiple disciplines. (Science’s news team is independent from the editorial side.) The team searched the papers for about 50 terms such as “could,” “appear to,” “approximately,” and “seem.” The frequency of these hedging words dropped from 115.8 instances per 10,000 words in 1997 to 67.42 per 10,000 words in 2021.

Those numbers represent a decline of more than 40%, a trend that has been clear for decades, first becoming obvious in the climate field. » Read more
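As a quick check of that arithmetic, the drop from 115.8 to 67.42 instances per 10,000 words can be computed directly:

```python
# The decline in hedging-word frequency reported in the study,
# in instances per 10,000 words.
freq_1997 = 115.8
freq_2021 = 67.42

# Fractional decline relative to the 1997 figure.
decline = (freq_1997 - freq_2021) / freq_1997
print(f"{decline:.0%}")  # prints "42%"
```

So the quoted figures actually work out to roughly a 42 percent decline over the 24-year span.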

Astronomers make first radio observations of a key type of supernova

The uncertainty of science: Using a variety of telescopes, astronomers have not only made the first radio observations of a key type of supernova, they have also detected helium in the data, suggesting that this particular example of the type was atypical.

This marks the first confirmed Type Ia supernova triggered by a white dwarf star that pulled material from a companion star with an outer layer consisting primarily of helium; normally, in the rare cases where the material stripped from the outer layers of the donor star could be detected in spectra, this was mostly hydrogen.

Type Ia supernovae are important for astronomers since they are used to measure the expansion of the universe. However, the origin of these explosions has remained an open question. While it is established that the explosion is caused by a compact white dwarf star that somehow accretes too much matter from a companion star, the exact process and the nature of the progenitor is not known. [emphasis mine]

The highlighted sentences are really the most important take-away from this research. Type Ia supernovae were the phenomenon used by cosmologists to detect the unexpected acceleration of the universe’s expansion billions of years ago. That research assumed these supernovae were well understood and consistently produced the same amount of energy and light, no matter how far away they were or the specific conditions which caused them.

This new supernova research illustrates how absurd that assumption was. Type Ia supernovae are produced by the interaction of two stars, either of which could have innumerable unique features. It is therefore unreasonable for a scientist to assume all such supernovae are going to be identical in their output. And yet, that is what the cosmologists did in declaring the discovery of dark energy in the late 1990s.

It is also what the scientists who performed this research do. To quote one of the co-authors: “While normal Type Ia supernovae appear to always explode with the same brightness, this supernova tells us that there are many different pathways to a white dwarf star explosion.”

Forgive me if I remain very skeptical.

Webb spots massive galaxies in the early universe that should not exist at that time

The uncertainty of science: Astronomers using the Webb Space Telescope have identified six galaxies that are far too massive and evolved to have formed so quickly after the Big Bang.

The research, published today in Nature, could upend our model of the Universe and force a drastic rethink of how the first galaxies formed after the Big Bang. “We’ve never observed galaxies of this colossal size, this early on after the Big Bang,” says lead researcher Associate Professor Ivo Labbé from Swinburne University of Technology.

“The six galaxies we found are more than 12 billion years old, only 500 to 700 million years after the Big Bang, reaching sizes up to 100 billion times the mass of our sun. This is too big to even exist within current models.”

You can read the paper here [pdf]. The “current models” Labbé is referring to are all the present theories and data that say the Big Bang occurred 13.7 billion years ago. These galaxies, however, found less than a billion years after that event, would have needed 12 billion years to have accumulated their mass.

If confirmed, these galaxies essentially tell us that the Big Bang is wrong, or very very VERY incomplete, and that all the data found that dates its occurrence 13.7 billion years ago, based on the Hubble constant, must be reanalyzed.

It is also possible these galaxies are actually not galaxies, but a new kind of supermassive black hole able to form very quickly. Expect many scientists who are heavily invested in the Big Bang to push for this explanation. It might be true, but their biases are true also, which means that Webb is presenting us with new data that calls for strong skepticism of all conclusions, across the board.

Chinese scientists detect a fast radio burst that defies the theories

The uncertainty of science: Using their large FAST radio telescope, Chinese scientists revealed this week that they have detected a new fast radio burst (FRB) whose behavior and location does not fit the present tentative theories for explaining these mysterious deep space objects.

The FRB was an exception from the beginning as it flared again and again in observations recorded by the Five-hundred-meter Aperture Spherical radio Telescope (FAST), which nestles among the hills of China’s Guizhou province. The multiple flares put the source among the few percent of FRBs that repeat. But unlike most repeaters, this one doesn’t have any apparent cycle of bursting and quiescence.

“FRB 20190520B is the only persistently repeating fast radio burst known so far, meaning that it has not been seen to turn off,” Li says.

In addition, whatever made the FRB is also emitting a constant buzz of radio waves. Astronomers have found an association with a persistent radio source in only two other FRBs, and for one of these the low-level radio waves seem to come from ongoing star formation in the host galaxy. For FRB 20190520B, though, the radio source is far more compact, and Li’s team thinks the radio waves probably come from the FRB source itself.

The data also suggests the location does not fit the theories, and even suggests that FRBs might not all come from magnetars, as presently proposed.

Neptune’s cooling when it should be warming

Neptune since 2006

The uncertainty of science: Observations of Neptune during the past seventeen years using the Very Large Telescope have shown the planet mostly cooling during this time period, even though Neptune was moving into its summer season.

Astronomers looked at nearly 100 thermal-infrared images of Neptune, captured over a 17-year period, to piece together overall trends in the planet’s temperature in greater detail than ever before. These data showed that, despite the onset of southern summer, most of the planet had gradually cooled over the last two decades. The globally averaged temperature of Neptune dropped by 8 °C between 2003 and 2018.

The astronomers were then surprised to discover a dramatic warming of Neptune’s south pole during the last two years of their observations, when temperatures rapidly rose 11 °C between 2018 and 2020. Although Neptune’s warm polar vortex has been known for many years, such rapid polar warming has never been previously observed on the planet. “Our data cover less than half of a Neptune season, so no one was expecting to see large and rapid changes,” says co-author Glenn Orton, senior research scientist at Caltech’s Jet Propulsion Laboratory (JPL) in the US.

The sequence of photos above shows that change over time. Lower latitudes generally get darker, or cooler, while the south pole suddenly brightens, getting hotter, in 2020.

The scientists have no idea why this has happened, though they have theories, ranging from simple random weather patterns to the influence of the Sun’s sunspot cycle.

New data contradicts accepted standard model of particle physics

The uncertainty of science: After years of analysis, physicists have refined their measurement of the mass of one important subatomic particle, and discovered that its mass violates the accepted standard model of particle physics, threatening to overthrow it entirely.

W bosons are elementary particles that carry the weak force, mediating nuclear processes like those at work in the Sun. According to the Standard Model, their mass is linked to the masses of the Higgs boson and a subatomic particle called the top quark. In a new study, almost 400 scientists on the Collider Detector at Fermilab (CDF) collaboration spent a decade examining 4.2 million W boson candidates collected from 26 years of data at the Tevatron collider. From this treasure trove, the team was able to calculate the mass of the W boson to within 0.01 percent, making it twice as precise as the previous best measurement.

By their calculations, the W boson has a mass of 80,433.5 Mega-electronvolts (MeV), with an uncertainty of just 9.4 MeV either side. That’s within the range of some previous measurements, but well outside that predicted by the Standard Model, which puts it at 80,357 MeV, give or take 6 MeV. That means the new value is off by a whopping seven standard deviations.
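As a rough check of that figure, the seven-sigma claim follows from dividing the difference between the two masses by their uncertainties combined in quadrature (a standard, simplified way of comparing two measurements; the collaboration’s full analysis is more involved):

```python
import math

# CDF measurement vs Standard Model prediction of the W boson
# mass, in MeV, using the values quoted above.
m_cdf, err_cdf = 80433.5, 9.4
m_sm, err_sm = 80357.0, 6.0

# Combine the two uncertainties in quadrature and express the
# difference in standard deviations.
sigma = abs(m_cdf - m_sm) / math.sqrt(err_cdf**2 + err_sm**2)
print(round(sigma, 1))  # about 6.9 sigma
```

That is where the “whopping seven standard deviations” comes from: a 76.5 MeV gap against a combined error of only about 11 MeV.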

Further cementing the anomaly, the W boson mass was also recently measured using data from the Large Hadron Collider, in a paper published in January. That team came to a value of 80,354 MeV (+/- 32 MeV), which is comfortably close to that given by the Standard Model.

Personally, I always take this level of physics with a great deal of skepticism. The data involves a lot of assumptions and uncertainties. That other researchers came up with a different number illustrates this.

Nonetheless, these results could suggest that the standard model, the consensus theory for decades, is either incomplete, or wrong. The former would be more likely, but no possibility should be dismissed. And even if it is wrong, much of that model still works so well that any new model must include large parts of it.

Scientists: Enceladus’ tiger stripes come from underground ocean

The uncertainty of science: Using a new computer model, scientists now think they have shown how, on the Saturn moon Enceladus, pressure from an underground ocean can push water through cracks to produce geysers on the surface.

Rudolph and his colleagues ran a physics-based model to map the conditions that could allow the cracks from the surface to reach the ocean and cause the eruptions. The model accounts for cycles of warming and cooling that last on the scale of a hundred million years, associated with changes in Enceladus’ orbit around Saturn. During each cycle, the ice shell undergoes a period of thinning and a period of thickening. The thickening happens through freezing at the base of the ice shell, which grows downward like the ice on a lake, Rudolph said.

The pressure exerted by this downward-expanding ice on the ocean below is one possible mechanism researchers have proposed to explain Enceladus’ geysers. As the outer ice shell cools and thickens, pressure increases on the ocean underneath because ice has more volume than water. The increasing pressure also generates stress in the ice, which could become pathways for fluid to reach the surface 20-30 kilometers away.

You can read the paper here.

Be warned: This is only a model. Moreover, its conclusions suggest that this mechanism will not work on Jupiter’s moon Europa, which has many planet-wide crack-like features that suggest (though this remains unconfirmed) a bubbling up from below.

Sharpest radio image ever taken of newly discovered space object

The first known Odd Radio Circle (ORC)
Click for full image.

The uncertainty of science: Astronomers, using the MeerKat radio telescope in South Africa, have taken the best radio image yet of a newly discovered type of astronomical object, dubbed whimsically as an odd radio circle (ORC).

The photo to the right is that image. While it is reminiscent of the many planetary nebulae seen in visible light that astronomers have been studying since the 1800s, this weird shape is seen only in radio frequencies, and it is much, much larger.

Odd radio circles are so named because they’re large, circular objects which are bright around the edges at radio wavelengths, but which can’t be seen with optical, infrared or X-ray telescopes – and at this stage, astronomers don’t really know what they are.

And they’re massive – about a million light years across, making them sixteen times larger than our own galaxy. But despite their gargantuan size, the objects are difficult to spot, hiding in plain sight.

Planetary nebulae are generally the size of solar systems.

Have astronomers discovered an asteroid with three moons?

The uncertainty of science: According to a New York Times article published yesterday, astronomers have now discovered an asteroid with three moons, making it the first such asteroid ever discovered.

The asteroid, Electra, was first discovered in 1873, and orbits the Sun in the asteroid belt between Mars and Jupiter. It was already known that it had two moons. The new research thinks it has found a third.

In reading the researchers’ actual paper describing this research, it appears that the Times is spinning the data, making it sound more certain than it is. This new result comes entirely from new software designed to better resolve archival images of the asteroid, and thus carries a lot of uncertainty. From the paper’s conclusion:

[A] lot of uncertainties remain concerning the orbit of S3 [the new asteroid moon]. More data on S2 and S3, as well as a more thorough dynamical study, are necessary to solve the problem of the motion of the satellites of Elektra. However, the discovery of the first quadruple asteroid system slightly opens the way for understanding the mechanisms of the formation of these satellites.

In terms of data processing, S3 is barely visible in the data reduced with the standard pipeline and processed with standard halo removal algorithms and it was missed until now.

The paper repeatedly notes that the orbital data for Electra and its moons is presently “poorly constrained” and needs significant refinement in order to confirm this result.

If true, however, these moons are probably pieces that broke off of Electra at some point in the past, and could provide good information about the long term history of asteroids in the asteroid belt.

Conflict in Hubble constant continues to confound astronomers

The uncertainty of science: In reviewing their measurements of the Hubble constant using a variety of proxy distance tools, such as distant supernovae, astronomers recently announced that their numbers must be right, even though those numbers do not match the Hubble constant measured using completely different tools.

Most measurements of the current acceleration of the universe (called the Hubble constant, or H0) based on stars and other objects relatively close to Earth give a rate of 73 km/s/Mpc. These are referred to as “late-time” measurements [the same as confirmed by the astronomers in the above report]. On the other hand, early-time measurements, which are based on the cosmic microwave background emitted just 380,000 years after the Big Bang, give a smaller rate of 68 km/s/Mpc.

They can’t both be right. Either something is wrong with the standard cosmological model for our universe’s evolution, upon which the early-time measurements rest, or something is wrong with the way scientists are working with late-time observations.

The astronomers are now claiming that their late-time observations must be right, which really means either that something about the present theories of the Big Bang is fundamentally wrong and our understanding of early cosmology is very incomplete, or that everyone’s measurements are faulty.

Based on the number of assumptions used in both measurements, it is not surprising the results don’t match. Some of those assumptions are certainly wrong, but to correct the error will require a lot more data that will only become available when astronomers have much bigger telescopes of all kinds, above the atmosphere and in space. Their present tools on Earth are insufficient for untangling this mystery.

Astronomers discover galaxy with no dark matter

The uncertainty of science: Astronomers have detected a galaxy about 250 million light years away that shows no evidence of any dark matter, a phenomenon that defies the accepted theories about dark matter.

The galaxy in question, AGC 114905, is about 250 million light-years away. It is classified as an ultra-diffuse dwarf galaxy, with the name ‘dwarf galaxy’ referring to its luminosity and not to its size. The galaxy is about the size of our own Milky Way but contains a thousand times fewer stars. The prevailing idea is that all galaxies, and certainly ultra-diffuse dwarf galaxies, can only exist if they are held together by dark matter.
Galaxy AGC 114905

The researchers collected data on the rotation of gas in AGC 114905 for 40 hours between July and October 2020 using the VLA telescope. Subsequently, they made a graph showing the distance of the gas from the center of the galaxy on the x-axis and the rotation speed of the gas on the y-axis. This is a standard way to reveal the presence of dark matter. The graph shows that the motions of the gas in AGC 114905 can be completely explained by just normal matter.

“This is, of course, what we thought and hoped for because it confirms our previous measurements,” says Pavel Mancera Piña. “But now the problem remains that the theory predicts that there must be dark matter in AGC 114905, but our observations say there isn’t. In fact, the difference between theory and observation is only getting bigger.”

The evidence for dark matter in almost all galaxies is the motion of gas and stars in the outer perimeter. Routinely they move faster than expected based merely on visible ordinary matter. To account for the faster speed, astronomers beginning in the late 1950s invented dark matter, an invisible material with a mass sufficient to increase the speeds of objects and gas in the outer regions of galaxies.
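The rotation-curve argument described above can be sketched with a few lines of arithmetic. The galaxy mass and radii below are made-up illustrative numbers, not measurements of AGC 114905: if only the visible matter were present, orbital speed in the outskirts should fall off as v = sqrt(GM/r), whereas observed curves in most galaxies stay flat.

```python
import math

# Expected circular speed from visible (baryonic) mass alone.
# A flat observed curve at large radii, faster than this Keplerian
# decline, is the standard evidence for dark matter.

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30 # kg
KPC = 3.086e19   # meters in one kiloparsec

def keplerian_speed_km_s(mass_solar, radius_kpc):
    """Circular speed implied by the enclosed visible mass alone."""
    v = math.sqrt(G * mass_solar * M_SUN / (radius_kpc * KPC))
    return v / 1000.0  # m/s -> km/s

visible_mass = 5e10  # hypothetical galaxy: 5e10 solar masses visible
for r in (10, 20, 40):  # radii in kpc
    print(f"r = {r:2d} kpc: expected {keplerian_speed_km_s(visible_mass, r):.0f} km/s")
```

Doubling the radius should cut the speed by a factor of sqrt(2); a measured curve that instead stays flat implies unseen mass growing with radius. In AGC 114905 the observed speeds match the visible-matter expectation, which is exactly the puzzle.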

That astronomers are increasingly finding galaxies with no evidence of dark matter, based on rotation speeds, only makes this mystery all the more baffling.

Galaxies in the early universe don’t fit the theories

The uncertainty of science: New data from both the ALMA telescope in Chile and the Hubble Space Telescope about six massive galaxies in the early universe suggest that there are problems and gaps in the presently accepted theories about the universe’s formation.

Early massive galaxies—those that formed in the three billion years following the Big Bang—should have contained large amounts of cold hydrogen gas, the fuel required to make stars. But scientists observing the early Universe with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Hubble Space Telescope have spotted something strange: half a dozen early massive galaxies that ran out of fuel. The results of the research are published today in Nature.

Known as “quenched” galaxies—or galaxies that have shut down star formation—the six galaxies selected for observation from the REsolving QUIEscent Magnified galaxies at high redshift, or REQUIEM, survey are inconsistent with what astronomers expect of the early Universe.

It was expected that the early universe would have lots of that cold hydrogen for making stars. For some galaxies to lack that gas is inexplicable, and raises questions about the assumptions inherent in the theory of the Big Bang. It doesn’t disprove it, it simply makes it harder to fit the facts to the theory, suggesting — as is always the case — that the reality is far more complicated than the theories of scientists.

Scientists: Clay, not liquid water, explains radar data under Martian south icecap

The uncertainty of science: In a new paper scientists claim that clay materials, not liquid water, better explain the radar data obtained by orbiting satellites, which had initially been hypothesized to indicate liquid water lakes under Mars' south polar icecap.

Sub-glacial lakes were first reported in 2018 and caused a big stir because of the potential for habitability on Mars. Astrobiologists and non-scientists were equally attracted to the exciting news. Now, the solution to this question, with great import to the planetary science community, may be much more mundane than bodies of water on Mars.

The strength of this new study is the diversity of techniques employed. “Our study combined theoretical modeling with laboratory measurements and remote sensing observations from The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) instrument on NASA’s Mars Reconnaissance Orbiter. All three agreed that smectites can make the reflections and that smectites are present at the south pole of Mars. It’s the trifecta: measure the material properties, show that the material properties can explain the observation, and demonstrate that the materials are present at the site of the observation,” Smith said.

This paper is only one of several recently that have popped the balloon on the liquid lake theory. Nothing is actually proven, but the weight of evidence is definitely moving away from liquid water under the south pole icecap.

An astrophysicist explains cosmology’s theoretical failures

Link here. The astrophysicist, Paul Sutter, does a very nice job of outlining the conundrum that has been causing astrophysicists to tear their hair out for the past decade-plus.

In the two decades since astronomers discovered dark energy, we’ve come upon a little hitch: Measurements of the expansion rate of the universe (and so its age) from both the CMB [cosmic microwave background] and supernovas have gotten ever more precise, but they’re starting to disagree. We’re not talking much; the two methods are separated by only 10 million or 20 million years in estimating the 13.77-billion-year history of the universe. But we’re operating at such a level of precision that it’s worth talking about.

If anything, this disagreement between two measurements based on data spanning billions of light years — billions in both time and space — is a perfect illustration of the uncertainty of science. Astrophysicists are trying to come up with answers based on data that is quite thin, has many gaps, and carries with it many assumptions. It is therefore actually surprising that these two numbers agree as well as they do.

Sutter, being in the CMB camp, puts most of the blame for this failure on the uncertainty of what we know about supernovae. He could very well be right. The assumptions about supernovae used to measure the expansion rate of the universe are many. There are also a lot of gaps in our knowledge, including the lack of a full understanding of the process that produces supernovae.

Sutter, however, I think puts too much faith in the theoretical conclusions by which the astrophysics community has determined the age of the universe from the CMB. The uncertainties here are just as great. Good scientists should remain skeptical of this as well. Our knowledge of physics is still incomplete. Physicists really don't know all the answers, yet.

In the end, however, Sutter does pin down the biggest problem in cosmology:

The “crisis” is a good excuse to keep writing papers, because we’ve been stumped by dark energy for over two decades, with a lot of work and not much understanding. In a sense, many cosmologists want to keep the crisis going, because as long as it exists, they have something to talk about other than counting down the years to the next big mission.

In other words, the discussion now is sometimes less about science, theories, and cosmology than about funding and career promotion. What a shock!

Scientists successfully predict resumption of bursts from magnetar

The uncertainty of science: Though they have no real idea why it happens, scientists have now successfully predicted the resumption of energetic bursts coming from a magnetar, right on schedule.

The researchers — Grossan and theoretical physicist and cosmologist Eric Linder from SSL and the Berkeley Center for Cosmological Physics and postdoctoral fellow Mikhail Denissenya from Nazarbayev University in Kazakhstan — discovered the pattern last year in bursts from a soft gamma repeater, SGR1935+2154, that is a magnetar, a prolific source of soft or lower energy gamma ray bursts and the only known source of fast radio bursts within our Milky Way galaxy. They found that the object emits bursts randomly, but only within regular four-month windows of time, each active window separated by three months of inactivity.

On March 19, the team uploaded a preprint claiming “periodic windowed behavior” in soft gamma bursts from SGR1935+2154 and predicted that these bursts would start up again after June 1 — following a three month hiatus — and could occur throughout a four-month window ending Oct. 7.

On June 24, three weeks into the window of activity, the first new burst from SGR1935+2154 was observed after the predicted three month gap, and nearly a dozen more bursts have been observed since, including one on July 6.

They made this prediction based on data going back to 2014 that showed the three-month-off/four-month-on pattern.
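The prediction itself is simple arithmetic once the pattern is assumed: bursts occur randomly, but only inside recurring active windows. The sketch below encodes that logic. The window start date and the window and cycle lengths are rough approximations taken from the dates in the article (roughly four months on, three months off), not the team's actual fitted ephemeris.

```python
from datetime import date

# Sketch of the "periodic windowed" prediction for SGR1935+2154.
# Dates and durations are illustrative approximations, not the
# published fit.

WINDOW_START = date(2021, 6, 1)  # assumed start of one active window
ACTIVE_DAYS = 128                # ~June 1 through Oct 7
CYCLE_DAYS = 219                 # active window plus a ~3-month gap

def in_active_window(d):
    """True if date d falls inside a predicted burst window."""
    phase = (d - WINDOW_START).days % CYCLE_DAYS
    return phase < ACTIVE_DAYS

print(in_active_window(date(2021, 6, 24)))  # date of the first new burst
print(in_active_window(date(2021, 11, 1)))  # a date inside the predicted gap
```

The June 24 burst lands 23 days into the assumed window, consistent with the team's forecast; a date in November falls in the quiet phase.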

As to why this pattern exists, they presently have no idea. Theories have been proposed, such as starquakes activated by the magnetar's fast rotation or blocking clouds of gas, but none is really very convincing or backed by enough data.

Gravitational wave detectors see two different black holes as each swallowed a neutron star

Astronomers using three different gravitational wave detectors have seen the gravity ripples caused when two different black holes each swallowed a nearby neutron star.

The two gravitational-wave events, dubbed GW200105 and GW200115, rippled through detectors only 10 days apart, on January 5, 2020, and January 15, 2020, respectively.

Each merger involved a fairly small black hole (less than 10 Suns in heft) paired with an object between 1½ and 2 solar masses — right in the expected range for neutron stars. Observers caught no glow from the collisions, but given that both crashes happened roughly 900 million light-years away, spotting a flash was improbable, even if one happened — and it likely didn’t: The black holes are large enough that they would have gobbled the neutron stars whole instead of ripping them into bite-size pieces.

Note the time between the detections, in early 2020, and their announcement now, in mid-2021. The data is very complex and filled with a lot of noise, requiring many months of analysis to determine if a detection was made. For example, in a third case one detector was thought to have seen another such merger, but scientists remain unsure. It might simply be noise in the system. I point this out to emphasize that though they are much more confident in these new detections, there remains some uncertainty.

New data confirms lack of dark matter in one galaxy

The uncertainty of science: Astronomers have strengthened their evidence that one particular nearby galaxy is completely devoid of dark matter, a situation that challenges the existing theories about dark matter which suggest it comprises the bulk of all matter in the universe.

The astronomers had made their first claim that this galaxy, NGC 1052-DF2, lacked dark matter back in 2018, a claim that was strongly disputed by others.

The claim however would only hold up if the galaxy's distance from Earth was as far away as they then estimated, 65 million light years (not the 42 million light years estimated by others). If it were closer, as other scientists insisted, then NGC 1052-DF2 likely did have dark matter, and the theorists could sleep at night knowing that their theory about dark matter was right.

To test their claim, the astronomers used the Hubble Space Telescope to get a better, more tightly constrained estimate of the distance, and discovered the galaxy was even farther away than previously believed.

Team member Zili Shen, from Yale University, says that the new Hubble observations help them confirm that DF2 is not only farther from Earth than some astronomers suggest, but also slightly more distant than the team’s original estimates.

The new distance estimate is that DF2 is 72 million light-years away as opposed to 42 million light-years, as reported by other independent teams. This places the galaxy farther than the original Hubble 2018 estimate of 65 million light-years.

So, does this discovery invalidate the theories about dark matter? Yes and no. The theories now have to account for the existence of galaxies with no dark matter. Previously it was assumed that dark matter was to be found as blobs at the locations of all galaxies. Apparently it is not.

However, the lack of dark matter at this one galaxy does not prove that dark matter is not real. As noted by the lead astronomer in this research,

“In our 2018 paper, we suggested that if you have a galaxy without dark matter, and other similar galaxies seem to have it, that means that dark matter is actually real and it exists,” van Dokkum said. “It’s not a mirage.

Ah, the uncertainty of science. Isn’t it wonderful?

New data suggests muon is more magnetic than predicted

The uncertainty of science: New data now suggests that the subatomic particle called the muon is slightly more magnetic than predicted by the standard model of particle physics, a result that if confirmed will require a major rethinking of that standard model.

In 2001, researchers with the Muon g-2 experiment, then at Brookhaven, reported that the muon was a touch more magnetic than the standard model predicts. The discrepancy was only about 2.5 times the combined theoretical and experimental uncertainties. That’s nowhere near physicists’ standard for claiming a discovery: 5 times the total uncertainty. But it was a tantalizing hint of new particles just beyond their grasp.

So in 2013, researchers hauled the experiment to Fermi National Accelerator Laboratory (Fermilab) in Illinois, where they could get purer beams of muons. By the time the revamped experiment started to take data in 2018, the standard model predictions of the muon’s magnetism had improved and the difference between the experimental results and theory had risen to 3.7 times the total uncertainty.

Now, the g-2 team has released the first result from the revamped experiment, using 1 year’s worth of data. And the new result agrees almost exactly with the old one, the team announced today at a symposium at Fermilab. The concordance shows the old result was neither a statistical fluke nor the product of some undetected flaw in the experiment, says Chris Polly, a Fermilab physicist and co-spokesperson for the g-2 team. “Because I was a graduate student on the Brookhaven experiment, it was certainly an overwhelming sense of relief for me,” he says.

Together, the new and old results widen the disagreement with the standard model prediction to 4.2 times the experimental and theoretical errors.

That result is still not five times the total uncertainty — the faux standard physicists apparently use to separate a simple margin of error from a true discovery — but it is almost that high, has been found consistently in repeated tests, and appears to be an unexplained discrepancy.
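For the curious, the "4.2 times the errors" figure is just the discrepancy divided by the experimental and theoretical uncertainties combined in quadrature. The numbers below are the rounded 2021 values for the muon's anomalous magnetic moment as widely reported (in units of 1e-11); treat them as illustrative.

```python
import math

# Significance of the muon g-2 anomaly: discrepancy between measurement
# and prediction, divided by the combined uncertainty. Values are the
# rounded 2021 numbers in units of 1e-11 (illustrative).

measured = 116_592_061   # experiment (Brookhaven + Fermilab combined)
predicted = 116_591_810  # standard model prediction
sigma_exp = 41           # experimental uncertainty
sigma_th = 43            # theoretical uncertainty

combined = math.hypot(sigma_exp, sigma_th)  # quadrature sum
significance = (measured - predicted) / combined
print(f"discrepancy: {measured - predicted} x 1e-11")
print(f"significance: {significance:.1f} sigma")  # ~4.2
```

Note that shrinking either uncertainty (better data, or a sharper theoretical prediction) raises the significance even if the central values never move, which is why both camps keep refining their numbers.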

Not that I take any of this too seriously. If you read the entire article, you will understand. There are so many areas of uncertainty, both in the data and in the theories that this research is founded on, that the wise course is to treat it all with a great deal of skepticism. For example, the anomaly reported involves only 2.5 parts in 1 billion. While this data is definitely telling us something, it is so close to the edge of infinitesimal that one shouldn't trust it deeply.

Scientists: Mars is losing water seasonally through its atmosphere

The uncertainty of science: Two new studies, using data from Europe's Trace Gas Orbiter and Mars Express orbiters, have found that Mars is losing water seasonally through its atmosphere.

The studies also found that global dust storms accelerate the process.

Anna and colleagues found that water vapour remained confined to below 60 km when Mars was far from the Sun but extended up to 90 km in altitude when Mars was closest to the Sun. Across a full orbit, the distance between the Sun and the Red Planet ranges from 207 million to 249 million km.

Near the Sun, the warmer temperatures and more intensive circulation in the atmosphere prevented water from freezing out at a certain altitude. “Then, the upper atmosphere becomes moistened and saturated with water, explaining why water escape rates speed up during this season – water is carried higher, aiding its escape to space,” adds Anna.

In years when Mars experienced a global dust storm the upper atmosphere became even wetter, accumulating water in excess at altitudes of over 80 km.

But wait, didn’t planetary scientists just announce that Mars hasn’t lost its water through the atmosphere, but instead lost it when it became chemically trapped in the planet’s soil? Yup, they did, but that was a model based on new ground data. This new result is based on atmospheric data.

Or to put it another way, the model was incomplete. While it could be true that a large bulk of Mars’ water is trapped chemically in the ground, that is not proven, only hypothesized. What has been proven, and is now confirmed by these two studies, is that, depending on weather and season, the water of Mars does leak into its upper atmosphere where it can escape into space, never to return.

What remains unknown is how much water escaped into space, and when. Moreover, the ground-based model could still be right, even if it is true that Mars is losing water through its atmosphere. At the moment the data is too incomplete to answer these questions with any certainty.

Meanwhile, this press release once again gives the false impression that the only water left on Mars is at its poles (and in this case, only the south pole). This is not accurate, based on numerous studies finding evidence of buried ice and glaciers everywhere on the planet down to the 30th latitude, in both the north and south hemispheres. Mars might have far less water now than it did billions of years ago, but it still has plenty, and that water is not found only at the poles.

New analysis: It wasn’t even phosphine detected at Venus

The uncertainty of science: A new analysis of the data used by scientists who claimed in September that they had detected phosphine in the atmosphere of Venus has concluded that it wasn’t phosphine at all but sulfur dioxide, a chemical compound long known to be prevalent there.

The UW-led team shows that sulfur dioxide, at levels plausible for Venus, can not only explain the observations but is also more consistent with what astronomers know of the planet’s atmosphere and its punishing chemical environment, which includes clouds of sulfuric acid. In addition, the researchers show that the initial signal originated not in the planet’s cloud layer, but far above it, in an upper layer of Venus’ atmosphere where phosphine molecules would be destroyed within seconds. This lends more support to the hypothesis that sulfur dioxide produced the signal.

When the first announcement was made, it was also noted as an aside that phosphine on Earth is only found in connection with life processes, thus suggesting wildly that it might signal the existence of life on Venus.

That claim was always unjustified, especially because we know so little about Venus’s atmosphere and its alien composition. Even if there was phosphine there, to assume it came from life is a leap well beyond reasonable scientific theorizing.

It now appears that the phosphine detection itself was questionable, which is not surprising since the detection was about 20 molecules out of a billion. And while this new analysis might be correct, what it really does is illustrate how tentative our knowledge of Venus remains. It might be right, but it also could be wrong and the original results correct. There are simply too many uncertainties and gaps in our knowledge to come to any firm and confident conclusions.

None of that mattered with our modern press corps, which ran like mad to tout the discovery of life on Venus. As I wrote quite correctly in September in my original post about the first results,

The worst part of this is that we can expect our brainless media to run with these claims, without the slightest effort of incredulity.

We live in a world of make believe and made-up science. Data is no longer important, only the leaps of fantasy we can jump to based on the slimmest of facts. It was this desire to push theories rather than knowledge that locked humanity into a dark age for centuries during the Middle Ages. It is doing it again, now, and the proof is all around you, people like zombies and sheep, wearing masks based not on any proven science but on pure emotions.
