Tag Archives: data tampering

Suspicious climate data manipulation at NASA

The uncertainty of modern climate science does not stem merely from the overall complexity of the data and the climate. Numerous factors contribute to the long-term fluctuations of the climate that we do not yet completely understand or cannot quantify with any precision (the sun, dust, soot, volcanoes, and the increase in carbon dioxide, to name just a few). But there is a more tragic uncertainty that global warming scientists at NASA and NOAA have added to the mix, one that is entirely unjustified and harmful to the field of science and the questions it is trying to answer.

In the case of this post, that tragic uncertainty has to do with sea level rise and the unexplained “adjustments” that NASA is making to its sea level data. Below is a graph taken from the link, showing the changes that have been made to data published in 1982 in order to eliminate a long period of almost no sea level rise from the mid 1950s through 1980.
» Read more

A detailed review of the climate data tampering at NASA and NOAA

Steven Goddard has once again taken a close look at the climate data gathering at NOAA and NASA and found clear evidence of tampering.

He not only documents how the scientists at these agencies have adjusted the raw data to cool the past and warm the present to create the illusion of global warming, but also shows that they have done so with a limited database.

The bulk of the data tampering is being done by simply making temperatures up. If NOAA is missing data for a particular station in a particular month, they use a computer model to calculate what they think the temperature should have been.

Those calculations are then designed to support the theory of human caused global warming, caused by increased carbon dioxide.
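The infilling step described above can be sketched in a few lines. This is only a simplified illustration of the general idea of estimating a missing station-month reading from nearby stations; it is not NOAA's actual algorithm, and the station names and temperatures are invented:

```python
# Simplified illustration of infilling a missing monthly temperature.
# This is NOT NOAA's actual method; it shows only the general idea of
# estimating a missing reading from neighboring stations.

def infill_missing(station_temps, neighbors):
    """Estimate a missing station-month value as the mean of its neighbors."""
    readings = [station_temps[n] for n in neighbors
                if station_temps.get(n) is not None]
    if not readings:
        return None
    return sum(readings) / len(readings)

# Hypothetical July readings (deg F); station "C" failed to report.
temps = {"A": 88.4, "B": 90.1, "C": None, "D": 89.2}
estimate = infill_missing(temps, ["A", "B", "D"])
print(round(estimate, 2))  # mean of the three reporting neighbors
```

The key point is that once a reading is estimated rather than measured, its value depends entirely on which neighbors (or which model) the analyst chooses.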

Goddard doesn’t just tell us his opinions, he backs up his conclusions with detailed graphs and data.

Do I accept Goddard’s conclusions entirely? Maybe. The two questions I ask that none of the NOAA or NASA scientists have been willing to answer are these:
» Read more

The distortion of the global surface temperature datasets

Link here. Goddard does a good job of illustrating the differences between the measured and reported climate temperature datasets, and how the reported numbers are consistently shifted to make the past cooler than measured and the present hotter than measured.

He often attributes this bias to dishonest tampering with the data to support the theory of global warming. He might be right, but it is important to remember that you shouldn’t necessarily assign to malice what is just as easily explained by human error or stupidity. In this case he also notes that almost all the weather stations decommissioned in the past few decades were located in rural areas. To replace their data, global warming scientists average the data from nearby stations, most of which are in urban areas that exhibit warmer temperatures because built-up environments cause local warming (the urban heat island effect). The result? The recent datasets tend to show a strong trend upward.
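The bias mechanism described above can be made concrete with a toy calculation. The numbers here are invented purely for illustration; the point is that averaging warmer urban neighbors to stand in for a rural station shifts the infilled value upward regardless of anyone's intent:

```python
# Toy illustration of the infill bias described above: replacing a
# decommissioned rural station's reading with the average of nearby
# urban stations (which run warmer) shifts the record upward.
# All values are invented for illustration.

rural_actual = 70.0                   # what the rural station would have read
urban_neighbors = [72.5, 73.1, 71.8]  # urban heat island raises these

infilled = sum(urban_neighbors) / len(urban_neighbors)
bias = infilled - rural_actual
print(round(infilled, 2), round(bias, 2))
```

No individual step here is fraudulent, yet the net effect is a systematically warmer infilled record.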

The cooling of the datasets prior to 1970, however, is not explained by mere error. The data itself hasn’t changed. Someone must be deciding to adjust it downward, for reasons that are simply not justifiable.

More evidence NOAA has tampered with climate data

More global-warming fraud: Scientists have uncovered more tampering by NOAA of its climate temperature data to create the illusion that the climate is warming.

When Dr. Roy Spencer looked up summer temperature data for the U.S. Corn Belt, it showed no warming trend for over a century. But that was before temperatures were “adjusted” by National Oceanic and Atmospheric Administration climate scientists. Now the same data shows a significant warming trend.

Spencer, a climate scientist at the University of Alabama in Huntsville, said that the National Climatic Data Center made large adjustments to past summer temperatures for the U.S. Corn Belt, lowering past temperatures to make them cooler. Adjusting past temperatures downward creates a significant warming trend in the data that didn’t exist before. “I was updating a U.S. Corn Belt summer temperature and precipitation dataset from the NCDC website, and all of a sudden the no-warming-trend-since-1900 turned into a significant warming trend,” Spencer wrote on his blog, adding that NCDC’s “adjustments” made the warming trend for the region increase from just 0.2 degrees Fahrenheit per century to 0.6 degrees per century.

As Spencer notes, correcting the data for errors would normally cause adjustments in both directions. NOAA’s adjustments, however, are always in one direction: from cooler to warmer. This suggests manipulation and fraud, not an effort to improve the data. And the fact that they have consistently refused to explain their adjustments in detail further reinforces this conclusion.

More evidence of data tampering at NOAA

A close analysis of NOAA climate data from just one randomly picked rural Texas location reveals significant data tampering to make the climate appear to be growing warmer.

In other words, the adjustments have added an astonishing 1.35C to the annual temperature for 2013. Note also that I have included the same figures for 1934, which show that the adjustment has reduced temperatures that year by 0.91C. So, the net effect of the adjustments between 1934 and 2013 has been to add 2.26C of warming. …
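Homewood's arithmetic in the quote above is easy to check: the warming added to 2013 and the cooling subtracted from 1934 both push the 1934-to-2013 difference in the same direction, so the two magnitudes simply add:

```python
# Checking the arithmetic quoted above: warming added to 2013 plus
# cooling subtracted from 1934 gives the net warming the adjustments
# introduced between the two years.
added_2013 = 1.35    # deg C added to 2013 by the adjustments
removed_1934 = 0.91  # deg C subtracted from 1934 by the adjustments

net_warming = added_2013 + removed_1934
print(round(net_warming, 2))  # 2.26
```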

So what possible justification can USHCN [the climate data center at NOAA] have for making such large adjustments? Their usual answer is TOBS, or Time of Observation Bias, which arises because temperatures are now monitored in the early morning rather than the late afternoon, as was the practice before. But by their own admission, TOBS adjustments should only account for about 0.2C.

What about station location? Has this changed? Well, not since 1949, according to the official Station Metadata. Luling is a small town of about 5000 people, and the station is situated at the Foundation Farm, 0.7 miles outside town. In other words, it is a fairly rural site that should not need adjusting for urban influences.

It is plain that these adjustments are not justifiable in any way. It is also clear that the number of “Estimated” measurements made is not justified either, as the real data is there, present and correct.

In doing this analysis, the author, Paul Homewood, does something that Steven Goddard of the Real Science website, the man who broke this story, rarely does: he carefully presents the full raw dataset and shows us how he isolates the raw data from the estimated and adjusted numbers. Goddard generally shows only his results, which means we have to trust his analysis. Homewood approaches Goddard’s results skeptically, checking his work to see whether it is accurate and correct. He finds that it is.

This is science at its best.

I should also note that I found Homewood’s analysis because Steven Goddard posted a link on his own webpage. As a true scientist, Goddard does not fear a close look at his work. He welcomes it.