
Number Watch Web Forum

This forum is about wrong numbers in science, politics and the media. It respects good science and good English.

Adjusting Data

A discussion of the transformations of MSU data to make it work with surface data.

Does it make me a Hillbilly that I get concerned when people try to force measurements to fit their models? I understand that some transformation has to be done before any analysis, but I remain a strong proponent of looking at data in the units it arrives in, NOT the "anomalies" we are presented with.

I see the utility of anomalies, but mostly for someone who is watching a production line. I expect 23 inch widgets and I have a meter that shows me the length of the widgets coming off the line. Making a gauge that points straight up when a widget is 23 inches makes it real easy to see when something is wrong. Making 23 be 0 isn't a big deal. But if I chart that and then show it to the world as "Widget Anomalies", it becomes almost meaningless.
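For what it's worth, the arithmetic behind an anomaly series is nothing more than subtracting a chosen baseline. A minimal Python sketch, with made-up widget lengths and a hypothetical 23-inch reference, just to show that the transformation itself is trivial and that all the meaning lives in the choice of baseline:

```python
# Sketch of how an "anomaly" series is produced: subtract a chosen baseline.
# The widget lengths and the 23-inch reference are invented for illustration.

measurements = [23.02, 22.98, 23.05, 22.97, 23.01]  # raw lengths in inches
baseline = 23.0                                      # the expected value

anomalies = [round(m - baseline, 2) for m in measurements]

print(anomalies)  # [0.02, -0.02, 0.05, -0.03, 0.01]
```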

This is to say that the graphs of the MSU anomalies are equally deceiving. More useful and honest would be a graph of the actual recorded values. Of course, showing the actual values results in graphs that look like the one shown (taken from The Real Inconvenient Truth). That is a snoozer of a chart (although more accurate than most). It's a snoozer because there is nothing in it that makes you want to run for the fallout shelter. The only way to make that information look menacing is to take out its magnitude, strip it of its variability, and then pick a starting point that is favourable to your cause. Then you have something with which you can scare the kiddies.
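To illustrate the point about magnitude and starting point, here is a rough sketch (using invented temperatures, not the actual MSU record) of how the very same numbers can be made to look either dull or alarming simply by changing the baseline and the y-axis:

```python
# Illustration only: invented temperatures, not the real MSU series.
# The same numbers plotted in kelvin on a full scale, and as anomalies
# on a tightly zoomed scale, tell very different visual stories.
import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1979, 2009)
temps_k = 288.0 + 0.01 * (years - 1979) + np.random.normal(0, 0.1, years.size)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.plot(years, temps_k)
ax1.set_ylim(0, 320)                 # honest magnitude: a flat, boring line
ax1.set_ylabel("Temperature (K)")
ax1.set_title("Actual values")

ax2.plot(years, temps_k - temps_k[:10].mean())  # baseline chosen from the early years
ax2.set_ylim(-0.3, 0.6)              # zoomed axis: the same data now looks dramatic
ax2.set_ylabel("Anomaly (K)")
ax2.set_title("Anomalies, zoomed")

plt.tight_layout()
plt.show()
```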

If any of the above sounds like it was stripped from the pages of Numberwatch or SWN, hopefully Jeb will forgive me.

Re: Adjusting Data

In the right place this time..
Contrast the CO2 finding with this one

And ask why the CO2 is 20-year smoothed. Won't that wipe out any evidence of the 1940 peak, if Beck's data is true? And if there really was a 550 count in 1940, doesn't that mess up the whole AGW theory? I've never liked that ice core data anyway. I don't know why they have to cook it to make it match for 1957, and if so, how they can know it's good for thousands of years ago. I don't think paleoclimatology is a very precise science, somehow.
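On the arithmetic point at least, it is easy to convince yourself that a 20-year smooth will flatten a short-lived peak. A sketch with invented numbers (a flat series with a brief bump around 1940, nothing to do with the actual ice core or Beck data):

```python
# Sketch: a 20-year moving average applied to an invented series with a
# short bump around 1940.  The peak all but vanishes after smoothing.
import numpy as np

years = np.arange(1900, 1981)
co2 = np.full(years.size, 310.0)
co2[(years >= 1938) & (years <= 1944)] = 550.0   # a brief, hypothetical excursion

window = 20
smoothed = np.convolve(co2, np.ones(window) / window, mode="same")

print("raw peak:     ", co2.max())                 # 550.0
print("smoothed peak:", round(smoothed.max(), 1))  # ~394: the spike is averaged away
```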

Re: Re: Adjusting Data

Smoothing reminds me of a job I had. We collected data from power plants and presented it to the operators as Efficiencies (i.e. how efficiently a turbine was running).

We "smoothed" our data also. By time weighting the data, we avoided "spikes" in our output. It was possible to get a faulty reading from an instrument (due to any of the steps involved in the process of taking that reading). Smoothing (time weighting), made those hiccups disappear. Those readings also had variability in them. Some of the calculations (around boilers) were sensitive to temperature and mass flow rate inputs (given that these are the two major constituents of a heat balance, that would seem absurdly obvious).

All this is to say that "smoothing" has its place, most specifically when dealing with the vagaries of real-time systems trying to mesh with first-principle calculations.

I continue to question the validity of such smoothing in non-real-time analysis. A scatter graph of time versus temperature, or time versus concentration, would probably be more honest. The trouble with such graphs is that they start to look like big fat lines (especially in the case of temperature).

Now I catch myself saying "There ought to be a law" about graphing.

But I will keep my mouth shut. The unintended consequences of a Minister of Graphing and the resultant police force cause me to quiver in fear.

The real answer is of course educating our children to understand graphs. That is not easy. My children will be okay, but I am not sure about my friends and neighbors.