This forum is about wrong numbers in science, politics and the media. It respects good science and good English.
If a person wanted to average temperatures over time, wouldn't it be better to average the enthalpies at each location? Is this how they are doing the averages in our great repositories of temperature data?
Has anyone here looked at the code used to do the averaging? From other discussions, it seems that the averages are more statistical in nature and don't account for the energy at any location.
In the steam world, everything is enthalpy. I need to go look at an enthalpy table for air and see how it varies with wet-bulb temperature.
I consulted a Mollier diagram. I hope the code used to average temperatures doesn't completely ignore it.
How do you average the humidity over the group, though? In order to back out the temperature afterwards, you need to make some assumptions. Even then, I question why we compile the world's temperature down to a single number and attempt to gain insight from it.
From what I've read, they simply average temperature anomalies. While this goes some way towards solving the averaging problem, it does depend on having a baseline from which to calculate the anomaly.
Thanks for pointing this out, Brad; we've all felt uncomfortable with this behaviour but I couldn't put my finger on it and say why.
WUWT has commented several times on the effects on records of selectively closing down high-altitude stations, for instance. The anomaly method has the advantage of simplicity, but if the baseline was determined from a certain mix of high- and low-altitude stations and then you alter the mix it's bound to have an effect on any trends.
How easy is it to change temperature and humidity and ? data to enthalpy?
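It's a short calculation if you have dry-bulb temperature and relative humidity (pressure matters too; sea level is assumed below). Here is a minimal sketch using the standard psychrometric approximation for moist-air enthalpy and a Magnus-type fit for saturation vapour pressure; the function name is mine:

```python
import math

def enthalpy_kj_per_kg(t_c, rh):
    """Approximate specific enthalpy of moist air (kJ per kg of dry air)
    at sea-level pressure. t_c is dry-bulb temperature in deg C, rh is
    relative humidity as a fraction (0..1)."""
    p_atm = 101.325  # kPa, standard atmosphere
    # Magnus-type approximation for saturation vapour pressure (kPa)
    p_sat = 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))
    p_w = rh * p_sat                 # partial pressure of water vapour
    w = 0.622 * p_w / (p_atm - p_w)  # humidity ratio, kg water per kg dry air
    # Sensible heat of dry air + (latent + sensible) heat of the vapour
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

print(enthalpy_kj_per_kg(30.0, 0.90))  # hot, humid air: roughly 92 kJ/kg
print(enthalpy_kj_per_kg(0.0, 0.10))   # cold, dry air: roughly 1 kJ/kg
```

The ">?" in the question is real: pressure (especially at high-altitude stations) would also be needed for an accurate conversion, and the sketch simply assumes 101.325 kPa.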
As I understand it, the "anomaly" (as invented by James Hansen) is a difference from some constant reference temperature, which amounts simply to a translation of the temperature graph along the vertical axis. So averaging anomalies gives the same result as averaging temperatures and plotting the "anomaly" of the average.
That's my understanding too, and the more I think about it, the more of a total nonsense it seems. I didn't know it was invented by James Hansen. That explains a lot.
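The identity in question is easy to check with made-up numbers; note it only holds when every station shares one constant baseline (where the baseline is computed per station, changing the station mix can make the two orders of operation diverge):

```python
# Made-up station temperatures (deg C) and a single shared baseline.
temps = [14.2, 15.1, 13.8, 16.0]
baseline = 14.0

# Average the anomalies...
mean_of_anomalies = sum(t - baseline for t in temps) / len(temps)
# ...or average the temperatures and then take the anomaly.
anomaly_of_mean = sum(temps) / len(temps) - baseline

# With one constant baseline these are the same vertical shift.
print(mean_of_anomalies, anomaly_of_mean)
```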
IanW & I often bring this point up over at WUWT.
Temperature alone is totally bloody useless as a metric for energy in the system because of humidity variations.
The closest to energy measurement is ocean temperature measurement where the relationship between temp & energy is constant.
Unfortunately, I doubt we have the capability to measure consistently to the required precision or, for that matter, a sufficient spread of measurements.
Dave -- It was at WUWT that I first read this. So it is entirely possible that you spurred this post from me. I was just frustrated because I get nonsense responses about it. "Oh, the experts have already considered that!"
Really ... Why the heck aren't the charts in units of Enthalpy then?
If they were 'experts' who understood enthalpy, it should be second nature to use.
That the units didn't make sense to peons like us should be irrelevant.
Apologies for not specifically linking to your comment.
Just to check my memory of this, I went and did a simple average.
30C 90% humidity.
0C 10% humidity.
Simple average gives you
15C 50% humidity
Using the Mollier diagram, I got about 85 kJ/kg and 0 kJ/kg, for an average of 42.5 kJ/kg.
That gives me a potential range of temperatures (assuming that %Hum stays between the two start points) of 16C to 33.5C.
My gut (that terrible instrument of precision) tells me exactly nothing. The effect of merging two such air masses would likely be rain, making the average completely meaningless.
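The check above can be reproduced numerically. This sketch uses the standard psychrometric enthalpy approximation rather than a chart read-off, so the absolute numbers differ a little from the Mollier figures, but the gap between the two averages is the point (note the simple average of 90% and 10% humidity is 50%):

```python
import math

P_ATM = 101.325  # kPa, standard sea-level pressure

def enthalpy(t_c, rh):
    """Specific enthalpy of moist air (kJ per kg of dry air) from
    dry-bulb temperature (deg C) and relative humidity (fraction)."""
    p_sat = 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))  # kPa, Magnus fit
    p_w = rh * p_sat
    w = 0.622 * p_w / (P_ATM - p_w)  # humidity ratio
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

# Average the enthalpies of the two air masses...
h_avg = (enthalpy(30.0, 0.90) + enthalpy(0.0, 0.10)) / 2
# ...versus the enthalpy of the naively averaged state (15 C, 50% RH).
h_naive = enthalpy(15.0, 0.50)

print(round(h_avg, 1), round(h_naive, 1))  # the two disagree badly
```

The enthalpy average comes out well above the enthalpy of the "averaged" state, which is exactly the argument here: averaging temperature and humidity separately does not conserve energy.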
I really think you may be onto something here Brad.
I am assured however by people in the know in the skeptical community with direct access to experts in the field of Climate Science that Enthalpy isn't an issue.
They are nice about it, but...
Do you really think that the UN fools will know the difference?
They do not know what water (H2O) is!!!
Cancun COP16 attendees fall for the old “dihydrogen monoxide” petition as well as signing up to cripple the U.S. Economy.
Yeah, they signed up to ban water. You'd think some of these folks would have enough science background (from their work on complex climate issues) to realize what they are signing, but sadly, no.
I've always thought that the biggest problem with the global average temperature is that it's supposed to be the average sea and land temperature, but nobody really seriously bothers to measure the land temperature. Instead they measure the air temperature, actually the shaded air temperature, about a metre or two off the ground. Shaded air temperature is probably a good measure for representing 'the weather', but the temperature of the various ground surfaces below could be completely different. So to me the idea of bringing in enthalpy, which is like a combination of air temperature and humidity in a single parameter, is not really relevant.
To give a quick appreciation of what I'm talking about, this webpage gives an example of some weatherman discussing the issue. He's in Greenland and the shaded air temperature is 20 deg F but the dial thermometer behind his head is reading 80 deg F.
I first noticed this issue about twenty years ago when I had a job working out the 'design hazard parameters' for a nuclear facility in the UK. The procedure for working out the maximum and minimum temperature was to use Met Office data for the nearest weather station (which might only cover a few decades) and then massively extrapolate to a 10,000-year return period. This would give a result like 35 deg C to -20 deg C. When it came to a civil engineer using these figures, they would ignore the 35 deg C figure because they used a more conservative design code value for outdoor steelwork, 50 deg C from BS5950, and they might possibly select a higher grade of steel than normal to handle the -20 deg C low-temperature figure.
Now the BS5950 value probably takes account of the steel being in direct sunlight and might be rounded to the nearest 10 deg C. There is folklore that (even in the UK) on a really hot day you can fry an egg on a metal surface, which would correspond to a temperature of at least 150 deg F or 65 deg C. The point I'm making is that the temperature of a surface can be completely different from the shaded air temperature, and the temperatures of these surfaces contribute to the Earth's 'thermal radiation budget' or whatever it's called.
Now when it comes to the temperature of the sea, the sea temperature itself actually is measured, not some shaded air temperature above it. It used to be measured (up to about 50 years ago) by hauling in samples of sea water using buckets thrown over the side of a ship, and then they switched to the more 'professional' approach of obtaining temperature data from readings available for the sea water cooling intake temperature for the ships' engines. Both these methods of measuring sea temperature suffer from the problem that they may be measuring temperature quite a few metres below the surface rather than as close to the surface as possible. More recently, I think in the last ten years, the idea of measuring temperature very close to the sea surface using buoys has been introduced.