This forum is about wrong numbers in science, politics and the media. It respects good science and good English.
Zooming in to see the variation in data is perfectly acceptable practice in the right place.
To take a slightly silly example, a 1/4 inch difference in the length of one leg of a table makes it wobble just as much regardless of whether it's on the top or ground floor of a skyscraper. Hence it's perfectly valid to zoom in when investigating the amount of table-top wobble against leg length variation in your penthouse lab. In practice, of course, you'd not use absolute table-top height measured from sea level and zoom in. You'd simply measure from floor level as a local reference, subtracting the large offset which necessitates zooming in if using absolute height. A perfectly valid and accepted thing to do in this case, although maybe not if the skyscraper were very tall and you had a gravitationally sensitive experiment on top of the wobbly table.
In thermal work it's frequently undesirable to subtract the offset. I spent a fair number of years in thermal imaging research and became used to the very small intensity modulation forming the picture floating on a very large background thermal offset. For some purposes and calculations it was quite acceptable to use the picture modulation alone after subtracting the offset, whilst for others the full-range data was essential. Even though the modulation might be less than 1% of the offset, it was never acceptable to consider it so small as to be irrelevant. Especially and obviously when attempting to reduce thermal signatures.
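To make the offset-versus-modulation point concrete, here is a minimal sketch in Python. All the numbers are illustrative assumptions, not real thermal imaging figures: a background level of 300 units with a picture modulation of at most 2 units riding on top of it.

```python
import numpy as np

# Hypothetical numbers: pixel values sit on a large background offset,
# with only a tiny modulation carrying the actual picture.
offset = 300.0      # assumed background thermal level (arbitrary units)
modulation = 2.0    # assumed peak picture modulation (<1% of the offset)

rng = np.random.default_rng(0)
image = offset + modulation * rng.random((4, 4))

# Zoomed-in view: subtracting the offset exposes the picture detail...
picture = image - offset

# ...but the full-range values are what matter for absolute energy budgets.
print(f"modulation is at most {picture.max() / offset:.2%} of the offset")
```

Whether the subtracted or full-range form is appropriate depends entirely on the calculation being done, which is the point made above.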
I suspect that the graph in question uses full-range values and zooms in to show the variation mainly because there is no accepted method of clearly removing the offset without the potential for misunderstanding. I also suspect that there may be an element of trying to show how small the variation is.
The bit that intrigues me is how such accurate data can be obtained so far back in the past. I shall have to look into this. I must also make another effort to find out if any of the CO2 warming calculations actually bother to take into account the fact that the absolute maximum power/energy available to drive warming is defined by the area under the requisite black body curves falling within the CO2 absorption bands. In practice 40% of the black body maximum in these regions is more reasonable.
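The "area under the black body curve within the CO2 absorption bands" idea can be sketched numerically. This is only an illustration under assumed values: a 288 K surface temperature, a single nominal 13-17 um CO2 band, and simple rectangle-rule integration of Planck's law for spectral exitance.

```python
import numpy as np

def planck_exitance(lam, T):
    """Black-body spectral exitance per unit wavelength, W/m^2 per m."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * np.pi * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

T = 288.0                                  # assumed mean surface temperature, K
lam = np.linspace(1e-6, 100e-6, 200_000)   # 1-100 um wavelength grid
dlam = lam[1] - lam[0]

total = planck_exitance(lam, T).sum() * dlam          # ~ sigma * T**4

band = (lam >= 13e-6) & (lam <= 17e-6)                # assumed CO2 band
in_band = planck_exitance(lam[band], T).sum() * dlam

print(f"total exitance ~ {total:.0f} W/m^2 "
      f"(Stefan-Boltzmann gives {5.67e-8 * T**4:.0f})")
print(f"fraction in 13-17 um band ~ {in_band / total:.1%}")
```

The band fraction this yields is an upper bound on the energy available in that band; the 40% figure mentioned above would then be applied on top of it.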
I notice that you are another fan of CricketGraph, which ranks with ClarisDraw as a classic "good enough for real people" computer program. How does Chartsmith compare in curve fitting capabilities? For me the curve fitting ability was what raised CricketGraph from good to can't-do-without status.
I think there are definitely times when it is useful to zoom in. To change the analogy, it is okay to look at the tree in the forest and identify what is happening to the tree. You just have to remember that there is a forest around you. If you are carefully studying the tree and discovering exactly what makes it grow, what makes it sick and what makes it breathe better, this is great science, but it should be remembered that you are studying the tree and not the forest. The forest could be expanding by leaps and bounds and you won't know it because you are looking at the tree and the edge of the forest is 10 miles away.
Another example, which may or may not be relevant: discovering a planet around another star. I am not specifically familiar with the process, but I could imagine that "zooming" might be useful here also. To find the planet you have to examine gravitational perturbations on a small scale. After finding the planet in the distant system, though, will adding it to the equations for planning space missions make the calculations more accurate? I think it is great that we can detect these anomalies in gravity to such an extent that we can identify the pull of a small planet on a massive object. I do not think that being able to identify tiny perturbations in anything means that we can ignore gross perturbations, especially when we talk about bulk properties like temperature and solar flux.
In mechanical engineering, I was introduced to a concept called dimensionless analysis. The irony of the subject is that it has everything to do with units. If you want to compare a boat 3 ft long with a boat 12 m long for design purposes, you have to take into account the units of the materials you are working with. The relative fluid viscosity is different for a 3 ft model of a 12 m boat. If you are testing on a 3 ft model it is necessary to change the viscosity of the water in which you are testing. In the climate world, though, their method for comparing apples to oranges is to adjust the ranges so the magnitudes of the visual amplitudes are the same.
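The scaling argument above can be sketched with the Reynolds number, Re = VL/nu, which must match between model and full scale for dynamic similarity. The numbers here are illustrative assumptions (a 5 m/s full-scale speed, ordinary water at nu = 1e-6 m^2/s), not from any real test programme.

```python
def reynolds(V, L, nu):
    """Reynolds number: flow speed * length scale / kinematic viscosity."""
    return V * L / nu

L_full, L_model = 12.0, 3 * 0.3048   # 12 m hull vs 3 ft model, in metres
V_full = 5.0                         # assumed full-scale speed, m/s
nu_water = 1.0e-6                    # kinematic viscosity of water, m^2/s

Re_full = reynolds(V_full, L_full, nu_water)

# Matching Re with the model still in ordinary water forces an
# impractically high test speed...
V_model = Re_full * nu_water / L_model

# ...so instead you can ask what test-fluid viscosity matches Re
# at the original speed.
nu_needed = V_full * L_model / Re_full

print(f"required model speed in water: {V_model:.1f} m/s")
print(f"or required test-fluid viscosity: {nu_needed:.2e} m^2/s")
```

Either the speed or the fluid has to change by the ratio of the length scales, which is exactly why small-scale tests demand a different "viscosity of the water".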
Looking at absolute-scaled charts would give most of us the information we need, though. You look at the chart, see no changes, and you can say to yourself: hmm, guess there's not much to worry about. That is why we use graphs, isn't it, to share data in a way that allows others to make quick assessments of what is happening without having to pore over piles of data sheets.
The $64,000 question over zooming in is always whether or not the variable detail on top of the offset is important. Sometimes even quite small variations can have large effects, especially when compounded over a period and acting as the trigger for helper effects.
Actually, it can be useful to produce offset or zoomed-in graphs with the same visual scale of variation but different absolute scales. Simple reference to the axes helps judge the relative importance or sensitivity to different effects, which can be clearer than the full-scale representation losing the "unimportant" detail. Which may not be so unimportant.
I wonder if anyone has actually tried to do an experiment to see what the effect of CO2 and other greenhouse gases is in a lab situation. I imagine something could be done along the lines of enclosing a suitable surface and atmosphere in a transparent dewar illuminated by suitable radiation. Obviously it would be impossible to simulate the proper free-space situation with solar-equivalent black body radiation, but examining the effect of broad-line illumination in the dewar's transparency window is quite possible. It would also be possible to examine the effect of illuminating at selected absorption bands using suitable lasers. I imagine that an internal electric heater would be needed to establish the baseline temperature. Calibrating the system would not be trivial, but none of the losses are terribly hard to deal with. It would at least give realistic estimates for the extra energy capture in the chosen absorption bands.
Once again I can't disagree too much.
Looking at the magnitudes from a distance though to start off IS useful.
A former professor of mine at Oregon State described some research that was being done in the boundary layer. He explained that they were able to transfer huge amounts of energy into water by making the orifices smaller than the boundary layer. I walked away impressed, thinking about the application of such technology to water heaters and the like. Then I asked a silly question: "How could this be used in real life?" My experience includes Naval nuclear reactors. The chemistry of hot water is challenging. Some of the most important work done in the Engineering Section is making sure that the system doesn't fail due to corrosion issues. Start pumping water through nanotubes and you are going to find that those tubes clog really quickly.
The big picture can never be forgotten.
After looking at JEB's webpage on 'Chartmanship', I think you may be overlooking the issue of the educational level of the audience, Brad.
JEB's concern seems to be the use of chartmanship to fool the 'average lay reader' who is not familiar with graphs, and particularly where graphs are used in newspapers. Now if we look at the webpage where the graph has probably come from (based on the URL) on the Junkscience.com website:
I wouldn't say that the Junkscience webpage is intended to be read by the 'average lay reader', it looks like it would only be understood by somebody qualified to work in some sort of technical occupation.
On the Chartmanship webpage, the global average temperature anomaly curve is used as an example of 'advanced chartmanship'. I've always thought that public support for being concerned about global warming in the UK would show a significant drop if the actual temperature reference level was used in global average temperature graphs, which I understand is about 14 deg C. In England the average annual temperature is 8.5 to 11 deg C, so the sometimes mentioned political target of avoiding a 2 deg C temperature rise wouldn't even take the UK up to the current world average temperature.
On a final note: I've got a feeling that Microsoft are responsible for introducing this comparatively new word 'chart' to mean a 'graph'. I think it comes from terminology used in MS Excel.
The effect of doubling atmospheric CO2 from pre-industrial levels is 'consensed' at around 4 W/sqm, so the graph portraying the solar effect on W/sqm correctly operates on the same order of magnitude.
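That "around 4 W/sqm" figure is consistent with the widely quoted logarithmic approximation for CO2 radiative forcing, dF = 5.35 ln(C/C0) W/sqm. A quick check, taking 280 ppm as the assumed pre-industrial concentration:

```python
import math

def co2_forcing(C, C0=280.0):
    """Logarithmic approximation for CO2 radiative forcing, W/m^2.
    C and C0 are concentrations in ppm; 280 ppm ~ pre-industrial."""
    return 5.35 * math.log(C / C0)

# Doubling pre-industrial CO2 (280 -> 560 ppm):
print(f"doubling: {co2_forcing(560.0):.2f} W/m^2")  # ~3.7, i.e. 'around 4'
```

Because the relation is logarithmic, each further doubling adds the same ~3.7 W/sqm, which is why the forcing is quoted per doubling rather than per ppm.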