This forum is about wrong numbers in science, politics and the media. It respects good science and good English.
You have to wonder if these data-trend-analysis types have ever heard of physics, or have actually stepped outside their ivory tower to see what happens in the real world.
As any Brit will know, the global temperature is stabilised by cloud reflection. Actual yearly insolation is roughly half what it would be without clouds. If the temperature rises you get more cloud, which reflects more solar radiation back into space before it reaches the ground, counteracting the rise. If the temperature falls you get less cloud, so more solar radiation reaches the ground, helping to warm things back up. Cloud cover also helps keep things warm overnight: a cold, clear winter night means icy roads in the morning; a cloudy winter night, maybe not.
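The stabilising mechanism described above is just a negative feedback: any deviation from equilibrium generates a correction in the opposite direction. A toy sketch, with completely made-up coefficients (the equilibrium temperature, the feedback strength, and the starting perturbation are all invented for illustration; this is not a climate model):

```python
# Toy relaxation model of the cloud-feedback argument: a perturbed
# temperature is pulled back toward equilibrium because warming means
# more cloud, which cuts the sunshine absorbed at the surface.
# All numbers are assumptions chosen for illustration.
T_EQ = 15.0      # assumed equilibrium temperature, deg C
FEEDBACK = 0.3   # assumed fraction of a deviation removed per step

def step(temp):
    # More cloud when warm -> less insolation -> cooling, and vice versa
    return temp - FEEDBACK * (temp - T_EQ)

temp = 18.0      # start 3 deg C too warm
for _ in range(20):
    temp = step(temp)
print(f"after 20 steps: {temp:.3f} deg C")  # decays back toward 15
```

The point of the sketch is the contrast with a random walk: here each step depends on the current temperature and pulls it home, whereas in a random walk each step is independent of where you are, so there is nothing to stop the temperature wandering off.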
All the trend-analysis stuff is basically rubbish anyway, as it assumes infinite resources for the underlying process, which pretty much never happens in physics. A modern version of how many angels can dance on a pinhead.
One area where the WUWT blog post by Mikhail Voloshin is a bit misleading is that it gives the impression that the random-walk idea is a recent development coming from the financial industry (an industry much more wary of being fooled by randomness), with Voloshin and others who have worked in finance, such as Doug Keenan, imparting this knowledge to the general scientific community.
In fact, the random walk idea was first put forward in 1991, in this free-to-view paper, "Global Warming as a Manifestation of a Random Walk":
random walk paper
Matt Briggs (or William Briggs, if you prefer) wrote a blog post in 2008 about the 1991 paper, and his post includes the R code ('R' is a free statistical analysis program used by quite a few AGW-sceptic bloggers) to generate random-walk temperature-anomaly-versus-time data yourself. The whole thing can be done in a single R command: rnorm draws random numbers from a normal distribution, and cumsum computes the cumulative sum. I remember trying out the R code (I had a copy of R installed on a PC), and it only takes a few trials before you get what appears to be a disturbing trend, like a rise of 3 deg C over a century.
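For anyone without R installed, the same rnorm-then-cumsum recipe can be sketched with nothing but the Python standard library (the 100-year span and the 0.1 deg C per-year step size are illustrative choices, not figures from Briggs's post):

```python
import random
from itertools import accumulate

# Random-walk "temperature anomaly": each year's change is an independent
# normal draw (R's rnorm), and the anomaly is the running sum of those
# changes (R's cumsum). Step s.d. of 0.1 deg C/year is an assumption.
random.seed(1)  # fixed seed so the run is repeatable
steps = [random.gauss(0.0, 0.1) for _ in range(100)]  # 100 years of changes
anomaly = list(accumulate(steps))                     # cumulative sum

print(f"final anomaly after 100 years: {anomaly[-1]:+.2f} deg C")
print(f"largest excursion from zero:   {max(map(abs, anomaly)):.2f} deg C")
```

Re-run it with different seeds and, as the post says, it doesn't take many trials before one of the walks shows an apparently alarming century-scale trend, despite every step being pure noise.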
Back in the 1970s, when global cooling was in vogue, it seemed to be implicitly assumed that climate was random. Random fluctuations of temperature were going to take us into a new ice age, though it was never made clear whether this would be a full-blown Ice Age with extensive glaciers or just another Little Ice Age. The first time I got the idea that climate wasn't random, and that the future climate could potentially be predicted, was from the global warming brigade in the mid-1980s, when the greenhouse effect started being talked about.