All Scientists are Sceptics ~Professor Bob Carter

Whenever someone asserts that a scientific question is “settled,” they tell me immediately that they don’t understand the first thing about science. Science is never settled. ~Dr David Deming

Perhaps the most frustrating aspect of the science of climate change is the lack of any real substance in attempts to justify the hypothesis ~Professor Stewart Franks

A lie told often enough becomes the truth. ~Vladimir Ilyich Lenin

Tuesday, 16 October 2012

Is the temperature or the temperature record rising?

The Australian Environment Foundation (AEF) is holding its annual conference on October 20th and 21st, 2012 at Rydges World Square Hotel, Sydney (see link).

One of the speakers is NCTCS member Dr David Stockwell. In the AEF program:

Is temperature or the temperature record rising? Dr David Stockwell
The following paper is the basis of his talk at the AEF conference. Previously posted by Anthony Watts on WUWT here.


Circularity of homogenization methods

Guest post by David R.B. Stockwell PhD

I read with interest GHCN’s Dodgy Adjustments In Iceland by Paul Homewood, on the distortion by GHCN homogenization adjustments of the mean temperature plots for Stykkisholmur, a small town in the west of Iceland.
The validity of the homogenization process is also being challenged in a talk I am giving shortly in Sydney, at the annual conference of the Australian Environment Foundation on the 20th of October 2012, based on a manuscript uploaded to the viXra archive, called “Is Temperature or the Temperature Record Rising?”

The proposition is that commonly used homogenization techniques are circular — a logical fallacy in which “the reasoner begins with what he or she is trying to end up with.” Results derived from a circularity are essentially just restatements of the assumptions. Because the assumption is not tested, the conclusion (in this case the global temperature record) is not supported.

I present a number of arguments to support this view.


First, a little proof. If S is the target temperature series and R is the regional climatology, then most algorithms that detect abrupt shifts in the mean level of temperature readings, also known as inhomogeneities, come down to testing for changes in the difference between S and R, i.e. D = S - R. The homogenization of S, written H(S), is the adjustment of S by the magnitude of the change in the difference series D.

When this homogenization process is written out as an equation, it is clear that homogenization of S is simply the replacement of S with the regional climatology R.
H(S) = S - D = S - (S - R) = R

While homogenization algorithms do not apply D to S exactly, they do apply the shifts in baseline to S, and so coerce the trend in S to the trend in the regional climatology.

The coercion to the regional trend is strongest in series that differ most from the regional trend, and happens irrespective of any contrary evidence. That is why “the reasoner ends up with what they began with”.
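
To make the circularity concrete, below is a minimal numerical sketch (my own toy code and synthetic data, not any published algorithm): a cooling target series S is homogenized against a warming regional climatology R by repeatedly removing the largest mean shift detected in the difference series D = S - R. The crude changepoint test and the trend sizes are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100  # years of hypothetical annual data
years = np.arange(n)

# Toy data: the target station cools while the region warms.
S = 15.0 - 0.01 * years + rng.normal(0, 0.3, n)  # target series
R = 15.0 + 0.01 * years + rng.normal(0, 0.1, n)  # regional climatology

def largest_shift(d):
    """Crude changepoint test: find the split point with the largest
    difference in mean between the two segments of d."""
    best_k, best_shift = 0, 0.0
    for k in range(10, len(d) - 10):  # keep both segments non-trivial
        shift = d[k:].mean() - d[:k].mean()
        if abs(shift) > abs(best_shift):
            best_k, best_shift = k, shift
    return best_k, best_shift

def homogenize(S, R, passes=5):
    """Repeatedly remove the largest shift detected in D = S - R,
    in the manner of shift-based homogenization schemes."""
    H = S.copy()
    for _ in range(passes):
        k, shift = largest_shift(H - R)
        H[k:] -= shift  # adjust the later segment by the detected shift
    return H

H = homogenize(S, R)
trend = lambda y: np.polyfit(years, y, 1)[0] * 100  # degrees per century
print(f"raw target trend:   {trend(S):+.2f} C/century")
print(f"regional trend:     {trend(R):+.2f} C/century")
print(f"homogenized trend:  {trend(H):+.2f} C/century")
```

The adjusted series inherits the regional warming trend regardless of what the target station actually recorded.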

Second, I show bad adjustments like those at Stykkisholmur, this time from the Riverina region of Australia. This area has good, long temperature records, and has also been heavily irrigated, so it might be expected to show less warming than other areas. With a nonhomogenized method called AWAP, a surface fit of the temperature trend over the last century shows cooling in the Riverina (circled on map 1 below). A surface fit with the recently developed, homogenized ACORN temperature network (map 2) shows warming in the same region!

[Figure: maps of last-century temperature trends from (1) the nonhomogenized AWAP analysis and (2) the homogenized ACORN network]

Below are the raw minimum temperature records for four towns in the Riverina (in blue). The temperatures are largely constant or falling over the last century, as are those of their neighbors (in gray). The red line tracks the adjustments in the homogenized dataset, some of more than a degree, that have coerced the cooling trend in these towns into warming.


[Figure: raw minimum temperature records for four Riverina towns (blue), neighboring records (gray), and homogenization adjustments (red)]

It is not doubted that raw data contain errors. But independent estimates of the false alarm rate (FAR) using simulated data show that regional homogenization methods can exceed 50%, an unacceptably high rate that far exceeds the 5% or 1% error rates generally accepted in scientific work. Homogenization techniques are adding more errors than they remove.
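
The kind of simulation behind such FAR estimates can be sketched in a few lines (an illustrative reconstruction assuming a naive scan-all-breakpoints t-test, not the specific methods of the cited studies): generate difference series containing no inhomogeneity at all, flag the most significant candidate breakpoint, and count how often a nonexistent shift is “detected”. The observed rate lands far above the nominal 5%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, trials, alpha = 100, 1000, 0.05

false_alarms = 0
for _ in range(trials):
    d = rng.normal(0, 1, n)  # homogeneous series: no real break present
    # Scan every candidate breakpoint and keep the most significant split,
    # with no correction for the many tests performed.
    best_p = min(stats.ttest_ind(d[:k], d[k:]).pvalue
                 for k in range(10, n - 10))
    if best_p < alpha:
        false_alarms += 1  # a shift was "detected" where none exists

print(f"nominal rate: {alpha:.0%}   observed FAR: {false_alarms / trials:.0%}")
```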
The problem of latent circularity is a theme I developed for the hockey-stick in Reconstruction of past climate using series with red noise. The flaw common to the hockey-stick and homogenization is “data peeking”, which produces high rates of false positives, thus generating the desired result with implausibly high levels of significance.

Data peeking allows one to delete whatever data stand in the way of significance, to use random noise proxies to produce a hockey-stick shape, or, in the case of homogenization, to adjust a deviant target series into line with the overall trend.
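
The first kind of peeking is easy to demonstrate (a toy example of my own construction, not taken from the paper): start from pure noise, where there is nothing to find, and delete the least favorable observation until a standard t-test crosses the significance threshold.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = list(rng.normal(0, 1, 60))  # pure noise: the true mean is zero

# Peek at the data and delete the lowest value until "significant".
while stats.ttest_1samp(x, 0).pvalue >= 0.05 and len(x) > 10:
    x.remove(min(x))

print(f"observations kept: {len(x)}, mean: {np.mean(x):+.2f}, "
      f"p-value: {stats.ttest_1samp(x, 0).pvalue:.3f}")
```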

To avoid the pitfall of circularity, I would think the determination of adjustments would need to be completely independent of the larger trends, which would rule out most commonly used homogenization methods. The adjustments would also need to be far fewer, and individually significant, as errors no larger than the noise cannot be detected reliably.
