Sunday, 6 May 2012


                                  by Expert IPCC reviewer Dr Vincent Gray

APRIL 29th 2012



I come from a generation where scientists measured properties by means of an instrument and recorded the results in a notebook.

 I ran a weather station on the school roof from 1937 to 1939 at Latymer Upper School in Hammersmith, London. It was the idea of the Physics master, L R Middleton. We bought equipment from Griffin and Tatlock: a maximum/minimum thermometer, a wet and dry bulb thermometer, a screen to hold them in, a rain gauge, and a wind direction and strength indicator. The pride of the set was the sunshine recorder, which consisted of a beautiful glass sphere mounted in a frame. The sun burned a trace on a piece of blue card that told you the number of hours of sunshine.

We were very keen. We took it in turns to come in at weekends and holidays, and the measurements were put up on the bulletin board, together with a weather forecast every day.

All scientific measurements of the period were made in this way, and in some places they still are. We would often make our own apparatus, sometimes using Meccano, and we might even build our own instruments, such as an infrared analyzer or an ammeter. When I was at the Institut Pasteur in Paris in 1947 I supervised the construction of an apparatus for measuring the flow birefringence of liquids.

Slowly the observer has become separated from the actual measurement. You buy a black box which carries out the measurement; the result goes onto a "data logger" and then into a computer. Often the person who studies the result has never seen what is being measured, and even if he had seen it he cannot check it continuously. Then, instead of processing the measurements in such a way as to check their compliance with a theory, the computer does it for you, and you are now completely out of touch with the actual situation that has been measured. What comes out of that computer is now called "data". It is no longer possible to check what is being measured, whether it has changed, or what process has been carried out by the computer. It is easy to forget the errors, bias and uncertainties involved in each part of the process: has the instrument been calibrated, and how accurate was it? Perhaps the system was set up by a consultant or by a previous worker.

This applies even when the same measurement is being made over a period in the same place. The problems multiply when measurements from several places, taken with different equipment, are compiled together, and when the number of observations in each place differs every time a calculation is made.

There are no "data" from climate measurements which do not suffer from these handicaps. They are usually at least some sort of average, often from a skewed distribution of observations. The Mean Annual Global Surface Temperature Anomaly Record involves averages of averages subtracted from still more averages. Yet most of these data are presented as if they were constants, and the estimated uncertainties are unbelievable even when they appear.
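The layering of averages can be sketched in a few lines. The sketch below is purely illustrative, with made-up readings for a single hypothetical station: daily readings are averaged into monthly means, the monthly means are averaged into an annual mean, and a long-term baseline average is subtracted to give the "anomaly".

```python
# Sketch of how a temperature "anomaly" is built from layered averages.
# All station readings and the baseline figure here are hypothetical.

def mean(values):
    return sum(values) / len(values)

# Hypothetical daily readings (deg C) for one station, grouped by month.
daily_by_month = {
    "Jan": [14.2, 15.1, 13.8],
    "Feb": [15.0, 14.6, 15.3],
    "Mar": [16.1, 15.9, 16.4],
}

# First layer: monthly means. Second layer: an annual mean of those means.
monthly_means = {m: mean(days) for m, days in daily_by_month.items()}
annual_mean = mean(list(monthly_means.values()))

# Third layer: the "anomaly" subtracts yet another average, a
# hypothetical 30-year baseline mean for the same station.
baseline = 14.8
anomaly = annual_mean - baseline

print(round(annual_mean, 2))
print(round(anomaly, 2))
```

Each layer discards the spread of the numbers beneath it, so the single figure that emerges carries none of the variability of the original observations.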

Ordinary mathematical statistics cannot be used with this sort of "data" because they do not comply with the necessary assumption that the observations be uniform and symmetrical. This includes the facilities supplied on "scientific" calculators and computer spreadsheets, such as the calculation of the mean, the standard deviation and linear regression.
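As an illustration of the point about symmetry, the following sketch (with made-up numbers) shows how a single skewed observation pulls the mean well away from the median, even though a calculator or spreadsheet will report the mean without complaint:

```python
# Illustrative only: the mean presumes a symmetric spread of values,
# and a skewed sample quietly violates that presumption.
import statistics

symmetric = [10, 11, 12, 13, 14]       # mean and median agree here
skewed = [10, 10, 11, 11, 12, 12, 45]  # one large value skews the sample

print(statistics.mean(symmetric), statistics.median(symmetric))
print(statistics.mean(skewed), statistics.median(skewed))
```

For the symmetric sample the mean and median coincide; for the skewed sample the mean sits well above the median, so any "average" quoted from such data depends heavily on which statistic was used.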

 "Bayesian" statistics which is sometimes offered  instead, is merely a method of turning  one wrong result  into another wrong result.

The problems have long been realised by traditional meteorologists, and they have done their best to cope with them. The arrogance of so-called climate scientists in claiming to have overcome them is unwarranted.

Vincent Gray
Wellington 6035

"To kill an error is as good a service as, and sometimes better than, the  establishing of a new truth or fact" ~ Charles Darwin