The Australian Temperature Record Revisited Part 3: Remaining Sites

In Part 1 of this series I showed a 66.6% increase in the warming trend of Australian annual minimum temperatures caused by adjustments to the ‘raw’ data.  This was based on analysis of 83 of the 104 Acorn sites, as I restricted my study to those sites with at least 24 months of overlap between old and new stations within 30 kilometres.  I now turn to the remaining sites.

These remaining sites all have records shorter than the full 103 years, as I use only the longest available record from a single site, with no splicing to form a composite record.  I truncated the Acorn and ‘raw’ annual data to exactly the same start and end dates.  Trends calculated over these shorter periods are therefore exaggerated when expressed per 100 years.  As well, some of the records have enormous gaps.  Trends calculated for these sites showed much less warming bias than the 83 I first analysed: the mean difference in trend was +0.26 C, a 26.7% increase.  This is not a meaningful metric, however.  The crucial measure is the effect of the adjustments across the whole network.  To assess this, the temperature data must be converted to anomalies from the 1961-1990 mean.  This meant the loss of Tennant Creek PO, which has insufficient data in that period.
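A minimal sketch of how such an anomaly conversion can be done in Python follows (pandas assumed; the column names and input DataFrame are illustrative only, not the actual CDO or Acorn file layout, and not necessarily the exact workflow used here):

```python
import pandas as pd

def to_anomalies(df: pd.DataFrame) -> pd.Series:
    """Annual Tmin anomalies relative to the 1961-1990 base period.

    'year' and 'tmin' are hypothetical column names for one site's
    annual minimum temperature record.
    """
    base = df.loc[df["year"].between(1961, 1990), "tmin"]
    if base.count() == 0:
        # A site with no usable data in the base period (e.g. Tennant
        # Creek PO) cannot be converted and drops out of the comparison.
        raise ValueError("no data in the 1961-1990 base period")
    return df["tmin"] - base.mean()
```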

Here, then, is the result for 103 of the 104 Acorn sites.  Figure 1 shows the straight mean of minima anomalies for the 103 sites for which data can be compared, ‘raw’ vs Acorn.

Fig. 1: Mean of minima anomalies for 103 sites, ‘raw’ vs Acorn

The adjustments to the ‘raw’ data have the effect of increasing the trend in minimum temperatures from +0.7C per 100 years to +1.03C per 100 years, an increase of 47%.  Going by this plot, the increase is nearly half rather than two-thirds, which is still embarrassingly large.  However, large slabs of data are missing or unaccounted for.  I have zero confidence that the trend in minima is +0.63, +0.7, +1.0, +1.03, or any other figure, and an average trend for Australia is meaningless given the wide differences between regions.  By the way, with no stations missing, the warming bias in Victoria is still +350%.
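Trends like those quoted above can be computed with a simple least-squares fit to the annual anomaly series, expressed per 100 years.  A minimal sketch (not necessarily the exact method used here; the variable names are illustrative):

```python
import numpy as np

def trend_per_century(years, anoms):
    """Ordinary least-squares slope of an anomaly series, in C per 100 years."""
    years = np.asarray(years, dtype=float)
    anoms = np.asarray(anoms, dtype=float)
    mask = ~np.isnan(anoms)                       # skip missing years
    slope = np.polyfit(years[mask], anoms[mask], 1)[0]
    return slope * 100.0

# Illustrative use with the two network-mean series (names are hypothetical):
# raw_trend   = trend_per_century(years, raw_mean)     # about +0.7
# acorn_trend = trend_per_century(years, acorn_mean)   # about +1.03
# increase = (acorn_trend - raw_trend) / raw_trend * 100   # roughly 47%
```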

In my next post I will look at some of the ‘outlier’ sites with very large differences in trend.

4 Responses to “The Australian Temperature Record Revisited Part 3: Remaining Sites”

  1. Geoff Sherrington Says:

    So the negative and positive adjustments might average out to close to zero if time is ignored; but when you do the time series approach here, there is significant cooling of the old and warming of the new.
    Three questions come readily to mind.
    1. The BoM must be aware of this outcome, yet it does not seem to be publicised. Any ideas why not? Is it the same problem as the now-unfashionable HQ version suffered?
    2. The average of anomaly values (as you present graphically for Tmin above) is not the best estimator for some purposes. Weighted data are better in theory, with a choice to be made about what is weighted, often distance, so as to cover surrounding areas of different extent. But one might assume the same outcome would be found, cooling the old and warming the new, though the magnitude might be slightly different depending on the choice of parameters. Is it correct that the Acorn data are from gridded, weighted calculations?
    3. There will be cases where some form of adjustment can be justified, even demanded, from what is known of the data. The BoM might well argue that all of their adjustments fall into this category. However, adjustment of measured data, especially raw data, is not commonplace in other relevant disciplines. It is more customary to produce documented versions of data, so that the original data are preserved in their untouched form. I have never been able to work out whether the Climate Data Online figures that you are calling ‘raw’ are really raw data, or partly ‘homogenised’ already. Have you ever found that out?

  2. kenskingdom Says:

    Hi Geoff
    1. Don’t know. I believe it is the same old problem inherent in homogenising, except the Acorn method is meant to be objective, not subjective.
    2. The Acorn values in the graph are just straight means. However, I addressed this in Part 1, where I showed that the average of anomalies is almost identical to the area-weighted values for the Acorn data, so the same should apply to the ‘raw’ data for exactly the same sites over exactly the same periods.
    3. In Part 4 I will look at adjustments at some outlier sites. It is plain that some adjustments are necessary. I am certain that the CDO ‘raw’ data are not at all raw, but they are all we have. There are frequent examples of current temperature readings being altered because they are outside expected parameters. This is called ‘quality assurance’ and is the reason many months of daily data from last year and this year are still in italics, with monthly and annual values not calculated.

  3. Ian George Says:

    Firstly, congratulations on all your research and I hope it starts to convince others of the problem with our temp record.
    Interested in your comment re ‘some adjustments are necessary’.
    When the bushfires hit Dunalley (4th Jan, 2013) the temp recorded was around 58C. However, there is no temp recorded for that day on Dunalley’s CDO site (and rightfully so).
    It will be interesting to see what they will do now adjustment-wise, or they may just leave it blank. Being a short-term w/s, they may not bother.
    Now that Watts et al are finally convinced about Steve Goddard’s research into ‘dodgy’ US data, maybe we can start looking at our own.
