Archive for the ‘temperature’ Category

Australian Temperature Data Are Garbage

September 14, 2017

From the Bureau’s hastily published “Fast Facts”:

“This means that each one second temperature value is not an instantaneous measurement of the air temperature but an average of the previous 40 to 80 seconds.”

That is complete nonsense.

At the end of each minute, the following data are recorded:

  1. Lowest one second reading of the previous 60 seconds
  2. Highest one second reading of the previous 60 seconds
  3. Reading at the final second of the minute.

Firstly, 40 seconds is not one minute, the integration period recommended by the WMO in 2014 and by the Bureau’s own officers in 1997.  Anything less than 60 seconds is not compliant.

Secondly, consider this plot, which is from actual 1 minute temperatures recorded at Hervey Bay Airport on 22 February 2017.  (Data purchased by me from the Bureau).

Fig. 1:

Hervey Bay 1 min 5 to 7am 22 Feb

Sunrise was at about 5:40 a.m.  Temperatures do not increase until about 6:30 a.m.  Note the strangely low temperature- the daily minimum- which was reported as occurring sometime in the 60 seconds before 06:00:00.  The BOM would have us believe that each of the values in Figure 1, including the low of 23.2C, is an "average" of the previous 40 to 80 seconds.

Next consider what happens in that minute from 5:59 to 6:00, as per the following plot.

Fig. 2:

Hervey Bay 1 min 0559 to 0600am 22 Feb

We don't know in which seconds the high and low readings for that minute occurred, so I have shown them across all 59 intervening seconds.  I have also shown the 5:59 and 6:00 readings: both were 25.3C.

Consider how the value at 06:00 was obtained:

If by an “average” (however derived) of less than 60 seconds, the methodology is non-compliant.

If by an “average” of the previous 60 seconds, it must include values that contributed to the High of 25.4C and the Low of 23.2C.

If by an “average” of anything greater than 60 seconds, it must include values that contributed to both the Low and High values, and as well, values that contributed to the 5:59 reading- which is the same as the 06:00 reading.

Similar logic applies to the Low and High readings.

It follows that the intermediate instantaneous atmospheric temperatures that contributed to all three reported “average” values must have ranged from much higher than 25.4C to very much lower than 23.2C.
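
To see just what an "averaging" explanation implies, here is a minimal worked sketch in Python.  The second-by-second values are hypothetical (the Bureau does not publish them); only the 23.2C and 25.3C figures come from the Hervey Bay record, and a 60 second averaging window is assumed.

```python
# Hypothetical illustration - not actual one-second data, which the BOM does not publish.
# If the reported Low of 23.2C were really a 60-second average, and the temperature sat
# at 25.3C for part of that minute, the remaining seconds would have to average far
# below 23.2C to drag the mean down.

reported_low_avg = 23.2   # the value claimed to be a 40-80 second "average"
steady_reading   = 25.3   # the 5:59 and 6:00 final-second readings
window           = 60     # assume a 60-second averaging window

for steady_seconds in (10, 30, 50):
    dip_seconds = window - steady_seconds
    required_dip_mean = (reported_low_avg * window
                         - steady_reading * steady_seconds) / dip_seconds
    print(f"{steady_seconds:2d}s at 25.3C -> remaining {dip_seconds}s must average "
          f"{required_dip_mean:.1f}C")

# 10s at 25.3C -> remaining 50s must average 22.8C
# 30s at 25.3C -> remaining 30s must average 21.1C
# 50s at 25.3C -> remaining 10s must average 12.7C
```

The longer the air actually sat near 25.3C within that minute, the more extreme the dip needed to produce a 23.2C "average".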

Look at Figure 1 again.  The air temperature at Hervey Bay on 22 February must have spiked down very much lower than the 23.2C plotted.

Really?

In the early morning there is very little near ground turbulence so temperatures do not fluctuate from one minute to the next by very much.  In How Temperature Is Measured in Australia Part 2 I showed that 91% of low temperatures vary from final second temperatures in the same minute by 0.2C or less.  A difference of 2.1C is extraordinary.  Fluctuations greater than that are difficult to believe.

However, in a comment at How Temperature Is Measured In Australia Part 1, Tony Banton, a retired meteorologist, says that the BOM explanation of cooler ground level air mixing upwards is correct.  If we accept that explanation, we must then face the problem of “comparability”.

In 61 seconds, the Hervey Bay AWS has reported temperatures of 25.3, 25.4, 23.2, and finally 25.3 degrees.  The BOM asserts that a liquid-in-glass thermometer will be able to respond as quickly and show similar temperatures- and remember, 23.2C was the morning’s official minimum.

My response: rubbish.  The data for 22 February at Hervey Bay show that no averaging is used at all, and the Low Temperature of 23.2C is an instantaneous one second recording from a rogue downwards spike, whatever the cause, whether a natural event or other (e.g. electrical) factor.

Temperatures reported by the BOM are not fit for the purposes of accurately reporting maxima and minima, identifying records, or identifying warming or cooling by comparison with historic liquid-in-glass data.


Watch an AWS Fail

August 30, 2017

(With thanks to Lance, Phill, and others)

A week ago, a colleague alerted me to strange behaviour at an Automatic Weather Station at Borrona Downs in NSW.  This is a brand new weather station, with its first observation on 21 July.

Phill writes in an email:  Do you ever wonder why you get a shiver down your spine?  Pity the poor folks in the NSW far west.  

From this morning's (20th August) NSW observation list: The minimum temperature at Borrona Downs AWS was -62.5C at 9:59pm last night.  Probably some clowns with a bucket of dry ice or liquid nitrogen.  Perhaps Odin's host crossed the night sky or maybe death just walked on by…  The individual reads don't show anything lower than -37.5C, also at 9:59, so the cold spike was quite sudden.  It went from -62.5C sometime between 21:58:00 and 21:59:00, to -37.5C at exactly 21:59:00, to -4.4C at 22:00:00.

I was too busy and preoccupied until now to follow this up, but I have a few days now.

Borrona Downs Station is in sandhill and claypan country in the far northwest of NSW:

Borrona Dns map

Borrona Dns aerial

Here is the Climate Data Online minima record (note minima indicated on two days):

Borrona Dns cdo

The following plots show the deterioration in the performance of the AWS.  Firstly, a comparison with Tibooburra, 110 km away, shows a sudden change on 29 July: subtracting Borrona Downs data from Tibooburra data shows that Borrona Downs Tmin is too high from this date onwards.  The whole (brief) record should be scrapped.

Borrona Dns Tibooburra comp
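
For anyone who wants to reproduce this kind of neighbour check, a minimal sketch is below.  The file names and column labels are assumptions, not the Bureau's; the method is simply differencing the two daily minima series and looking for a sustained shift.

```python
import pandas as pd

# Hypothetical file and column names - substitute whatever your Climate Data Online
# exports use.  The check is just: Tibooburra Tmin minus Borrona Downs Tmin, by date.
borrona = pd.read_csv("borrona_downs_tmin.csv", parse_dates=["date"]).set_index("date")
tibo    = pd.read_csv("tibooburra_tmin.csv",    parse_dates=["date"]).set_index("date")

diff = tibo["tmin"] - borrona["tmin"]   # a sustained shift in this series flags a problem
print(diff.describe())
print(diff[diff.abs() > 2.0])           # list days where the two sites differ by more than 2C
```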

But the devil, as Phill found, is in the detail.  Here is part of the record for the 19th.  Note the Low Temp at 9.59 pm; I have also indicated the official minimum for the day, which would have occurred early that morning.

Borrona Dns 19 Aug

The Bureau has the minimum at 4.6C, but how was this value obtained?  The erroneous values (including the "liquid nitrogen" reading) are flagged, then manually removed, and the next lowest temperature is retrieved from the one minute data for the day.  This also happened on the 26th:

Borrona Dns 26 Aug

Things got much worse on August 27th:

Borrona Dns 27 Aug

Why could no minimum be found?  Did the BOM realise that none of the data were reliable, and that they were essentially random errors?  Remember that the AWS records values every second, and the highest, lowest, and final second values for each minute are stored.  My guess is that many of these values were unreliable as well, even though many of the final-second half-hourly values seem reasonable- for example 4.4C at 5.30 am.
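
The procedure the Bureau appears to follow- flag the impossible values, then report the lowest surviving one-minute reading- can be sketched in a few lines.  The plausibility threshold and the sample values below are assumptions for illustration only.

```python
# Sketch of the apparent quality-control step: discard physically impossible one-minute
# lows, then report the lowest surviving value as the daily minimum.
one_minute_lows = [4.6, 5.1, -37.5, -62.5, 6.0, 7.2]   # degrees C, illustrative values

PLAUSIBLE_FLOOR = -15.0   # assumed threshold: anything below this is a spurious spike
valid = [t for t in one_minute_lows if t >= PLAUSIBLE_FLOOR]

daily_min = min(valid) if valid else None    # None when nothing survives, as on the 27th
print("flagged:", [t for t in one_minute_lows if t < PLAUSIBLE_FLOOR])
print("reported minimum:", daily_min)
```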

This continued on August 28th, with an all-time low of -69.5C:

Borrona Dns 28 Aug

And the BOM ceased reporting values at 3:30 pm.

This description of events was confirmed by the Bureau’s response to a query:

“Do you know what is causing the very low temperature recordings?

There is a hardware fault within the AWS which is generating spurious values. The Bureau’s technicians are investigating but a site visit will be required.

Why was the August 19 low temperature recording not left blank?

Manual quality checking confirmed that the spiking on 19 August did not occur near the minimum  temperature for that day, as a result, the minimum temperature was recorded.”

This raises the question: is this what happened at Goulburn Airport on 2 July?  The initially reported figure of -10.4C was flagged as suspicious, so the previous lowest temperature of -10C was then reported; then this was removed, and then the initial -10.4C was reinstated.  Perhaps.

-10.4C certainly should not have been flagged as too low for that location, as many other values below -10C have been observed, including the record -10.9C recorded on 17 August 1994.  However, perhaps it was flagged as suspicious by comparison with the series of values before and after: too large a change in temperature from second to second.  But if so, why didn't the BOM CEO just say so, instead of getting tangled in a web of conflicting explanations?

The AWS at Borrona Downs has failed.  So has the Bureau of Meteorology.

 

Garbage In, Garbage Out

August 7, 2017

(By Ken Stewart, assisted by Bill Johnston and Phill Goode; and cross-posted with Jo Nova)

Early ABC Radio news bulletins last Wednesday morning were led by this item, which you can read in full at ABC Online.

More climate scientists needed to avoid expensive mistakes, review urges

Apparently we urgently need 77 climate scientists to predict the future of areas like the Murray-Darling Basin with climate modelling.

Interestingly, Professor McDougall of the Australian Academy of Science points out that one of those “expensive mistakes” was the $2 billion desalination plant built in Queensland as a response to the millennium drought, “which really wasn’t an indication of climate change at all”.   Why didn’t the good professor raise his voice before the money was wasted?

But I digress.

Reliable modelling and projections for the future are surely desirable.

But such modelling must be based on reliable data, and the reliability of temperature data in Australia is demonstrably poor.

Example 1:  As has been widely reported in The Australian, and by Jennifer Marohasy and Jo Nova, cold temperatures at two separate sites (and possibly many others) were altered to appear warmer, then changed back, then deleted.  The BOM gave two conflicting explanations, both of which cast grave doubt on the reliability of “raw” temperature data from an unknown number of stations.

Example 2:  After I enquired why there are frequently two different temperature readings for exactly the same minute at various weather stations, a Bureau spokesperson told me that:

Firstly, we receive AWS data every minute. There are 3 temperature values:
1. Most recent one second measurement
2. Highest one second measurement (for the previous 60 secs)
3. Lowest one second measurement (for the previous 60 secs)

(See here and here.)

In other words, Australian maximum and minimum temperatures are taken from ONE SECOND readings from Automatic Weather Stations.  Spikes due to localised gusts of hot air, or instrument error, become the maximum for the day.  (This rarely has a large effect on minima, as night time temperatures are fairly smooth, whereas during the day temperature bounces rapidly up and down.  This is shown in this plot of temperatures at Thangool Airport in Queensland on Australia Day this year.)

Thangool 26 Jan 17 1 min

And this is for the same day between 3.00pm and 4.00pm.

Thangool 26 Jan 17 3 to 4pm

As you can see, the temperature spikes up and down in the heat of the day by up to one degree between one minute and the next.  And these are only the temperatures at the final second of each minute: during the intervening 59 seconds the temperature is spiking up and down as well.  We know this because occasionally the highest or lowest temperature for the day occurs in the same minute as a final-second recording on the BOM database (usually on the hour or half hour), and the change can be up or down by two or three degrees in less than 60 seconds.

This is in contrast to the rest of the world.  The WMO recommends that 1 minute (60 second) averages of temperature be recorded to combat this very problem of noisy data, and this is followed in the UK.  In the USA, 5 minute (300 second) averages are calculated.
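
The difference between the two approaches is easy to demonstrate with a toy series of one-second readings.  The numbers below are synthetic (a steady afternoon with a single two-second spike), not BOM data; they simply show how a spot one-second maximum and a one-minute-average maximum treat the same spike.

```python
# Synthetic one-second temperatures: a steady ~31.0C afternoon with a brief 2-second
# spike to 32.4C (a gust of hot air off the tarmac, or an instrument glitch).
seconds = 600                                  # ten minutes of 1 Hz data
temps = [31.0] * seconds
temps[300] = 32.4
temps[301] = 32.3

# Australian method: the maximum is simply the highest one-second reading.
spot_max = max(temps)

# WMO-style method: average each minute first, then take the highest minute.
minute_means = [sum(temps[i:i + 60]) / 60 for i in range(0, seconds, 60)]
averaged_max = max(minute_means)

print(f"highest 1-second reading : {spot_max:.1f}C")      # 32.4C - the spike IS the maximum
print(f"highest 1-minute average : {averaged_max:.2f}C")  # ~31.05C - the spike barely registers
```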

From THE WEATHER OBSERVER’S HANDBOOK by Stephen Burt (Cambridge University Press, 2012):

Observers handbook

Even without software or human interference as in Example 1, this means Australian temperature data, in particular maxima, are not reliable.

Example 3:  Historically, temperatures were observed from Liquid In Glass (LIG) thermometers.  From the 1990s, Automatic Weather Stations (AWS) were installed using Platinum Resistance Thermometers (PRT) and are now the source for daily data.  AWS thermometers are very precise, but as I showed in Example 2, their data are used idiosyncratically to record 1 second spikes, frequently resulting in higher maxima and, less often, slightly lower minima than a 1 or 5 minute average would give.

One would think that with such a major change in technology there would be comparative studies reported in the BOM’s meteorological journal or other “peer reviewed” literature.  Apparently not.

Dr Bill Johnston has investigated this and says:

Parallel data were collected all over Australia for over a decade, some until last year when thermometers were removed, at manned sites, mainly major airports (Ceduna, Sydney, Hobart, Perth, Darwin, Alice Springs, Albany, Norfolk Island, Wagga to name a few) and also met-offices such as Cobar and Giles. However, comparisons between screens were done at one site only (Broadmeadows, Melbourne, which is not even an official weather station), using PRT only, and reported as a "preliminary report", which is available (https://www.wmo.int/pages/prog/www/IMOP/WebPortal-AWS/Tests/ITR649.pdf). However, after AWS became primary instruments, as I've reported before, the Bureau had an internal policy that parallel liquid-in-glass thermometer data were not databased. Furthermore, they had another policy that paper data were destroyed after 2 years. So there is nothing that is easily available…. There is also no multi-site replicated study involving screen types and thermometers vs. PRT probes ….

Deliberate destruction of data is scandalous; the only way now to compare Automatic Weather Stations (AWS) and liquid-in-glass thermometers is to hunt for sites where there is overlap between two stations, where the AWS is given a new number. This is possible, BUT the problem is that the change-over is invariably confounded with either a site move or the change to a small screen.

Therefore we suspect that the introduction and reliance on AWS has led to artificially higher maxima (and thus record temperatures) than in the past, but we have no way of knowing for sure or how much.

So we now have (1) temperatures that are altered before they even become ‘raw’ data; (2) use of one second spikes for recording daily maximum and minimum temperatures, very probably resulting in artificially high maxima and slightly lower minima; and (3) no way of telling how the resulting data compare with those from historical liquid-in-glass thermometers.

How can the CSIRO hope to produce reliable climate modelling with any number of climate scientists when the BOM cannot produce reliable temperature data?  Garbage in, garbage out.

The Pause Update: May 2017

June 7, 2017

The complete UAH v6.0 data for May have been released. I present all the graphs for various regions, and as well summaries for easier comparison. I also include graphs for the North and South Temperate regions (20-60 North and South), estimated from Polar and Extra-Tropical data.

The Pause has ended globally and for all regions including the USA and the Southern Hemisphere, except for Southern Extra-Tropics, South Temperate, South Polar, and Australia. The 12 month mean to May 2017 for the Globe is +0.35 C – down 0.01C from April.

These graphs show the furthest back one can go to show a zero or negative trend (less than 0.1 +/-0.1C per 100 years) in lower tropospheric temperatures. I calculate 12 month running means to remove the small possibility of seasonal autocorrelation in the monthly anomalies. Note: The satellite record commences in December 1978- now 38 years and six months long- 462 months. 12 month running means commence in November 1979. The y-axes in the graphs below are at December 1978, so the vertical gridlines denote Decembers. The final plotted points are May 2017.
[CLICK ON IMAGES TO ENLARGE]
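
For readers who want to check the method, here is a minimal sketch of the Pause calculation.  The criterion is the one stated above- the earliest start month from which the linear trend of the 12 month running means is below +0.1C per 100 years- and the synthetic series at the end merely stands in for a UAH regional column.

```python
import numpy as np

def running_12m(anoms):
    """12-month running means of a monthly anomaly series."""
    return np.convolve(anoms, np.ones(12) / 12, mode="valid")

def pause_start(anoms, threshold=0.1):
    """Earliest index from which the least-squares trend of the running means,
    in C per 100 years, is below `threshold`; None if no such start exists."""
    means = running_12m(anoms)
    for start in range(len(means) - 24):           # require at least two years of data
        y = means[start:]
        slope = np.polyfit(np.arange(len(y)), y, 1)[0] * 1200   # C/month -> C/100 years
        if slope < threshold:
            return start
    return None

# Usage sketch with 462 months of synthetic data standing in for UAH anomalies:
rng = np.random.default_rng(0)
synthetic = np.concatenate([np.linspace(-0.3, 0.2, 230), np.full(232, 0.2)])
synthetic = synthetic + rng.normal(0, 0.1, 462)
print("Pause starts at running-mean index:", pause_start(synthetic))
```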

Globe:

Pause May 17 globe

The Pause has ended. A trend of +0.46 C/100 years (+/- 0.1C) since March 1998 is creeping up, but the 12 month means have peaked and are heading down.

And, for the special benefit of those who think that I am deliberately fudging data by using 12 month running means, here is the plot of monthly anomalies:

Pause May 17 globe mthly

Northern Hemisphere:

Pause May 17 NH

The Northern Hemisphere Pause has well and truly ended.

Southern Hemisphere:

Pause May 17 SH

The Pause has ended but temperatures for the last 19 years are rising very slowly.

Tropics:

Pause May 17 Tropics

The Pause in the Tropics (20N to 20S) has ended and the minimal trend is now +0.47C/ 100 years. 12 month means are dropping fast.

Northern Extra Tropics:

Pause May 17 NExt

Northern Temperate Region:

Pause May 17 NTemp

Using estimates calculated from North Polar and Northern Extra-Tropics data, the slowdown is obvious.

Southern Extra Tropics:

Pause May 17 SExt

The Pause has weakened but still persists, and 12 month means have peaked.

Southern Temperate Region:

Pause May 17 STemp

Using estimates calculated from South Polar and Southern Extra-Tropics data, the Pause likewise persists.

Northern Polar:

Pause May 17 NP

The trend has increased rapidly and will continue to do so even though 12 month means have started to fall.  The horizontal black lines show the pause for the first 16 years, and the pause from 2003 – 2015.  The strong trend in Arctic temperatures is due to a step change from 1995 – 2002, and the strong 2015 – 2016 El Nino.

Southern Polar:

Pause May 17 SP

The South Polar region has been cooling (-0.14C) for the entire record. Although the 12 month means may have peaked, this cooling trend will slow over the next few months, and Global Warming Enthusiasts may start to get excited.

USA 49 States:

Pause May 17 USA49

The Pause has ended. It will not re-appear for some time.  And by the way, that is almost entirely due to Alaska: here’s the plot without Alaska:

Pause May 17 USA48

Paused!  But that could disappear as well.

Australia:

Pause May 17 Oz

The Pause has shortened dramatically, but is still 19 years 9 months- over half the record.   And the trend since September 1995, two years longer, is less than +0.2C.

The next graphs summarise the above plots. First, a graph of the relative length of The Pause in the various regions:

Pause Length May 17

Note that the Pause has ended by my criteria in all regions of the Northern Hemisphere, and consequently the Globe, and in the Tropics, but the southern regions- apart from the Southern Hemisphere as a whole- have had a Pause for over half the record, including the South Polar region, which has been cooling for the whole record. Note that the Tropics' influence has been enough to end the Pause for the Southern Hemisphere.

The variation in the linear trend for the whole record, 1978 to the present:

Pause May 17 trends 78

Note the decrease in trends from North Polar to South Polar.

And the variation in the linear trend since June 1998, which is about halfway between the global low point of December 1997 and the peak in December 1998:

Pause May 17 trends 98

For 19 years “global” warming has been dominated by the influence of the Tropics and North Polar regions.

The imbalance between the two hemispheres is obvious.

The lower troposphere over Australia has been strongly cooling for those 19 years- over half the record.

The Pause has disappeared from the USA and the Southern Hemisphere, but not the Southern Extra-Tropics, South Temperate, and South Polar regions, or Australia. El Nino tropical heat is rapidly decreasing, with all means falling. The next few months will be interesting.

How Temperature is “Measured” in Australia: Part 2

March 21, 2017

By Ken Stewart, ably assisted by Chris Gillham, Phillip Goode, Ian Hill, Lance Pidgeon, Bill Johnston, Geoff Sherrington, Bob Fernley-Jones, and Anthony Cox.

In the previous post of this series I explained how the Bureau of Meteorology presents summaries of weather observations at 526 weather stations around Australia, and questioned whether instrument error or sudden puffs of wind could cause very large temperature fluctuations in less than 60 seconds observed at a number of sites.

The maximum or minimum temperature you hear on the weather report or see at Climate Data Online is not the hottest or coldest hour, or even minute, but the highest or lowest ONE SECOND VALUE for the whole day.  There is no error checking or averaging.

A Bureau officer explains:

Firstly, we receive AWS data every minute. There are 3 temperature values:
1. Most recent one second measurement
2. Highest one second measurement (for the previous 60 secs)
3. Lowest one second measurement (for the previous 60 secs)

Relating this to the 30 minute observations page: For an observation taken at 0600, the values are for the one minute 0559-0600.

Automatic Weather Station instruments were introduced from the late 1980s, with the AWS becoming the primary temperature instrument at a large number of sites from November 1 1996.  They are now universal.

An AWS temperature probe collects temperature data every second; there are 60 datapoints per minute.  The values given each half hour (and occasionally at times in between) at each station's Latest Weather Observations page are samples: spot temperatures for the last second of the last minute of that half hour.  The Low Temp or High Temp values on the District Summary page are the lowest and highest one second readings within the minute of reporting.  The remaining seconds of data are filtered out.  There is no averaging to find the mean over, say, one minute or ten minutes.  There is NO error checking to flag rogue values.  The maximum temperatures are dutifully reported in the media, especially if some record has been broken.  Quality control does not occur for at least two or three months, and then spurious values are just quietly deleted, long after record temperatures have been spruiked in the media.

In How Temperature is “Measured” in Australia: Part 1 I demonstrated how this method has resulted in large differences recorded in the exact same minutes at a number of stations.

What explanation is there for these differences? 

The Bureau will insist they are due to natural weather conditions.  Some rapid temperature changes are indeed due to weather phenomena.  Here are some examples.

In semi-desert areas of far western Queensland, such as in this example from Urandangi, temperatures rise very rapidly in the early morning.

Fig. 1:  Natural rapid temperature increase

urandangi

For 24 minutes the temperature was increasing at an average of more than 0.2C per minute.  That is the fastest I’ve seen, and entirely natural- yet at Hervey Bay on 22 February the temperature rose more than two degrees in less than a minute, before 6 a.m., many times faster than it did later in the morning.

Similarly, on Wednesday 8 March, a cold change with strong wind and rain came through Rockhampton.  Luckily the Bureau recorded temperatures at 4:48 and 4:49 p.m., and in that minute there was a drop of 1.2C.

Fig. 2:  Natural rapid temperature decrease

Rocky 8 March

That was also entirely natural, and associated with a weather event.

For the next plots, which show questionable readings, I have supplemented BOM data with data from an educational site run by the UK Met Office: WOW (Weather Observations Website).  The Met Office gets data from the BOM at about 10 minutes before the hour, so we have an additional source which increases the sampling frequency.  The examples selected are all well-known locations in Queensland, frequently mentioned on ABC TV weather.  They have been selected purely because they are examples of large one minute changes.

This plot is from Thangool Airport near Biloela, southwest of Rockhampton, on Friday 10 March.  The weather was fine, sunny, and hot, with no storms or unusual weather events.

Fig. 3:  Temperature spike and rapid fall at Thangool

Thangool 10 march

This one is for Coolangatta International Airport on the Gold Coast on 20th February.

Fig. 4:  Temperature spike and rapid fall at Coolangatta

Coolangatta 20 Feb bom met

And Maryborough Airport on 15th February:

Fig. 5:  Temperature spike and rapid fall at Maryborough (Qld)

Mboro 15 Feb

Figure 5(b):  The weirdest spike and fall:  Coen Airport 21 March 

Coen 21 March

Thanks to commenter MikeR for finding that one.

All of these were in fine sunny conditions in the hottest part of the day.  It is difficult to imagine a natural meteorological event that would cause such rapid fluctuations- in particular rapid falls- as in the above examples.  It is possible they were caused by some other event such as jet blast or prop wash blowing hotter air over the probe during aircraft movement, quickly replaced by air at the ambient surrounding temperature.  It is either that or random instrument error.  Either way, the result is the same: rogue outliers are being captured as maxima and minima.

How often does this happen?

Over one week I collected 200 instances where the High Temps and Low Temps could be directly checked as they occurred in the same minute as the 30 minute observation.
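
The comparison itself is simple arithmetic; a sketch of the tallying step is below.  The list of (High Temp, final second) pairs is hypothetical, standing in for the 200 collected cases.

```python
from collections import Counter

# Hypothetical (High Temp, final-second) pairs for minutes where the daily High Temp
# fell exactly on the hour or half hour, so both values are known.
cases = [(33.5, 33.3), (30.1, 30.0), (29.8, 29.3), (35.2, 33.1), (31.0, 30.9)]

# Difference = how far the temperature fell from the one-second High Temp to the
# final second of the same minute, binned to 0.1C.
diffs = [round(high - final, 1) for high, final in cases]
bins = Counter(diffs)

for size in sorted(bins, reverse=True):
    print(f"fall of {size:.1f}C within one minute: {bins[size]} case(s)")
```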

The results are astounding.  The differences occurring in readings in the same minute are scattered across the range of temperatures.  Most High Temp discrepancies are of 0.1 or 0.2 degrees, but there is a significant number (39% of the sample) with 0.3C to 0.5C decreases in less than one minute, and five much larger.

Fig. 6:  Temperature change within one minute from maximum

Count diffs hi T graph

Notice that 95% of the differences were from 0.1C to 0.5C, which suggests that one minute ranges of up to 0.5C are common and expected, while values above this are true outliers.  The Bureau claims (see below) that in 90% of cases AWS probes are within a tolerance of +/-0.2C, whereas the 2011 Review Panel mentioned "the present +/- 0.5 °C".  Is the tolerance really +/-0.5C?

Fig. 7:  Temperature change within one minute from minimum

Count diffs lo T graph

There was one instance where there was no difference.  The vast majority have a -0.1C difference, which is within the instruments’ tolerance.

This next plot shows the differences (temperature falls in one minute from the second with the highest reading to that of the final second) ordered from greatest to least.

Fig. 8:  Ordered count of temperature falls

Count diffs hi T

The few outliers are obvious.  More than half the differences are of 0.1C or 0.2C.

One minute temperature rises:

Fig. 9:  Ordered count of temperature rises

Count diffs lo T

Note the outlier at -2.1C: that was Hervey Bay Airport.  Also note only one example with no difference, and the majority at -0.1C.

Is there any pattern to them? 

The minimum temperature usually occurs around sunrise, although in summer this varies, but very rarely when the sun is high in the sky.  Therefore rapid temperature rises at this time will be relatively small, as the analysis shows: 80% of the differences between the Low Temps and corresponding final second observations were zero or one tenth of a degree, and 91% were two tenths of a degree or less.  As the instrument tolerance of AWS sensors is supposed to be +/- 0.2C, the vast majority of Low Temps are within this range, and therefore not significantly different from the Latest Observation figures.  Yet because it is the lowest temperature that is recorded, all but one example have the Low Temp, and therefore the daily minimum, cooler than the final second observation.  9% are outside the +/-0.2C range and show a real discrepancy- very rapid temperature rise within one minute- that is worth investigating.  Remember, the fastest morning rise I've found averaged about 0.2C per minute.

The High Temps have 56% of discrepancies within the +/-0.2C tolerance range.  Daytime temperatures are much more subject to rapid rises and falls.  The 44% of discrepancies of 0.3C or more are worth investigation.  Many are likely due to small localised air temperature changes, to which AWS probes are very sensitive, but the rapid decreases shown in the examples above, as well as the rapid rises in the Low Temp examples, mean that random noise is likely to be a factor as well.

Have they affected climate analysis? 

Comparison of values at identical times has shown that out of 200 cases, all but one had higher or lower temperatures at some previous second than at the last second of that minute, with a significant number of High Temp observations (39% of the sample) with 0.3C to 0.5C decreases in less than one minute, and five much larger.  There is a very high probability that similar differences occur at every station in every state, every day.

In more than half of the sample of High Temps, and over 90% of the Low Temps, the discrepancy was within the stated instrumental tolerance range, and therefore the values are not significantly different, but the higher or lower reading becomes the maximum or minimum, with no tolerance range publicised.

This would of course be an advantage if greater extremes were being looked for.

Nearly 10 percent of minimum temperatures were followed by a rise of more than 0.2C, and 44 percent of maxima were followed by a fall of more than 0.2C.  While many of these may have entirely natural causes, none of the very large discrepancies examined had an identifiable meteorological cause.   It is questionable whether mercury-in-glass or alcohol-in-glass thermometers used in the past would have responded as rapidly as this.  This must make claims for record temperatures questionable at best.

If you think that the +/- 0.2C tolerance makes no difference in the big picture, as positives will balance negatives and errors will resolve to a net of zero, think again.  Maximum temperature is the High Temp value for the day, and 44% of the discrepancies were more than +0.2C.  If random instrument error is causing the apparent temperature spikes (and downwards spikes in the hot part of the day are not reported unless they show up in the final second of the 30 minute reporting period), only the highest upwards spike, with or without positive error, is reported.  Negative error can never balance any positive error.
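
The asymmetry is easy to demonstrate with a toy simulation: add symmetric, zero-mean noise to one-second readings and see what taking the daily maximum does.  The noise level and the smooth "true" temperature curve below are assumptions for illustration, not BOM figures.

```python
import numpy as np

rng = np.random.default_rng(1)
seconds_per_day = 86400
days = 200

# A smooth "true" daily temperature cycle peaking at 35C (purely illustrative).
t = np.linspace(0, 2 * np.pi, seconds_per_day)
true_temp = 27.5 + 7.5 * np.sin(t - np.pi / 2)

bias = []
for _ in range(days):
    noise = rng.normal(0.0, 0.2, seconds_per_day)   # symmetric error, mean zero
    measured_max = (true_temp + noise).max()        # highest one-second value, BOM-style
    bias.append(measured_max - true_temp.max())

print(f"mean excess of recorded maximum over true maximum: {np.mean(bias):.2f}C")
# The recorded maximum is systematically HIGHER than the true maximum: taking the
# maximum keeps only the largest positive errors, and negative errors never offset them.
```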

Further, these very precise but questionable values then become part of the climate monitoring system, either directly if they are for ACORN stations, or indirectly if they are used to homogenise “neighbouring” ACORN stations. They also contribute to temperature maps, showing for example how hot New South Wales was in summer.

Again, temperature datasets in the ACORN network are developed from historic, not very precise, but (we hope) fairly accurate data from slow-response mercury-in-glass or alcohol-in-glass thermometers observed by humans, merged with very precise but possibly unreliable, rapid-response, one second data from Automatic Weather Stations.  The rapid response and one second sampling mean that temperatures measured by AWS probes are likely to be some tenths of a degree higher or lower than those from LIG thermometers in similar conditions, and the higher proportion of High Temp differences shown above, relative to Low Temp differences, will lead to higher maxima and means in the AWS era.  Let's consider maxima trends:

Fig. 10:  Australian maxima 1910-2016

graph max trend

There are no error bars in any BOM graph.  Maxima across Australia as a whole have increased by about 0.9 C per 100 years according to the Bureau, based on analysis of ACORN data.  Even if across the whole network of 526 automatic stations the instrument error is limited to +/- 0.2C, that is 22.2% of the claimed temperature trend.  In the past, indeed as recently as 2011 (see below), instrument error was as high as +/-0.5C, or about half of the 107 year temperature increase.  No wonder the Bureau refuses to show error bands in its climate analyses.

There have been NO comparison studies published of AWS probes and LIG thermometers side by side.  Can temperatures recorded in the past from liquid-in-glass thermometers really be compared with AWS one second data?  The following quotes are from 2011, when an Independent Review Panel gave its assessment of ACORN before its introduction.

Report of the Independent Peer Review Panel p8 (2011)

Recommendations: The Review Panel recommends that the Bureau of Meteorology should implement the following actions:

A1 Reduce the formal inspection tolerance on ACORN-SAT temperature sensors significantly below the present ±0.5 °C. This future tolerance range should be an achievable value determined by the Bureau’s Observation Program, and should be no greater than the ±0.2 °C encouraged by the World Meteorological Organization.

A2 Analyse and document the likely influence if any of the historical ±0.5 °C inspection tolerance in temperature sensors, on the uncertainty range in both individual station and national multidecadal temperature trends calculated from the ACORN-SAT temperature series.

And the BoM Response: (2012)

… … …   An analysis of the results of existing instrument tolerance checks was also carried out. This found that tolerance checks, which are carried out six-monthly at most ACORN-SAT stations, were within 0.2 °C in 90% of cases for automatic temperature probes, 99% of cases for mercury maximum thermometers and 96% of cases for alcohol minimum thermometers.

These results give us a high level of confidence that measurement errors of sufficient size to have a material effect on data over a period of months or longer are rare.

This confirms that LIG thermometers have more reliable accuracy than automatic probes, and that 10% of AWS probes- that is, at more than 50 sites- are not sufficiently accurate.  If those are in remote areas, their inaccuracy will have an additional large effect on the climate signal.  It is to be hoped that Alice Springs, which contributes 7-10% of the national climate signal, is not one of them.

Conclusion

It is very likely that one minute differences like the 199 found in a sample of 200 high and low temperature reports are also occurring every day at every weather station across Australia.  It is very likely that nearly half of the High Temp cases differ by more than 0.2 degrees Celsius.

Maxima and minima reported by modern temperature probes are likely to be some tenths of a degree higher or lower than those reported historically using Liquid-In-Glass thermometers.

Daily maximum and minimum temperatures reported at Climate Data Online are just noise, and cannot be used to determine record high or low temperatures.

These problems are affecting climate analyses directly if they are at ACORN sites, or indirectly, if they are used to homogenise ACORN sites, and may distort regional temperature maps.

Instrument error may account for between 22% and 55% of the national trend for maxima.

A Wish List of Recommendations (never likely to be adopted):

That the more than 50 sites at which AWS probes are not accurate to +/- 0.2 degree Celsius be identified and replaced with accurate probes as a matter of urgency.

That the Bureau show error bars on all of its products, in particular temperature maps and time series, as well as calculations of temperature trends.

That the Bureau of Meteorology recode its existing three criteria filter, to zero-out spurious spikes and preferably send them as fault flags into a separate file in order to improve Quality Control.

That the Bureau replace its one second spot maxima and minima  reports with a method similar to wind speed reports: the average over 10 minutes.  That would be a much more realistic measure of temperature.

How Temperature Is “Measured” in Australia: Part 1

March 1, 2017

By Ken Stewart, ably assisted by Chris Gillham, Phillip Goode, Ian Hill, Lance Pidgeon, Bill Johnston, Geoff Sherrington, Bob Fernley-Jones, and Anthony Cox.

The Bureau of Meteorology maintains the Southern Oscillation Index (SOI), one of the most useful climate and weather records in the world.  In About SOI,  the Bureau says:

 Daily or weekly values of the SOI do not convey much in the way of useful information about the current state of the climate, and accordingly the Bureau of Meteorology does not issue them. Daily values in particular can fluctuate markedly because of daily weather patterns, and should not be used for climate purposes.

It is a pity that the BOM doesn’t follow this approach with temperature, and in fact goes to the opposite extreme.

Record temperatures, maximum and minimum temperatures, and monthly, seasonal, and annual analyses are based not on daily values but on ONE SECOND VALUES.

The Bureau reports daily maximum and minimum temperatures at Climate Data Online, but also gives a daily summary for each site in more detail on the State summary observations page, and a continuous 72 hour record of 30 minute observations (examples below), issued every 30 minutes, with the page automatically refreshed every 10 minutes, also handily graphed.  These last two pages have the previous 72 hours of readings, after which they disappear for good.  However, the State summary page, also refreshed every 10 minutes, is for the current calendar day only.

This screenshot shows part of the Queensland observations page for February 26, showing the stations in the North Tropical Coast and Tablelands district.

Fig. 1:  District summary page

mareeba-example

Note especially the High Temp of 30.5C at 01:26pm.  Clicking on the station name at the left takes us to the Latest Weather Observations for Mareeba page:

Fig. 2:  Latest Observations for Mareeba

mareeba detail example.jpg

Notice that temperature recordings are shown every 30 minutes, on the hour and half hour.

In Figure 1 I have circled the Low Temp and High Temp for Mareeba.  Except in unusual circumstances, High Temp and Low Temp values become the maximum and minimum temperatures and are listed on the Climate Data Online page, and for stations that are part of the ACORN network, become part of the official climate record.  It is most important that these High Temp and Low Temp values, the highest and lowest recorded temperatures of each day, should be accurate and trustworthy.

But frequently they are higher or lower than the half hourly observations, as in the Mareeba example (0.6C higher), and I wanted to know why.  In this post I show some recent examples, with the explanation from the Bureau.

Perhaps the difference between the Latest Weather Observations and maximum temperature reported at Climate Data Online is due to brief spikes in temperature in between the reported temperatures of the latest observations, such as in this example from Amberley RAAF on February 12.

Fig. 3:  Amberley RAAF temperatures, 12 February 2017

amberley-12-feb

A probable cause would be that the Automatic Weather Station probe is extremely sensitive to sudden changes in temperature as breezes blow warmer or cooler air around or a cloud passes over the sun.

However, this may not be the whole story.

Occasionally the report time for the High Temp or Low Temp is exactly on the hour or half hour, and therefore can be directly compared with the temperature shown for that time at the station’s page.

These progressive Low and/or High Temps falling exactly on the half hour or hour can be observed at various times throughout the day, as well as at the end of the reporting period.

For example, here is a mid-afternoon screenshot of the Queensland- Wide Bay and Burnett district summary for Wednesday 15th February.  I have highlighted the High Temp value for Maryborough at 1:00pm.

Fig. 4:  District summary at 2:00pm for Maryborough 15 February 2017

obs-mboro-15th

In the Latest Observations for Maryborough, I have highlighted the 1:00pm reading.

Fig. 5: Latest Observations at Maryborough at 01:00pm on 15 February

obs-mboro-15th-detail

The difference is +1.5 degrees.  Here I have graphed the results.

Fig. 6:  Maryborough 15 February

mboro-15th-graph

That’s a 1.5 degree difference at the exact same minute.

Here is a screenshot of Latest Observations values at Hervey Bay Airport on Wednesday 22 February.  Low Temp for the morning of 23.2C was reached at 6.00 a.m.

Fig. 7:  Hervey Bay, 06:00am  22 February 2017

hervey-bay-22nd

Note that at 6.00am, just after sunrise, the Latest Observations page shows that the temperature was 25.3 degrees.  The daily Low Temp was reported as 23.2 degrees at 6.00am – 2.1 degrees cooler.  This graph will show the discrepancy more plainly.

Fig. 8:  Hervey Bay temperatures 22 February

hervey-bay-22nd-graph

What possible influence would cause a dawn temperature to drop 2.1 degrees?

I sent a query to the Bureau about Hervey Bay, and the explanation from the Bureau’s officer was enlightening:

Firstly, we receive AWS data every minute. There are 3 temperature values:
1. Most recent one second measurement
2. Highest one second measurement (for the previous 60 secs)
3. Lowest one second measurement (for the previous 60 secs)

Relating this to the 30 minute observations page: For an observation taken at 0600, the values are for the one minute 0559-0600.

I’ve looked at the data for Hervey Bay at 0600 on the 22nd February.
25.3, 25.4, 23.2 .

The temperature reported each half hour on the station Latest Observations page is the instantaneous temperature at that exact second, in this case 06:00:00, and the High Temp or Low Temp for the day is the highest or lowest one second temperature out of every minute for the whole day so far.  There is no filtering or averaging.

The explanation for the large discrepancy was that “Sometimes the initial heating from the sun causes cooler air closer to the ground to mix up to the temperature probe (1.2m above ground).”

However, in Figure 7 above it can be seen that the wind was south east at 17 km/hr, gusting to 26 km/hr, and had been like that all night, over flat ground at the airport, so an unmixed cooler surface layer mixing up to the probe seems very unlikely.

You will also note that the temperatures in the final second of every half hour period from 12.30 to 6.30 ranged from 25C to 25.5C, yet in some second in the final minute before 6.00 a.m. it was at 23.2C.  I have shown these values in the graph below.

Fig. 9:  Hervey Bay 05:59 to 06:00am

hervey-bay-22nd-at-6am

The orange row shows the highest temperature for this last minute at 25.4C at some unknown second, the blue row the lowest temperature for this minute (and for the morning) at 23.2C at some unknown second, and the spot temperature of 25.3C at exactly 06:00:00am.  The black lines show the upper and lower values of half hourly readings between 12:30 and 06:30: the high temp and 06:00am readings are within this range.

23.2C looks a lot like instrument error, and not subject to any filtering.

Further, there are only two possibilities:  either from a low of 23.2C, the temperature rose 2.2 degrees to 25.4C, then down to 25.3C; or else from a high of 25.4C it fell 2.2 degrees to 23.2C, then rose 2.1 degrees to 25.3C, all in the 60 seconds or less prior to 06:00:00 a.m.

How often does random instrument error affect the High and Low Temps reported at the other 526 stations?  Like Thargomindah, where on February 12 the High Temp was 2.3 degrees to 2.5 degrees higher than the temperatures 15 minutes before and after?

Fig. 10:  Thargomindah temperatures 12 February 2017

thargomindah-12-feb

Or was this due to a sudden rise and fall caused by a puff of wind, even a whirl-wind?

Who knows?  The Bureau certainly doesn’t.

 

In Part 2, I will look at patterns arising from analysis of 200 High and Low Temps occurring in the same minute as the half hourly values, and implications this has for our climate record.

Another ABC Fail

February 5, 2017

Viewers of ABC-TV news, and followers of ABC News Online, were treated to a story on Friday night about “Turtle hatchlings dying in extreme heat at Mon Repos”, as it was headlined at ABC News Online:

Piles of dead turtle hatchlings are lining Queensland’s famous Mon Repos beach amid a heatwave which has pushed the sand’s temperature to a record 75 degrees Celsius.

While the majority of hatchlings break free from their nests at night when the sand is cooler, those escaping in the day face overheating.

“They can’t sweat, they can’t pant, so they’ve got no mechanism for cooling,” Department of Environment and Heritage Protection chief scientist Dr Col Limpus said.

….

The extreme heat is also conducted down to the turtle’s nest, pushing the temperature to about 34C, which is approaching the lethal level for incubation.

That is the hottest temperature recorded in a nest in more than a decade.

A record 75 degrees sand temperature? Hottest nest temperature in more than a decade?

Time for a reality check.

I have no data on temperatures inside turtle nests, but I do have data on temperature at nearby Bundaberg Aero (Hinkler Airport), which is an ACORN site.

Using monthly Acorn data, here is a plot of all January maxima at Bundy.

bundy-jan-max

January’s mean maximum of 31.6 degrees C was equalled or exceeded in 1924, 1931, 1969, 1998, 2002, 2006, 2013, and 2014.  While monthly mean doesn’t tell us about individual days, it does give us a clue about daily temperatures in hot years.  For that I also use ACORN daily data- adjusted, homogenised, and world’s best practice apparently.

How do temperatures at this time of year compare with those of previous years?  The next figures show data for the first 45 days of every year, that is from January 1 to February 14.

bundy-jan-max-daily-45

The past three weeks at Bundaberg have been at the high end of the range, but no records have been broken, and no days have been even close to 35C.  What about previous years?  The next plot shows the number of consecutive days above 35 degrees: very likely to raise sand temperature above what it has been this year.

bundy-jan-max-daily-45-over-35

There were no days this year above 35C, but in previous years there were at least 27 occasions of single days reaching 35C, at least 6 occasions of 2 days in a row, and one of 3 days in a row above 35C.

A 7 day running mean will show whether temperatures have been consistently high.

bundy-jan-max-7d-av-45

As you can see 2017 is high but not extreme.  2002 had a 7 day average just under 35C.

This graph plots temperatures of the first 45 days of years with similarly hot January temperatures.  2017 is the thick black line.

bundy-jan-max-daily-45-hot-yrs

On one day- January 20- 2017 was hotter than the other years.  Note how in several years the temperature drops to the mid 20s when heavy rain falls.  Note also the temperature reached the high 30s in February 2002.

The final graph shows the 7 day average of the same period of similarly hot years.

bundy-jan-max-7d-av-45-hot-yrs

Several previous periods were hotter than so far this year.

Once again we see misleading claims being made and reported by the ABC as gospel, without any attempt at fact checking.  A simple check shows that, while it may be true that the reported temperatures are the hottest recorded by these researchers, it is extremely unlikely that they were higher than those of past years.  On every count- daily, monthly mean, 7 day mean, consecutive hot days- it can be shown that this year, while hot, is not as hot as many previously, and it follows that sand temperatures would have been hotter in the past as well.

And that’s without considering the Holocene Optimum and the Eemian.

Another ABC fail.

Dig and Delve Part III: Temperate Regions

February 1, 2017

In this post I draw together ideas developed in previous posts- Poles Apart, Pause Updates, Dig and Delve Parts I and II– in which I lamented the lack of tropospheric data for the regions of the northern and southern hemispheres from 20 to 60 degrees North and South.  These regions between the Tropics and Polar regions I shall call Temperate regions, as that’s what I was taught in school.

A commenter of long standing, MikeR, who has always endeavoured to keep me on the straight and narrow, suggested a method of estimating temperature data for these regions using existing Polar and Extra-Tropical data.  I’ve finally got around to checking, and can now present the results.

The correct formula is:

T (20 to 60 degrees) = 1.256 x T ExtraTropics (20 to 90 degrees) - 0.256 x T Polar (60 to 90 degrees)

This gives an approximation for these regions in lieu of UAH data specifically for them.
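
The coefficients follow from area weighting: the surface area of a latitude band is proportional to the difference of the sines of its bounding latitudes, so the 20-90 degree anomaly is the area-weighted mean of the 20-60 and 60-90 bands, and rearranging gives the formula above.  A short check (assuming the UAH regional anomalies are area-weighted in this way):

```python
import numpy as np

def band_area(lat1, lat2):
    """Relative surface area of the latitude band lat1-lat2 (degrees)."""
    return np.sin(np.radians(lat2)) - np.sin(np.radians(lat1))

a_temperate = band_area(20, 60)     # 20-60 degrees
a_polar     = band_area(60, 90)     # 60-90 degrees
a_extra     = band_area(20, 90)     # 20-90 degrees = temperate + polar

# T_extra = (a_temperate*T_temperate + a_polar*T_polar) / a_extra, rearranged:
coef_extra = a_extra / a_temperate
coef_polar = a_polar / a_temperate
print(f"T(20-60) = {coef_extra:.3f} x T(20-90) - {coef_polar:.3f} x T(60-90)")
# -> T(20-60) = 1.256 x T(20-90) - 0.256 x T(60-90), matching the formula above.

def temperate_estimate(t_extra_tropics, t_polar):
    """Estimated 20-60 degree anomaly from the UAH Extra-Tropics and Polar anomalies."""
    return coef_extra * t_extra_tropics - coef_polar * t_polar
```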

And the results are very, very interesting.  Hello again, Pause.

All data are from the University of Alabama (Huntsville) (UAH) lower troposphere, V.6.0.

First of all, here are plots showing the Extra-Tropics (20-90), compared with  the corresponding Temperate regions (20-60).

Fig. 1:  Monthly UAH data for Northern Extra-Tropics (20-90N) and Estimate for Northern Temperate Region (20-60N)

 nth-temp-v-next

Fig. 2:  Monthly UAH data for Southern Extra-Tropics (20-90S) and Estimate for Southern Temperate Region (20-60S)

sth-temp-v-sext

As expected, the differences are very slight: excluding the Polar data gives a slightly cooler trend for the Northern Temperate region than for the Northern Extra-Tropics, and a slightly warmer trend for the Southern.  No surprise there.

The real surprise is in the Land and Ocean data.  In the Northern Temperate region, CuSum analysis reveals a large regime change which occurred at the beginning of 1998.  The following plots show trends in the data up to January 1998 and from February 1998 to December 2016.

Fig. 3: Estimated Northern Temperate data trends to January 1998 and from February 1998 to December 2016.

nth-temp-2-trends

Fig. 4: Estimated Northern Temperate data trends to January 1998 and from February 1998 to December 2016: Ocean areas.

nth-temp-2-trends-ocean

Fig. 5: Estimated Northern Temperate data trends to January 1998 and from February 1998 to December 2016: Land areas.

nth-temp-2-trends-land

Say hello to the Pause again.  Northern Temperate land areas- most of North America, Asia, Europe, and North Africa, containing the bulk of the world's population, agriculture, industry, and CO2 emissions- have had zero trend for 18 years and 11 months.  While the trend for the whole record is +1.8C per 100 years, the record is clearly made of two halves: the first with a much milder trend of +0.7C per 100 years, then, after an abrupt step change, a second half that is flat- in spite of the "super El Nino" and the "hottest year ever".
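
For those unfamiliar with it, CuSum here is just the cumulative sum of the anomalies' departures from their long-term mean; a sustained kink in that curve marks a regime change.  A minimal sketch, with synthetic data standing in for the Northern Temperate land series:

```python
import numpy as np

def cusum(anoms):
    """Cumulative sum of departures from the series mean; a kink marks a step change."""
    return np.cumsum(anoms - np.mean(anoms))

# Synthetic monthly series: a mild rise to the end of 1997, then a step up and a flat
# second half - the shape described above, not the actual UAH values.
rng = np.random.default_rng(2)
first  = np.linspace(-0.3, 0.0, 228) + rng.normal(0, 0.15, 228)   # Dec 1978 - Dec 1997
second = np.full(228, 0.35)          + rng.normal(0, 0.15, 228)   # Jan 1998 onwards
series = np.concatenate([first, second])

c = cusum(series)
print("CuSum minimum (most likely change point) at month index:", int(np.argmin(c)))
# For an upward step, the CuSum falls to a minimum at the change point and rises after it.
```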

Compare this with the Extra-Tropics data, 20-90N.

Fig. 6: Northern Extra-Tropics data (20-90N) trends to January 1998 and from February 1998 to December 2016: Land areas.

next-land-2-trends

The step change is still there, but the trends are virtually unchanged- only 0.1C different +/- 0.1C.

Why the difference?  Northern Extra Tropics data (20-90N) includes the North Polar data (60-90N).  The major change in the North Polar region occurred in early 1995, as the next two figures show:

Fig. 7: Northern Polar data (60-90N) trends to February 1995 and from March 1995 to December 2016: Land areas.

np-land-2-trends

Fig. 8: Northern Polar data (60-90N) trends to February 1995 and from March 1995 to December 2016: Ocean areas.

np-ocean-2-trends

Massive changes in trend.  Note the change apparently occurred in land data before ocean, which is peculiar, and both in the dead of winter.  Polar regions, though much smaller, have a large impact on trends for the Extra-Tropics.

In the Southern part of the globe, once again say hello to the Pause.

Fig. 9: Estimated Southern Temperate data trends to January 1998 and from February 1998 to December 2016.

sth-temp-2-trends

While the step change is much smaller, using the same dates the Pause is still undeniable.

Fig. 10: Estimated Southern Temperate data trends to January 1998 and from February 1998 to December 2016- Land areas.

sth-temp-2-trends-land

Fig. 11: Estimated Southern Temperate data trends to January 1998 and from February 1998 to December 2016- Ocean areas.

sth-temp-2-trends-ocean

Most of the Southern Hemisphere is ocean, so it follows that a Pause in the ocean leads to a Pause overall.

It is important to stress that the figures I show for Northern and Southern Temperate regions are estimates, not actual data from UAH.  However, they are pretty good estimates, and until we have data from UAH, the best available.

Of the world’s regions, South Polar and Southern Temperate regions are paused, as is the Northern Temperate Land region, which is arguably the most important.  The Tropics fluctuate with ENSO.  Only the Arctic is strongly warming.

The Temperate regions are arguably the most important of the globe.  Together they cover more than half the surface area, and contain the bulk of the world’s population, agriculture, industry, and emissions.  I hope that Dr Spencer will be able to provide datasets for these regions as soon as possible.

Putting Temperature in Context: Pt 2

December 14, 2016

To show how handy my Excel worksheet is, here’s one I did in the last 15 minutes.

Apparently Sydney has had its warmest December minimum on record at 27.1 C.  The record before that was Christmas Day, 1868 at 26.3C.

The following seven plots show this in context.

Fig. 1:  The annual range in Sydney’s minima:

whole-yr-sydney-min

Extremes in minima can occur any time between October and March.

Fig. 2:  The first 2 weeks of December

14d-sydney-min

Plainly, a new record was set this morning, but apart from Day 340 the other days are within the normal range.

Fig. 3:  7 day mean of Tmin in this period

7d-avg-sydney-min

Extreme, but a number of previous years had warmer averages.

Fig. 4:  Consecutive days above 20C Tmin.

days-over-20-sydney

But there have been longer periods of warm minima in the past.

Now let’s look at the same metric, but for all of December.

Fig. 5:  All Decembers (including leap years).

december-sydney-min

A record for December, with 1868 in second place.

Fig. 6:  7 day mean of Tmin for Decembers

7d-avg-sydney-min-december

Seven day periods of warm nights are not new.  The horizontal black line shows that the average to this morning (20.6C) is matched or exceeded by a dozen other Decembers.  (Of course this December isn't half way through yet.)  Also note what appears to be a step change about 1970.

Fig. 7:  Consecutive days above 20C Tmin in December.

days-over-20-sydney-december

I doubt if 15 December will be as warm as today, but it could still be over 20C.

This is weather, not global warming.

 

Putting Daily Temperature in Context

December 14, 2016

In this post I demonstrate a simple way of comparing current temperatures for a particular location with those previously recorded.  In this way it is possible to show the climatic context.

Using data from Climate Data Online, I plot maximum temperature for each day of the year, and then for a particular short period: in this case the last week of November and the first week of December, which coincides with the recent very warm spell here in Queensland.  To account for leap and ordinary years this period is 15 days.  In ordinary years 24th November is Day 328 and 7th December is Day 341, while in leap years this same calendar period is Day 329 to 342.  I also calculate the running 7 day mean TMax for this period, and the number of consecutive days above 35C.
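
The same calculations can be done outside a spreadsheet; here is a minimal pandas sketch of the three metrics.  The file name and column labels are assumptions- substitute whatever the Climate Data Online daily TMax export actually uses.

```python
import pandas as pd

# Hypothetical file/column names standing in for a Climate Data Online daily TMax export.
df = pd.read_csv("birdsville_tmax_daily.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year
df["doy"]  = df["date"].dt.dayofyear

# The 24 November - 7 December window, padded to 15 day-numbers to cover leap years.
window = df[(df["doy"] >= 328) & (df["doy"] <= 342)].copy()

# Running 7 day mean of TMax within the window, per year.
window["tmax_7d"] = window.groupby("year")["tmax"].transform(lambda s: s.rolling(7).mean())

# Longest run of consecutive days above 35C in the window, per year.
def longest_run(above):
    runs = (~above).cumsum()[above]          # constant label within each run of True values
    return runs.value_counts().max() if above.any() else 0

hot_runs = window.groupby("year")["tmax"].apply(lambda s: longest_run(s > 35))
print(window.groupby("year")["tmax_7d"].max().sort_values(ascending=False).head())
print(hot_runs.sort_values(ascending=False).head())
```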

To put the recent heatwave in context, I have chosen six locations from Central and Southern Queensland which regularly feature on ABC-TV weather: Birdsville, Charleville, Roma, Longreach, Ipswich (Amberley RAAF), and Rockhampton.

Birdsville:

Fig. 1

whole-yr-birdsville

The Police Station data are from 1954 to 2005, and the Airport from 2000.  This shows the range of temperatures throughout the year.  The red arrow indicates the current period.   The next plot shows data only for the period in question.

Fig. 2:  24 November- 7 December: Airport data

14d-comp-birdsville-air

Note there were three days where the temperature this year was the highest for those calendar days since 2000, but it didn't exceed the highest in this time period, which occurred in November.  The other days were well within the historic range.

For interest, let’s now see how this year compares with the Police Station record.  (The average difference in TMax during the overlap period was 0.0 to 0.3C.)

Fig. 3:  24 November- 7 December: Police Station data

14d-comp-birdsville-police

In a similar range.

Fig. 4

7d-avg-birdsville

This heatwave was the third hottest since 2000 and fifth overall.

Fig. 5

days-over-35-birdsville-air

Five previous periods had more consecutive days above 35C.  2006 had 22.

Charleville:

Fig. 6: Charleville Aero since 1942

whole-yr-charleville-aero

Temperatures in this period reached the extremes of the range on three days.

(Although the Post Office record begins in 1889, there are too many errors in the overlap period so the two records can’t be compared.)

Fig. 7:

14d-charleville-aero

A new record for early December was set, but note this was the same temperature as 29th November 2006.

Fig. 8:

7d-avg-charleville-aero

Definitely the hottest for this period since 1942.

Fig. 9:

days-over-35-charleville-aero

Note this was not the longest warm spell by a mile: there were many previous periods with up to 26 consecutive days above 35C.

Roma

Fig. 10:

whole-yr-roma

Although there is not one day of overlap, so the two records can't be directly compared, you can see that the Airport (from 1992) and Post Office records are similar.

Fig. 11:

14d-comp-roma-air

A new record for this time of year was set: 44.4C, and six days in a row above 40C.  Pretty hot….

Fig. 12:

days-over-35-roma-air

…but there were longer hot periods in the past (since 1992).

Longreach

Fig. 13:  Longreach Aero since 1966.

whole-yr-longreach-aero

Fig. 14:

14d-longreach-aero

Hot, but no record.

Although there is good overlap with the Post Office, temperatures for this period differ too much: from -1 to +0.7C.

Fig. 15:

7d-avg-longreach-aero

Fifth hottest period since 1966.

Fig. 16:

days-over-35-longreach-aero

And in the past there have been up to 47 consecutive days above 35C at this time of year.

Ipswich (Amberley RAAF):

Fig. 17:

whole-yr-amberley

Fig. 18:

14d-amberley

Not unusually hot for this time of year.

Fig. 19:

7d-avg-amberley

Ninth hottest since 1941.

Fig. 20:

days-over-35-amberley

Hotter for longer in the past.

Rockhampton:

Fig. 21:

whole-yr-rocky

Fig. 22:

14d-rocky-air

Very hot, but no records.  (The heat lasted another two days, with 36.6 and 37.3 on 8th and 9th.)

Fig. 23:

7d-avg-rocky

Fourth hottest 7 day average on record (since 1939).

Fig. 24:

days-over-35-rocky-air

Again, a number of hot days, but there were as many and more in the past.

To conclude: the recent heatwave was very hot certainly, and was extreme in southern inland Queensland.  While Charleville had the highest seven day mean temperature on record, NO location had as many consecutive hot days (above 35C) as in the past.

This is a handy method for showing daily data in context.  It can be used for any period of the year, can be tuned to suit (I chose TMax above 35C, but temperatures below a set figure could be found), and can be used for any daily data.

If you would like a comparison done for a location that interests you, let me know in comments including time period and parameters of interest (e.g. Sydney, first 2 weeks of December, TMax above 30C say, or Wangaratta, September, daily rainfall over 10mm say.)