Posts Tagged ‘Bureau of Meteorology’

ACORN-SAT 2.0: The Northern Territory- Alice in Wonderland

February 15, 2019

(UPDATE 17/02/2019:

I have corrected a glitch in trend calculations which are now as shown.  I have deleted all Diurnal Temperature Range plots and discussion as well.)

This is the second in a series of posts in which I directly compare the most recent version of Australia’s temperature record, ACORN-SAT 2, with that of the previous version, ACORN-SAT 1.  Daily data are directly downloaded from the Bureau of Meteorology. I do not analyse against raw data (available at Climate Data Online), except for particular examples, as I am interested in how different Acorn 2 is from Acorn 1.  The basis for the new version is in the Research Report.

See my previous post for Western Australia for a general introduction.

The Context – The Northern Territory

Figure 1 is a map of Australia showing all of the Bureau’s ACORN-SAT climate monitoring stations.  The Northern Territory is right in the Outback, from the monsoonal north to the desert centre. Most of it is savannah or desert, and there are vast distances between settlements and thermometers.

Figure 1:  Australian ACORN-SAT stations


There are five Acorn stations in the Northern Territory BOM database.  Differences between Acorn 1 and Acorn 2 are summarized in the following sections.

Trend changes

Trends in maximum temperature have changed markedly at individual stations, but on average there has been little change (+1.29C to +1.27C per 100 years), though an average of such wildly different stations across such a vast territory is close to meaningless.

Figure 2:  Maxima trend changes from Acorn 1 to Acorn 2


The “average” change in minima is -33.3% (+0.55C to +0.37C per 100 years).  This, however, is mainly due to Rabbit Flat’s short record with much missing data.

Figure 3:  Minima trend changes from Acorn 1 to Acorn 2


Largest temperature differences

In maxima, changes to Acorn 1 daily data were mostly small, except at Alice Springs, where adjustments ranging from -9.2C to +10.1C were applied to individual daily figures, though only on a few days.  The +10.1C adjustment corrects what can only have been a typographical error in Acorn 1, which recorded 26.8C instead of 36.8C on 28 January 1944.  The -9.2C is less easily explained and may be the reverse error: Acorn 2 records 24.1C on 6 March 1943 where 34.1C was perhaps intended.  Acorn 2 made many other large corrections around these dates, as Figure 4 shows.

Figure 4:  Daily changes in maxima from Acorn 1 to Acorn 2 at Alice Springs


Minima adjustments ranged from -11.5C to +11C, also at Alice Springs, and there were many other large adjustments as well.  At the other stations the range was much smaller, though the changes to Acorn 1 were still substantial (-3.6C to +4.6C).  Here is Alice Springs again:

Figure 5:  Daily changes in minima from Acorn 1 to Acorn 2 at Alice Springs


(Remember, these are adjustments to Acorn 1, which was supposed to be “world’s best practice” seven years ago.  How did Blair Trewin get it so wrong the first time?  Has world’s best practice changed so much in seven years?)

Record temperatures

A new record maximum was established at Darwin, whose record on 18 October 1982 (unchanged from raw to Acorn 1) increased from 38.9C to 39.5C in Acorn 2.

Figure 6:  Three versions of maxima at Darwin 18 October 1982


A slightly higher record was also set at Victoria River Downs.

A new record low temperature on 21 June 1925 was also established at Alice Springs, where the Acorn 1 temperature of -6.7C was reduced to -9.4C.   (The temperature in the Post Office raw data was -5.6C.)  New lows were established at Darwin and Tennant Creek as well, but on nothing like the same scale.

Apparently the adjustments made to raw data in Acorn 1 weren’t big enough.

Quality Control: especially minimum temperatures higher than maximum

In Acorn 1, three of the five stations had at least one example of the minimum higher than the maximum.  Blair Trewin claims he has “fixed” this problem (which he concedes was “physically unrealistic”) by adjusting temperatures in Acorn 2 so that the maximum and minimum are equal, making the diurnal temperature range (DTR) for the day zero.  In his words:

A procedure was therefore adopted under which, if a day had a negative diurnal range in the adjusted data, the maximum and minimum temperatures were each corrected to the mean of the original adjusted maximum and adjusted minimum, creating no change in the daily mean.
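The quoted procedure is simple enough to sketch in a few lines of Python. This is a minimal illustration with invented numbers, not the Bureau’s code:

```python
# A minimal sketch of the quoted procedure: where the adjusted minimum
# exceeds the adjusted maximum, both are set to their mean, leaving the
# daily mean unchanged and the diurnal range zero. Values are invented.

def fix_negative_dtr(tmax, tmin):
    """Collapse a negative diurnal range to the mean of max and min."""
    if tmin > tmax:
        mean = (tmax + tmin) / 2.0
        return mean, mean
    return tmax, tmin

print(fix_negative_dtr(14.0, 17.0))  # (15.5, 15.5) - min was above max
print(fix_negative_dtr(30.0, 15.0))  # (30.0, 15.0) - unchanged
```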

But that is not how he “corrected” the worst NT examples in Acorn 1 (minimum 4.8C above maximum at Alice Springs, and a 3.9C difference at Tennant Creek).  Here is a plot of the raw data and changes made by Acorn 1 and Acorn 2 at Alice Springs for 11 to 21 June 1932.

Figure 7:  Alice Springs Post Office data for 11-21 June 1932


Acorn 1 made no change to raw maxima, but was supposed to cool raw minima (the purple line) substantially (the blue line).  Unfortunately, it is likely that 18.1C was entered instead of 8.1C: human error resulting in garbage.  Acorn 2 has fixed this, but not by making minima and maxima equal to the Acorn 1 mean (15.7C), and neither is the DTR zero.  Instead there were more arbitrary adjustments.

(At Tennant Creek, to correct a negative DTR of -3.9C, minimum and maximum were both set to 22.9C, which is one degree less than the Acorn 1 mean of 23.9C.)

“Square wave” pattern in adjustments

The peculiar repeating pattern of adjustments to Perth in Acorn 1 also occurs at Darwin, but the pattern is even more bizarre.

Figure 8:  Darwin Acorn 1 daily maxima differences (pre-World War 2)


In every month, every day of the month was adjusted in Acorn 1 by exactly the same amount, which is why only 1917 is visible: the other years are identical.  Blair Trewin has taken notice of the criticism and adjusted Acorn 2 with a little more intelligence, but the monthly pattern is still visible.  Adjustments are still applied month by month, especially in the Dry months.

Figure 9:  Darwin Acorn 2 daily maxima differences 

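For readers who want to check their own downloads, the “square wave” test is straightforward: if every day of a month carries an identical adjustment (Acorn minus raw), the within-month spread of the daily differences is zero. A minimal sketch with invented values:

```python
# A sketch of a "square wave" detector: group the daily adjustments
# (Acorn minus raw) by month and flag months where every day received
# an identical adjustment. The example values are invented.

def monthly_constant(diffs_by_month):
    """Map month -> True where all daily adjustments that month are identical."""
    return {month: len(set(diffs)) == 1 for month, diffs in diffs_by_month.items()}

diffs = {
    "1917-01": [-0.5] * 31,           # every day of the month shifted by -0.5
    "1917-02": [-0.5] * 27 + [-0.4],  # one day differs, so not a square wave
}
print(monthly_constant(diffs))  # {'1917-01': True, '1917-02': False}
```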

Conclusion:

There are no additional stations, so the network is still extremely sparse.

There is a very small amount of additional digitized data.

The average trend in maxima for NT has not changed very much, even though there is a large range across individual stations.  The minima trend was reduced by 33.3%, mainly from the large impact of Rabbit Flat’s poor data.

Alice Springs had large differences between Acorn 1 and Acorn 2 daily data of over 11 degrees Celsius.

New record maximum and minimum temperatures have been set.

The issue of instances of minima being higher than maxima caused by too vigorous adjustments or human error has been “fixed” by arbitrary adjustments, and not as described in the research paper.

The bizarre “square wave” pattern in adjustments in Darwin has been largely rectified, at least in the Wet months.

With only five Acorn stations in the Territory, each one has a large impact on the climate record.  Alice Springs, which is said to contribute 7 to 10 percent of the national climate signal, has had extremely large adjustments made to Acorn 1.  VRD and Rabbit Flat, stations with short histories and incomplete data, also have a large impact on the national climate signal.

The size of the adjustments (made by comparison with stations up to 1,300 km away), only seven years after the “world’s best practice” dataset was launched, is incredible, and demands explanation.

Otherwise, it would appear that the temperature record of the Northern Territory, especially at The Alice, but also at other stations, has fallen down a rabbit hole and now reads like a chapter from Alice in Wonderland.

Next: Queensland.

 


How Reliable is the Bureau’s Heatwave Service?

January 24, 2019

The Bureau of Meteorology presents heatwave assessments and forecasts in the interest of public health and safety.  Their heatwave definition is not based on any arbitrary absolute temperature, but uses a straightforward algorithm to calculate “excess heat factors”.  From their FAQs:

“Heatwaves are calculated using the forecast maximum and minimum temperatures over the next three days, comparing this to actual temperatures over the previous thirty days, and then comparing these same three days to the ‘normal’ temperatures expected for that particular location. Using this calculation takes into account people’s ability to adapt to the heat. For example, the same high temperature will be felt differently by residents in Perth compared to those in Hobart, who are not used to the higher range of temperatures experienced in Perth.

This means that in any one location, temperatures that meet the criteria for a heatwave at the end of summer will generally be hotter than the temperatures that meet the criteria for a heatwave at the beginning of summer.

……

The bulk of heatwaves at each location are of low intensity, with most people expected to have adequate capacity to cope with this level of heat.”
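The approach the FAQ describes is the published “excess heat factor” (EHF) of Nairn and Fawcett: a significance index (the three-day mean temperature against a long-term 95th percentile) multiplied by an acclimatisation index (the three-day mean against the previous thirty days). The sketch below uses invented temperatures and thresholds, not the Bureau’s implementation, but it shows why a month of heat suppresses the index:

```python
# A minimal sketch of an excess-heat-factor style index. The formula
# follows the published Nairn and Fawcett formulation as I understand it;
# the temperatures and the 95th-percentile threshold are invented.

def excess_heat_factor(three_day_mean, t95, prior_30day_mean):
    """EHF = significance index x acclimatisation index (sketch)."""
    ehi_sig = three_day_mean - t95                # unusual for this location?
    ehi_accl = three_day_mean - prior_30day_mean  # unusual for recent weather?
    return max(0.0, ehi_sig) * max(1.0, ehi_accl)

# A hot spell arriving after a mild month:
print(excess_heat_factor(38.0, 33.0, 30.0))  # 40.0
# The same heat after a month that was already hot:
print(excess_heat_factor(38.0, 33.0, 37.5))  # 5.0
```

The second call is the Birdsville situation in miniature: the same extreme heat scores one-eighth the intensity simply because the previous month was also hot.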

Back in 2015 I showed how this algorithm works perfectly for Melbourne, but fails to detect heatwaves in Marble Bar and instead finds heatwaves at Mawson in the Antarctic.  In light of the long period of very hot weather across most of western Queensland, what does the Heatwave Service show?

Here is their assessment of conditions in Queensland over the last three days:

Fig. 1: Heatwave assessment for 21-23 January 2019


Most of inland Queensland has been in a “Low-Intensity Heatwave”, with a couple of small areas near the southern border of “Severe Heatwave”.

And here is their forecast for the next three days:

Fig. 2:  Heatwave forecast for 24-26 January 2019


Much the same, with a bit more Severe Heatwave coming.

So what were temperatures really like in the previous three days? Here’s the map for the middle of that period, Tuesday 22nd:

Fig. 3:  Maximum temperatures for 22 January


About half the state was above 39 degrees C, a large area was above 42C, and there were smaller areas of above 45C.

And in the past week:

Fig. 4:  Maximum temperatures for 7 days to 23 January


Average maxima for roughly the same areas were the same, except there was a larger area averaging over 45C!

This follows December when a large slab of the state averaged from 39C to 42C for the month.

Fig. 5:  Maximum temperatures for December 2018


I’m focusing on Birdsville, circled on the map below (and indicated on the maps above.)

Fig. 6:  Queensland forecast towns- Birdsville indicated


Here are the maxima for Birdsville for January:

Fig. 7:  Birdsville Maxima for January


And here’s the forecast for the next 7 days:

Fig. 8:  Birdsville 7 Day Forecast


Apart from the 6th, when it was a cool 38.8C, since Christmas Eve the temperature has been above 40C every day, and is forecast to stay above 40C until next Tuesday (and above 45C until Sunday).  Minima have been above 25C on all but three days since Christmas.

And that’s a “Low Intensity” heatwave, with “most people expected to have adequate capacity to cope with this level of heat.”

The Bureau’s unspoken message?  It might be a bit hot, but you’re supposed to be used to it.  Harden up!

Western Queensland residents are pretty tough, but surely a month of such heat deserves a higher level of description than “Low Intensity”, especially for the vulnerable, such as babies, the elderly, and visitors.

This is worse than laughable.  The Bureau’s heatwave service is a crock.  As I said in my 2015 post, a methodology that fails to detect heatwaves at Marble Bar (or Birdsville!), and creates them in Antarctica, is worse than useless: it is dangerous.

Case Studies in “World’s Best Practice” 2: Kerang

November 5, 2015

Introduction:  This series of posts is intended to show that despite Greg Hunt’s loyalty, all is not right at the Bureau of Meteorology.

Please refer to my first post, Case Studies in “World’s Best Practice” 1:  Wilsons Promontory, for a complete description of the Bureau’s claims, the problems, data sources, and my methods.

Here are some further examples of “World’s Best Practice”.

********************

Kerang is on the Murray River, about 250 km from Melbourne.  The story of temperature adjustments here illustrates much that is wrong with the Bureau: misinformation, incompetence, lack of transparency, and unscientific behaviour.  This post took longer than expected because the more I looked, the more problems I found.

Note: Both maxima and minima at Kerang are warming. I have no comment on whether the adjustments are justified.  I am only interested in the methods used.

Problem 1: Missing data

The Bureau’s claim that they provide raw data as well as adjusted data is a half-truth, and completely misleading; some would say dishonest.

The Bureau has adjusted Kerang maxima at 01/06/1957 and 01/01/1922, and minima at 18/01/2000 and 01/08/1932, and provides daily adjusted temperatures from 1/1/1910.

Unfortunately, there are NO daily raw data for Kerang before 1/1/1962.

Where are 52 years of daily temperatures?  How is it possible to have adjusted digitised data but no raw digitised data for half of the record?

Another issue brought to my attention is that there is an enormous amount of data missing even from Acorn: a large proportion every year before 1960, especially from 1932 to 1949, when 100 to 180 days are missing every year.

null days kerang

This lack of transparency makes it impossible to replicate and analyse the adjustments at Kerang.  If it can’t be replicated, with all data made available, it isn’t science.

Problem 2: Nonsense temperatures

There is only one instance when Acorn shows that the minimum temperature, the lowest temperature for the 24 hour period, was higher than the maximum temperature.

min max kerang

That dot at ‘0.6’ shows that on 2nd February 1950 the coldest temperature was 0.6C hotter than the hottest temperature!  Unfortunately it is impossible to compare with the missing raw data.

Any organisation that can’t perform a basic quality control test on its product is incompetent, as is any Review Panel or Technical Advisory Forum that endorses it.

Problem 3: Artificial warming 

Even though UHI makes Melbourne unsuitable for use in climate analysis, the Bureau still uses it to adjust the early data at Kerang!

Problem 4:  Neighbours

One of the neighbours used to adjust Kerang is Broken Hill, 477 km away, and another is Snowtown in South Australia, 565 km away.

Problem 5:  Results of adjustment

Here I compare differences between Kerang and its neighbours, pre- and post-adjustment, using annual temperatures.

Firstly, minima, from the 2000 adjustment: Kerang minus neighbours, annual anomalies from 1985-2014.

Kerang comp 2000 min

The adjustment of -0.4C applied to years before 2000 is too great.  The slope of the mean difference from the neighbours is much too steep.

Next, for the 1932 adjustment (annual anomalies from 1917-1946 means):

Kerang comp 1932 min

Again, the adjustment is too great, as it makes the differences from the neighbours much larger.

The same pattern follows with maxima.  The 1957 adjustment (anomalies from 1944-1973):

Kerang comp 1957 max

And the 1922 adjustment (anomalies from 1910-1938):

Kerang comp 1922 max

In both cases Kerang is cooling compared with neighbours, but the adjustments reverse this and make Kerang compare less well with its neighbours.

Problem 6:  Undocumented adjustments

The Bureau lists only two adjustments to minima at Kerang:  -0.4 on 18/01/2000 and -0.61 on 01/08/1932.  This is not the whole story, as a plot of the actual annualised adjustments shows:

Kerang adjustments min

If the adjustments were as stated, the difference between adjusted and raw temperatures would be indicated by the blue lines.  The actual adjustments are shown by the brown lines.
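The “actual annualised adjustments” can be reproduced by anyone with both downloads: average the daily Acorn-minus-raw differences within each year. A minimal sketch with invented figures:

```python
# A sketch of deriving annualised adjustments: average the daily
# Acorn-minus-raw differences within each year. Figures are invented.
from collections import defaultdict

def annual_adjustments(daily):
    """daily: (year, acorn, raw) tuples -> {year: mean daily adjustment}."""
    totals = defaultdict(lambda: [0.0, 0])
    for year, acorn, raw in daily:
        totals[year][0] += acorn - raw
        totals[year][1] += 1
    return {year: s / n for year, (s, n) in totals.items()}

daily = [(1930, 19.0, 19.6), (1930, 25.0, 25.6), (1931, 20.0, 20.0)]
adjustments = annual_adjustments(daily)
print({year: round(adj, 2) for year, adj in adjustments.items()})  # {1930: -0.6, 1931: 0.0}
```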

The queried adjustments are not mentioned in the Bureau’s list here.

Similarly, there are two documented adjustments to maxima: -0.71 on 01/06/1957 and +0.33 on 01/01/1922.  These are visible in the next graph, but note the extra adjustment before 1950, and a series of adjustments from 1948 back to 1925.

Kerang adjustments max

I understand why these are needed: to adjust for the steadily increasing difference between Kerang and neighbours in this period.  But why was this not documented?

Thus we see at Kerang further misinformation and lack of transparency through failure to supply digitised raw data to allow replication; incompetence through not using basic checks for data integrity, resulting in publication of the “world’s best practice” temperature dataset with minimum temperatures higher than maximum; use of UHI contaminated sites when making adjustments; use of distant neighbours from different climate regimes; over-zealous adjustments resulting in worse comparison with neighbours than before; and undocumented adjustments.

Half-truths, incompetence, lack of transparency, and unscientific practices are evident at many other sites.  A proper investigation into the Bureau is overdue.

Case Studies in “World’s Best Practice” 1: Wilsons Promontory

October 26, 2015

Introduction: This series of posts is intended to show that despite Greg Hunt’s loyalty, all is not right at the Bureau of Meteorology.

The Bureau describes its methodology for creating the ACORN-SAT temperature reconstruction as “world’s best practice”, the term used by the 2011 International Review Panel. The recent Report of the Technical Advisory Forum accepts this claim, reporting that “the Forum did not prioritise further international comparison of the Bureau’s curation methods in this report. However, the Forum will revisit this issue at its next meeting in 2016.”

In light of this endorsement, here are some examples of “World’s Best Practice”.
**********************************************************

Wilsons Promontory Lighthouse is on the southernmost tip of the Australian continent, about 170 km from Melbourne. The story of temperature adjustments here illustrates much that is wrong with the Bureau: misinformation, incompetence, lack of transparency, and unscientific behaviour.

Note: Both maxima and minima at Wilsons Promontory are warming. The Minima trend has been cooled, the maxima warmed.  I have no comment on whether the adjustments are justified. I am only interested in the methods used.

ACORN-SAT, (Australian Climate Observation Reference Network- Surface Air Temperatures), was introduced in March 2012, with several revisions mainly to bring the series up to date. It is a daily dataset of minima and maxima, from which monthly and annual means are derived, for 112 sites around Australia. Raw temperature data at these sites were homogenised by a complicated algorithm by comparison with neighbouring sites.

After much criticism, the Bureau has been forced to provide some answers, and agreed to ‘checking’ by a Technical Advisory Forum. The Bureau has provided additional information at the Acorn website, and in September 2014 released a list of the sites with adjustment dates, amounts, and the neighbour sites used for adjustment (see http://www.bom.gov.au/climate/change/acorn-sat/documents/ACORN-SAT-Station-adjustment-summary.pdf). Unfortunately, this additional information has raised more questions than it has answered.

Problem 1: Missing data
The Bureau says at its FAQ No. 6 at http://www.bom.gov.au/climate/change/acorn-sat/#tabs=FAQs ,
“the Bureau provides the public with raw, unadjusted temperature data for each station or site in the national climate database, as well as adjusted temperature data for 112 locations across Australia”, and at No. 8, “Daily digitised data are now available back to 1910 or earlier at 60 of the 112 ACORN-SAT locations, as well as at some non-ACORN-SAT locations.”

This is a half-truth, and completely misleading; some would say dishonest.

The Bureau provides raw data at Climate Data Online at http://www.bom.gov.au/climate/data/, and adjusted data at http://www.bom.gov.au/climate/change/acorn-sat/#tabs=Data-and-networks.

The Bureau has adjusted all Wilsons Promontory maxima before 1/1/1950, and minima before 1/1/1930, and provides daily adjusted temperatures from 1/1/1910.

Unfortunately, there are NO daily raw data for Wilsons Promontory before 1/1/1957.

Where are 47 years of daily temperatures? How is it possible to have adjusted digitised data but no raw digitised data?

Likewise, of the 10 neighbouring sites used for the pre-1950 maxima adjustments, only five have daily raw data before 1957, and for minima, only two (and one of those is Melbourne; more on this later). Were the adjustments made with only two comparisons? Otherwise, where are the data for the others?

This lack of transparency makes it impossible to replicate and analyse the adjustments at Wilsons Promontory. If it can’t be replicated, with all data made available, it isn’t science.

Problem 2: Nonsense temperatures
There are 79 instances when Acorn shows that the minimum temperature, the lowest temperature for the 24 hour period, was higher than the maximum temperature.

min max wils prom

That dot at ‘1’ shows that on 5th December 1911 the coldest temperature was one degree hotter than the hottest temperature!

All of these occurred before 1950, so it is impossible to compare with the raw data.

The Bureau dismisses this as a minor hiccup of no importance, an artefact of the adjustment process. Yet the Bureau goes to great pains to explain how carefully the raw data were checked to remove any glaring errors and mistakes. On page 31 of CAWCR Technical Report No. 049, the section “Quality control checks used for the ACORN-SAT data set” describes a test for internal consistency of daily maximum and minimum temperature, which was carried out on the raw data of the ACORN-SAT sites. This test for minima greater than maxima, the first and most important quality control check, obviously was not applied to the adjusted data at all, and these nonsensical values remain years after sceptics made the Bureau aware of them. Any organisation that can’t perform a basic quality control test on its product is incompetent, as is any Review Panel or Technical Advisory Forum that endorses it.
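The internal-consistency test described in the technical report is trivial to implement, which makes its absence from the adjusted data all the more striking. A minimal sketch, with invented records:

```python
# A minimal sketch of the internal-consistency check: flag any day whose
# minimum exceeds its maximum. The records are invented for illustration.

def flag_min_above_max(records):
    """records: (date, tmax, tmin) tuples -> dates failing the check."""
    return [date for date, tmax, tmin in records if tmin > tmax]

records = [
    ("1911-12-05", 20.0, 21.0),  # minimum a degree above the maximum
    ("1911-12-06", 25.0, 14.0),  # physically sensible
]
print(flag_min_above_max(records))  # ['1911-12-05']
```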

 

Problem 3: Artificial warming
Here are the neighbouring sites used.

Maxima: East Sale Airport, Geelong SEC, Laverton RAAF*, Orbost, Queenscliff, Cape Otway Lighthouse, Melbourne Regional Office*, Essendon Airport, Currie, and Ballarat Aerodrome.

Minima: Cape Otway Lighthouse, Kerang, Melbourne Regional Office*, Eddystone Point, Geelong SEC, Bendigo Prison, Swan Hill PO, Cape Bruny Lighthouse, Currie, and Ballarat Aerodrome.

On page 71 of CAWCR Technical Report No. 049 is the statement, “the potential still exists for urbanisation to induce artificial warming trends relative to the surrounding region, and it is therefore necessary to identify such locations to prevent them from unduly influencing assessments of background climate change.”

Included in the eight stations not used in climate analysis because their records exhibit Urban Heat Island effects are Laverton RAAF and Melbourne. Even though UHI makes Melbourne and Laverton unsuitable for use in climate analysis, the Bureau still uses them to adjust the data at Wilsons Promontory!

 

Problem 4: Neighbours
Cape Bruny Lighthouse is on the far south east coast of Tasmania, and is 509 km south of Wilsons Promontory. Kerang is on the Murray River, 413 km northwest, in a dry inland area, as is Swan Hill, 468 km away. Were there no better correlated sites nearer?

 

Problem 5: Results of adjustment.
To compare the temperature record at Wilsons Promontory with its neighbours, as we don’t have daily data, we can only use monthly or annual data. A simple but reliable method is to calculate the difference between Wilsons Promontory and each neighbour. This is done for raw and adjusted anomalies from the mean of a common baseline period. If Wilsons Promontory compares well with its neighbours, the differences should be close to zero, and most importantly, in spite of any short fluctuations, there should be no trend: Wilsons Promontory should not be warming or cooling relative to its neighbours.
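This neighbour-difference method can be sketched in a few lines: subtract each neighbour’s annual anomaly series from the station’s, average the differences, and fit a linear trend, which should be near zero. The numbers below are invented for illustration, not Bureau data:

```python
# A sketch of the comparison method: subtract each neighbour's annual
# anomaly series from the station's, average, and fit a linear trend,
# which should be near zero for a well-homogenised station.
import numpy as np

def mean_difference_trend(station, neighbours):
    """Slope (degrees per year) of the mean station-minus-neighbour anomaly."""
    diffs = np.mean([np.asarray(station) - np.asarray(n) for n in neighbours], axis=0)
    years = np.arange(len(diffs))
    return np.polyfit(years, diffs, 1)[0]

station = [0.0, 0.1, 0.2, 0.3, 0.4]   # warming relative to its neighbours
neighbours = [[0.0] * 5, [0.1] * 5]   # two flat neighbour series
print(round(mean_difference_trend(station, neighbours), 3))  # about 0.1 per year
```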

 

Unfortunately there are no monthly or annual data before 1957 for Eddystone Point or Bendigo Prison, so comparison is further restricted.

 

Firstly, minima: Wilsons Promontory minus neighbours, annual anomalies from 1916-1945, raw data.
raw min diffs wils prom

The differences range from +2 degrees to -2 degrees, so there is plenty of variance, but the bulk of differences are +0.5 to -0.5 degrees. The spaghetti lines can be averaged to show the mean difference.
raw min avg diff wils prom

While there are periods of significant differences (1924-26, 1958-60, and 1974) it is plain that the raw data difference shows zero trend, indicating good comparison between Wilsons Promontory and its neighbours. Now compare the differences following the 1930 adjustment:
raw v adj min wils prom

The Acorn adjusted record preserves the periods of large differences, but has Wilsons Promontory cooling relative to its neighbours by more than half a degree per 100 years. The adjustment was too large.
Here is the comparison for maxima (anomalies from 1936-1965).
raw v adj max wils prom

The raw data show Wilsons Promontory cooling a little (-0.13C per 100 years) relative to the neighbours, but Acorn overcorrects, resulting in too much warming (+0.18C per 100 years) compared with the neighbours.

 
Problem 6: Site quality
On pp. 22-23 of Techniques involved in developing the Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) dataset (CAWCR Technical Report No. 049) by Blair Trewin, March 2012, we find:
Standards for instrument exposure and siting in Australia are laid down by Observations Specification 2013.1 (Bureau of Meteorology, 1997). Among the guidelines are:
• Sites should be representative of the mean conditions over the area of interest (e.g., an airport or climatic region), except for sites specifically intended to monitor localised phenomena.
• The instrument enclosure (if there is one) should be level, clearly defined and covered with as much of the natural vegetation of the area that can be kept cut to a height of a few centimetres.
• The distance of any obstruction should be at least four times the height of the obstruction away from the enclosure. (This criterion is primarily directed at elements other than temperature; for temperature the last guideline is more important.)
• The base of the instrument shelter should be 1.1 metres above the ground, with the thermometers approximately 1.2 metres above the ground.
• If no instrument enclosure is provided, the shelter should be installed on level ground covered with either the natural vegetation of the area or unwatered grass, and should be freely exposed to the sun and wind. It should not be shielded by or close to trees, buildings, fences, walls or other obstructions, or extensive areas of concrete, asphalt, rock or other such surfaces – a minimum clearance of five times the width of the hard surface is recommended.

 
The following photos are from Dayna’s Blog, a fascinating blog about bushwalking in SE Australia. (Interested readers are encouraged to visit https://daynaa2000.wordpress.com/ for some excellent walking tour information and photographs.)

 
The first view is towards the southwest, the direction of the prevailing south-westerly winds.
WilsonPLighthousenSolarPanels notes

Note the large areas of concrete under and near the Stevenson Screen, the nearby rock walls, and the solar panels almost directly to the south of the screen.

 
The second photo is in the opposite direction and shows the proximity of a building, another rock wall, and the steep slope of the site.
wilspromphoto east

These photographs make a mockery of the Station Catalogue description, which calls it “a very exposed location”. There are several man made features which surely influence temperatures recorded. Jennifer Marohasy recently asked the Bureau whether the solar panels would reflect onto the screen. The reply was,
“The angle of the panels means that any reflection from the panels is likely to only intersect the instrument shelter for a small part of the day during a limited part of the year. As the instrument shelter is fitted with double-louvered wall panels, it is virtually impossible that a direct beam of light would be able to enter the screen. Further, it is unlikely that the solar panels are influencing the instrument shelter as the shelter is painted to reflect direct and indirect radiation.”

 
Yet in the Station Catalogue for Alice Springs we find this statement: “The site was enclosed by a rock wall about 1 m high and painted white that would have interrupted wind flow and reflected heat.”

 
They cannot have it both ways. If a 1m high rock wall interrupts wind flow and reflects heat in Alice Springs, then surely rock walls and buildings, large areas of concrete, and solar panels, all on a downward sloping lee side of a hill, will cause artificial warming at Wilsons Promontory.
Wilsons Promontory is a far from ideal site.

 
Thus we see at Wilsons Promontory misinformation and lack of transparency through failure to supply digitised raw data to allow replication; incompetence through not using basic checks for data integrity, resulting in publication of the “world’s best practice” temperature dataset with minimum temperatures higher than maximum; use of UHI contaminated sites when making adjustments; use of distant neighbours from different climate regimes; over-zealous adjustments resulting in worse comparison with neighbours than before; all at a very poor quality site.
Half-truths, incompetence, lack of transparency, and unscientific practices are evident at many other sites. A proper investigation into the Bureau is overdue.

More on the absurd ACORN adjustment process

September 29, 2015

This is a Letter to the Editor of The Australian I sent recently, but not published.

Sir

Dr Jennifer Marohasy (Ideology adds heat to the debate on climate change, 29/9)  claims that sites prone to Urban Heat Island effect, such as Melbourne, have been used to adjust the temperature records at sites such as Cape Otway.

This is indeed absurd, but true.  Of the 104 sites used for climate analysis, 22 have been adjusted at least in part by comparison with sites whose artificially raised temperatures make them unsuitable for use in that same climate analysis.

The Bureau of Meteorology lists eight sites which are not used in climate analysis because their records exhibit Urban Heat Island effects: Townsville, Rockhampton, Sydney, Richmond (NSW), Melbourne, Laverton RAAF, Adelaide, and Hobart.

According to the Bureau’s “ACORN-SAT Station adjustment summary”, seven of these sites are still used as comparison sites when adjusting raw temperatures at other locations.  Adelaide is used at Snowtown and Port Lincoln; Townsville at Cairns, Mackay and Charters Towers; Rockhampton at Townsville, Mackay, Bundaberg and Gayndah; Sydney at Williamtown, Bathurst, Richmond, Nowra, and Moruya Heads; Laverton at Orbost, Sale, Wilson’s Promontory, Melbourne and Cape Otway; Melbourne at Orbost, Sale, Wilson’s Promontory, Laverton, Kerang, and Cape Otway; and Hobart at Launceston, Eddystone Point, Cape Bruny, Grove, and Butlers Gorge.

Richmond (NSW) is apparently the only site not used in the adjustment process.

Greg Hunt’s faith in the credibility of the Bureau of Meteorology is touching, but just as absurd.

More Rutherglen Nonsense

August 15, 2015

Jennifer Marohasy had an interesting post this week on further explanations by the Bureau for their weird adjustments at Rutherglen.  I was particularly interested in this graphic, which is Chart 3 on the Bureau’s station adjustment summary for Rutherglen.  http://www.bom.gov.au/climate/change/acorn-sat/documents/station-adjustment-summary-Rutherglen.pdf

rutherglen comp BOM

The Bureau is comparing Rutherglen’s raw minima with the adjusted data from Wagga Wagga, Deniliquin, and Kerang.  Three questions immediately spring to mind:  1. As Dr Marohasy points out, what is the Bureau doing comparing raw with adjusted data?  Of course they’re going to have different trends!  2.  Why is Kerang shown, when Kerang is NOT included as a neighbour station used to adjust Rutherglen?  And 3.  What difference does this make?

Time for a reality check.

This graph compares like with like: raw minima for Rutherglen and the same neighbours.  Note that only Kerang is warming; Wagga Wagga is flat, while Deniliquin and Rutherglen are cooling.

rutherglen comp raw

This graph again compares like with like, the same stations but with adjusted data.

rutherglen comp adjusted

You might think that this shows Rutherglen is now homogenised with the others correctly.  However, when we examine the differences in anomalies from the 1961-1990 means between Rutherglen and the others, we get this:

rutherglen comp differences ADJ

They still got it wrong!  The trend in differences should be close to zero.   Rutherglen’s adjusted record is warming too fast (+0.5C per 100 years) relative to the three neighbours used by the BOM in their explanation.

And note that since 1974, Rutherglen’s minima have been cooling relative to the others.  Perhaps that cooling they corrected for was real after all?

Even if Rutherglen needs to be adjusted; even if these three sites are adjusted correctly; even if Kerang were one of the stations used by the Bureau to adjust Rutherglen- the adjustments at Rutherglen are overcooked.

The “scientists” in charge of the climate change department in the Bureau deserve all the ridicule they get.

More than that- they are not to be trusted with the nation’s climate history.  We don’t trust their data; we don’t trust their methods; we don’t trust their results; and we don’t trust their motives.

Heatwaves: From One Extreme To Another

August 8, 2015

When Is A Heatwave Not A Heatwave?

When the Bureau of Meteorology defines it out of existence.

In his reply to me on behalf of Dr Vertessy, Bob Baldwin wrote:

“The Bureau has adopted a particular operational heatwave definition motivated by human health considerations, defined as a period of at least three days where the combined effect of high temperatures and excess heat is unusual within the local climate. … The bulk of heatwaves at each location are low intensity with local communities expected to have adequate adaptation strategies for this level of thermal stress.  Less frequent, higher intensity heatwaves are classified as severe and will challenge some adaptation strategies, especially for vulnerable sectors such as the aged or the chronically ill.”

After some digging, I found this paper which describes the Bureau’s methodology used in their Pilot Heatwave Forecast:

The Excess Heat Factor: A Metric for Heatwave Intensity and Its Use in Classifying Heatwave Severity, John R. Nairn and Robert J. B. Fawcett (2015)

The method is quite easy to follow and implement: I was able to replicate results for the 2014 Melbourne heatwave exactly, and to use it successfully for other single locations.  It is of course designed for use with AWAP gridded data to produce forecast maps.  Note this is raw data, not homogenised.  I downloaded all data from Climate Data Online.

There are several steps; readers should consult the paper for full details.  Briefly: a daily mean temperature is calculated by averaging the day’s maximum and the following night’s minimum, and from these, three-day means are computed.  Each three-day mean is compared with the mean of the previous 30 days (as people acclimatise to changed temperatures over this period), and also with the 95th percentile of daily means from 1971 to 2000.  Where the three-day mean exceeds that percentile, the excess is multiplied by the (positive) excess over the previous 30 days to give the Excess Heat Factor, which indicates heatwave.  The EHF is then compared with the 85th percentile of all positive EHFs from 1958 to 2011 to give a severity index, and an event exceeding three times that percentile is classed as an extreme heatwave.
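As a rough illustration, the two-index calculation can be sketched as follows.  This is a minimal sketch of my reading of the definitions in Nairn and Fawcett (2015); the thresholds `t95` and `ehf85` are assumed to have been precomputed from the base periods mentioned above, and the variable names are my own:

```python
import numpy as np

def excess_heat_factor(tmean, t95, ehf85):
    """Daily Excess Heat Factor values from a series of daily mean temperatures.

    tmean : 1-D array of daily means ((day's max + following night's min) / 2)
    t95   : 95th percentile of daily means over the 1971-2000 base period
    ehf85 : 85th percentile of all positive EHFs (severity threshold)
    Returns (ehf, severity); severity >= 1 is severe, >= 3 is extreme.
    """
    n = len(tmean)
    ehf = np.full(n, np.nan)
    for i in range(30, n - 2):
        tdm = tmean[i:i + 3].mean()              # three-day mean
        ehi_sig = tdm - t95                      # excess over the local climate
        ehi_accl = tdm - tmean[i - 30:i].mean()  # excess over the prior 30 days
        ehf[i] = ehi_sig * max(1.0, ehi_accl)    # positive EHF indicates heatwave
    severity = ehf / ehf85
    return ehf, severity
```

Feeding in a synthetic series with a hot spell produces a positive EHF spike over those days and negative values elsewhere, which is the behaviour the forecast maps display.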

From the paper:

“The intent of these definitions is to create a heatwave intensity index and classification scheme which is relative to the local climate. Such an approach is clearly necessary given the abundant evidence that people and supporting infrastructure are largely adapted to the local climate, in physiology, culture and engineered supporting infrastructure.”

Here are the results for Melbourne- with all its UHI effect of course.

Fig. 1: Decadal (running 3653 day) count of positive Excess Heat Factor (heatwave) days in Melbourne

Decadal cnt pos EHF days Melbourne

Fig.2: Decadal count of Severe Heatwave Days

Decadal cnt severe HW days Melbourne

Fig.3:  Decadal Count of Extreme Heatwave Days

Decadal cnt extreme HW days Melbourne

Notice how Melbourne heatwaves of all types have been increasing and extreme events are currently at the highest level “ever”.

How does this apply to various other Australian locations?  I decided to check with the extremes- the hottest and the coldest Australian locations, Marble Bar in the north west of W.A. and Mawson Base in Australia’s Antarctic Territory.

Fig. 4:

Map

The old Marble Bar station closed in 2006.  I have concatenated the old Marble Bar data with the new, from 2003. This makes very little difference to the calculations but extends the record to the present.

Fig. 5: As for Melbourne, decadal count of heatwave days

pos EHF days marble bar 2

Fig. 6:  Severe heatwaves

count severe HW days marble bar 2

Fig. 7:  Extreme heatwaves

count  extreme HW days marble bar 2

It is clear that local climate does make a big difference to heatwaves by this definition.  In fact, Melbourne has more extreme heatwave days than Marble Bar!

How does this method of detecting and measuring heatwaves deal with Marble Bar’s record heatwave of 1923-24?

According to the Australian Government’s website, Disaster Resilience Education for Schools at

https://schools.aemi.edu.au/heatwave/real-life-heatwave-stories

“Marble Bar in Western Australia holds the record for the longest number of hot days in a row: the temperature was above 37.8°C for 160 days in 1923-24.”

I count 158 days consecutively from daily data at Climate Data Online.  The total for the 1923-24 summer from 13 October to 19 April was 174 days.  That is indeed a long period of very hot weather.
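Counting such a consecutive run from daily data is straightforward; a minimal sketch, assuming a list or array of daily maxima in degrees Celsius:

```python
def longest_hot_run(tmax, threshold=37.8):
    """Length of the longest consecutive run of days with maxima above threshold."""
    best = run = 0
    for t in tmax:
        run = run + 1 if t > threshold else 0  # extend the run or reset it
        best = max(best, run)
    return best
```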

Surprisingly, the BOM does not class that as a long or extreme heatwave.  Apparently, according to this metric, there were only four short heatwaves, one of them severe, and none extreme.  For the entire period, there was only one severe heatwave day – 3 February.

Fig. 8:  Marble Bar 1923-24 summer.  I have marked in the old “ton”, 100 F, or 37.8C.  Squint hard to see the “severe” heatwave around 3 February; the heatwave around 22 February is invisible to the naked eye.

EHF Marb Bar 1923 1924 2

Yes, the old timers at Marble Bar were pretty tough and would be used to hot conditions.  But not to recognise this old record heatwave when every day in over five months was considerably above body temperature is laughable.

For comparison, Figure 9 shows running 182-day counts of days over 100 degrees Fahrenheit, or 37.8 degrees Celsius.  (The old record finishes in 2006.)

Fig. 9:  Running 182 day counts of days over 100 F.  1923-24 is circled.

Days 100F Marb Bar

Note there were two other years when there were more than 170 days over 100F.
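A running-window count like that in Figure 9 can be computed efficiently with a cumulative sum; a minimal sketch (the function name and the NaN padding for the incomplete first window are my own choices):

```python
import numpy as np

def running_hot_day_count(tmax, threshold=37.8, window=182):
    """Running count of threshold exceedances over a trailing window of days.

    tmax : 1-D array of daily maximum temperatures (deg C)
    Entry i counts days above `threshold` in the `window` days ending at
    day i; the first window-1 entries are NaN (incomplete windows).
    """
    hot = (tmax > threshold).astype(float)
    csum = np.concatenate([[0.0], np.cumsum(hot)])   # prefix sums of hot days
    counts = np.full(len(tmax), np.nan)
    counts[window - 1:] = csum[window:] - csum[:-window]
    return counts
```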

Figure 10 is from Figure 16 in the Nairn and Fawcett paper, and is a map of the level of Excess Heat Factor across Australia during the heatwave of January-February 2009.

Fig. 10:  Figure 16 from Nairn and Fawcett (2015)- Excess Heat levels across Australia 21 January – 11 February 2009.

Fig16 from paper max ehf 2009

The area around Marble Bar has a level of between 0 and 10.  My calculations show this is correct- EHF reached 0.08 on 23 January- a mild heatwave.  Readers may be interested to know that maximum temperature was above 40 degrees Celsius from 1 January to 24 January, and minima were not below 24.3.

The authors, and their employer, the Bureau, are in effect telling Marble Bar locals their heatwaves don’t rate because they’re used to the heat.

Now I shall turn to the other extreme- Mawson.

Firstly, plots of the range of minima for each day of the year:

Fig. 11:  Scatterplot of minima for each day of the year at Mawson Base

minima v day Mawson

Fig. 12: maxima:

maxima v day Mawson

Fig. 13:  Decadal count (running 3653 day count) of days with positive Excess Heat Factor, i.e., by definition, heatwave days

Decadal cnt pos EHF days Mawson

Fig. 14:  Decadal count of days in severe heatwave:

Decadal cnt severe HW days Mawson

Fig. 15:  Decadal count of days in Extreme heatwave:

Decadal cnt extreme HW days Mawson

Apparently, Antarctica gets more extreme heatwave days than Melbourne, or Marble Bar!

Of course, critics will say this metric was never intended for use in Antarctica, and I agree: no one would seriously claim there are heatwaves there.  However, if heatwaves are to be defined as “a period of at least three days where the combined effect of high temperatures and excess heat is unusual within the local climate”, and NOT by comparison with any absolute threshold, then this analysis of its use there is valid.  “High” temperature by this definition is relative to the local climate, wherever “local” is. If this metric fails in Antarctica, it fails everywhere.

Conclusion:

The Bureau of Meteorology’s metric for heatwaves is a joke.  It may accurately detect heatwaves in the southern fringe of Australia, and a further use may be to support Dr Vertessy’s outlandish claims.  However, it fails to cope with different climates, particularly extremes.  A methodology that fails to detect heatwaves at Marble Bar, and creates them in Antarctica, is worse than useless- it is dangerous.

After 15 Weeks, the Bureau Responds With Non-Answers

July 16, 2015

On 30 March 2015, in response to some “interesting” claims made on ABC Radio by Dr Bob Vertessy, the head of the Bureau, I sent by the normal feedback channel four questions, summarised below:

Q.1: Can you please supply me with a reference to your data that show that the number one cause of death is heatwave?

Q.2:  Can you please supply me with a reference to your data that show five times as many very serious heatwaves today compared with the middle of last century?  Could you also please tell me your criteria for a very serious heatwave.

Q.3:  In what way can 38.9%, 36%, or 34.1% difference in quadratic change (between trends of the supposedly “raw” Australian Water Availability Project data and those of the ACORN-SAT dataset) be interpreted as “no difference”, “exactly the same story”, or “the same result”?

Q.4:  When can we expect to see the results of this further work (monthly and seasonal analysis of differences between AWAP and ACORN) published on the ACORN-SAT website?  If it is available elsewhere please refer me to it.  I am particularly interested in any difference in quadratic change in summer maxima between AWAP and ACORN-SAT, as this is relevant to heatwave analysis.

I followed up with reminder queries on 28 April, with an email to Bob Baldwin (the Parliamentary Secretary responsible for this farce, the Bureau) on 1 May, a Formal Complaint on 18 May, another email query to him on 15 June, and phone calls to his office on 25 June and 10 July.  In this last phone call I mentioned that I would approach the Opposition Environment Shadow Minister (Mark Butler) if I didn’t get a reply soon.

A reply was emailed to me on Tuesday 14 July.

Unfortunately, Baldwin’s reply contains no straight answers, avoids answering questions, gives misleading answers, contradicts itself, makes debateable interpretations, has at least two links to references that are not valid, and makes no apology or explanation for the delay.

Here is the full text of Baldwin’s reply, emailed on Tuesday 14 July, followed by my comments.

“I refer to your email of 1 May 2015, concerning an email sent to the Bureau of Meteorology’s Queensland Regional Office regarding an ABC radio interview with the Director of Meteorology.

As I am sure you can appreciate, the Bureau deals with a number of important issues in the interests of the public, including many severe weather events across the country.  As such, the Bureau does not always have the capacity to provide detailed and tailored responses to the many individual enquiries they receive.  I have, however, requested that the Bureau provide a full explanation to the four questions you raised in your email dated 30 March 2015.  The responses are below:

1.  Heatwaves kill more Australians than any other natural disaster.  As outlined in Coates et al (2014), from 1844 to 2010, extreme heat events have been responsible for at least 5,332 fatalities in Australia, and since 1900, 4,555: more than the combined total of deaths from all other natural hazards.  Refer:

-Coates, L., K. Haynes, J. O’Brien, J. McAneny and F.D. De Oliviera (2014)Exploring 167 years of vulnerability. An examination of extreme heat events in Australia 1844-2010. Environmental Science & Policy, 42, 33-44, doi:10.1016/j.envsci.2014.05.003.

(http://www.sciencedirect.com/science/article/pii/S1462901114000999)

-Queensland University of Technology (2010) Impacts and adaptation response of infrastructure and communities to heatwaves: the southern Australian experience of 2009, report for the National Climate Change Adaptation Research Facility, Gold Coast, Australia.

(http://www.nccarf.edu.au/business/sites/www.nccarf.edu.au.business/files/attached_files_publications/Pub%2013_10%20Southern%$20Cities%20Heatwaves%20-%20Complete%20Findings.pdf)

2.  The duration, frequency and intensity of heatwaves have increased across many parts of Australia, based on daily temperature records since 1950, from when coverage is sufficient for heatwave analysis.  Days where extreme heat is widespread across the continent have become more common in the past twenty years.  Refer:

-Perkins, S.E., L.V. Alexander and J.R. Nairn (2012) Increasing frequency, intensity and duration of observed global heatwaves and warm spells.  Geophys. Res. Let.., 39, L20714, doi:10.1029/2012GL053361.

(http://onlinelibrary.wiley.com/doi/10.1029.2012GL053361/full)

-Perkins, S.E., (2015) A review on the scientific understanding of heatwaves- their measurement, driving mechanisms, and changes at the global scale.  Journal of Atmospheric Research, submitted.

There are many valid ways to characterise discrete heatwaves and warm spells.  The Bureau has adopted a particular operational heatwave definition motivated by human health considerations, defined as a period of at least three days where the combined effect of high temperatures and excess heat is unusual within the local climate.  This does not preclude the use of other heatwave indices suitable for various research questions.  The bulk of heatwaves at each location are low intensity with local communities expected to have adequate adaptation strategies for this level of thermal stress.  Less frequent, higher intensity heatwaves are classified as severe and will challenge some adaptation strategies, especially for vulnerable sectors such as the aged or the chronically ill.  Refer:

-Perkins, S.E. and L.V. Alexander (2013) On the Measurement of Heat Waves, J. Climate, Vol. 26, No. 13, pp.4500-4517. Doi: 10.1175/JCLI-D-12-00383.1)

(http://jounals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00383.1)

-Bureau of Meteorology Pilot Heatwave Forecast: http://www.bom.gov.au/weather-services/about/heatwave-forecast.shtml.

3.  As shown in the figure below, both adjusted and unadjusted temperatures show that Australia’s climate has warmed since 1910.  Most of this warming has occurred since 1955, when adjusted and unadjusted data are virtually identical.

BOM awap-acorn graphic

4.The Bureau continues to monitor and research Australian temperatures.  This work is ongoing, and not being conducted as part of a specific project.  Therefore, the work is undertaken as resources allow, and not subject to specific milestones and timelines.  However, all significant research will be published and made available in the scientific literature following its completion and peer review.

The Bureau of Meteorology puts a great deal of time and effort into producing research and services around climate variability and change.  The Bureau shares observations daily with the world and its research is peer reviewed and published in high quality international journals for everyone to see.

Noting the wide public availability of scientifically robust climate data and information, I encourage you to seek answers to questions through the publicly available information, such as the references provided above.  The Bureau can provide any further analysis and response on a cost-recovery basis, in line with Australian Government Cost Recovery Guidelines.

Thank you for writing on this matter.

Yours sincerely

Bob Baldwin”

Comments:

There is no mention of my Formal Complaint, just my first email to Baldwin’s office.

There is no apology, and no explanation for the delay in replying by either the Bureau or Baldwin.

Response to Question 1: 

Why doesn’t Dr Vertessy just admit he may have misled listeners by not specifying that heatwaves are the number one cause of death “in natural disasters”?

Coates et al do indeed show that heatwave deaths exceed those of other natural disasters since 1844.

Figure 1:   From Coates et al (2014)- heat related deaths 1844-2010 (click to enlarge)

Coates graph

However, they also clearly show that the number of deaths (and much more so the death rate) was consistently much higher in the first 75 years of last century than in the past 40 years.  While the 2009 heatwave certainly caused a spike in the number of deaths, the mortality rate per 100,000 was eclipsed by the 1896 heatwave, as well as by those of 1908 and 1939, and also 1910, 1912, 1914, and 1927.  It appears from this graphic alone that “very serious heatwaves” were more common in the past than recently.

Figure 2 shows the average daily death rate per 1,000 for Australia from 2002 to 2012 taken from Australian Bureau of Statistics data (monthly death rate divided by the number of days in each month).  It is clear that mortality peaks in late winter, and is lowest in summer (December – April).

Figure 2: Daily Mortality Rate per 1,000 Population, 2002 – 2012

Daily mortality

Unfortunately, cold spells are not recognised as natural disasters, as they occur every year.  Deaths from cold are not limited to hypothermia, or burial under snow, or crashes on slippery roads, or house fires caused by heaters.  Every winter the death rate rises significantly as the sick and elderly succumb to chronic cardio-pulmonary illness, influenza, and pneumonia.  Cold is the real “silent killer”.

Response to Question 2:

Notice how my question, specifically querying five times as many very serious heatwaves today compared with the middle of last century, which is what Dr Vertessy claimed, has been neatly avoided.  The Bureau merely states that heatwaves “have increased”.  Dr Vertessy’s outlandish claim cannot be substantiated.

After quoting Coates et al in answer to the previous question, the Bureau now claims heatwaves can only be analysed since 1950.  If that is so, we can ignore the 70% of all heatwave deaths that occurred between 1900 and 1949, as only 1,378 heat related deaths occurred between 1950 and 2010 (see Figure 1 above).  While exact figures are not available to me, it would be interesting to see the total for floods, cyclones, bushfires, storms, tornadoes, earthquakes and landslides for this period, and whether 1,378 heat related deaths exceeds this.  Does the Bureau not see this contradiction?

Further, Perkins et al (2012) finds increased heatwave trends in percentage of days per season are “confined to… southern Australia”, not “many parts of Australia” as claimed in Baldwin’s reply, which is therefore misleading.  The claim that “days where extreme heat is widespread across the continent have become more common in the past twenty years” is not supported by evidence in this reply, as the second Perkins paper referenced is not yet published.  True or not, this is irrelevant.

Unfortunately, the link to the third Perkins paper does not work.  The Bureau’s Heatwave Forecast appears to be based on a similar metric to that used by Nairn and Fawcett (2015) in calculating an Excess Heat Factor to identify and predict heatwaves.  This at least will be useful.

Response to Question 3:

Again, the Bureau has chosen to avoid answering my question, clinging to their meme of warming since 1910, which I did not dispute, and that the difference between AWAP and ACORN since 1955 is negligible, which also I did not dispute.  My question was whether this negligible difference was evident from 1911, which the Bureau’s own paper (CTR-050) shows to be false.

Response to Question 4:

A short answer to my query would have been “No”.  No analysis of the difference between AWAP and ACORN on a monthly or seasonal basis has been undertaken.  Apparently I am the only person to have done this, and my results showing massive differences in maxima trends, largely due to just two adjustments, have not been falsified.

The final paragraph of Baldwin’s reply could be paraphrased as “Don’t ask us any more awkward questions.  If you do, you can expect to pay for the privilege of waiting three months to get a non-answer”.

Dr Vertessy has failed to substantiate his claims.  After 15 weeks, the Bureau has been forced to make a reply, which avoids answering questions, gives misleading answers, contradicts itself, makes debateable interpretations, has at least two links to references that are not valid, and makes no apology or explanation for the delay.  Thankfully, it does give references to some papers that give some information on heatwave detection.

What a farce.  I am disappointed, but not surprised.

However, I do think Dr Vertessy’s forays into the media world will be much more carefully scripted in future.

The effect of two adjustments on the climate record

June 24, 2015

The warming bias in Australia’s ACORN-SAT maximum dataset is largely due to just two adjustments.

Last week’s Report of the Technical Advisory Forum’s review of the ACORN-SAT temperature reconstruction produced some rather bland, motherhood type statements.  However, hidden in the public service speak was a distinct message for the Bureau of Meteorology: lift your game.  Two of the areas I have been interested in are (a) whether individual adjustments are justified, and (b) the effect of these adjustments on national and regional temperature trends.  In this post I look at adjustments at just two sites, which are responsible for the single largest increase in national trend.

On page 17 of the Report we find the following graphic:

Fig. 1: Scatterplot of difference between AWAP and Acorn annual mean temperature anomalies.

scatterplot awap acorn mean diff

This is a clear statement of how much Acorn adjustments have cooled past temperatures, as AWAP is regarded as being only “partially homogenised”, and close to raw temperatures.   There is a considerable difference- more than 0.2 degrees- between the two interpretations of temperatures 100 years ago.

Mean temperature is the average of maximum and minimum.  In this post I shall look at just maximum temperatures, from 1911 to 2013.  The following graph is a plot of the difference between monthly Acorn and AWAP maximum anomalies, which I think is much more informative:

Fig. 2:

scatterplot awap acorn max months

Note three things: there is a trend of +0.22 degrees per 100 years in the differences, indicating a predominance of cooling of earlier data; there is a very large range in the first 50 years, from about -0.7C to +0.3C (with one outlier at +0.4C), narrowing markedly in the 1960s before widening again in the last 20 years; and the bulk of differences are negative before 1970.

Now let’s look at what has been happening in the past 35 years- in fact, in the satellite era:

Fig. 3: Monthly differences between AWAP and Acorn before and after December 1978

scatterplot awap acorn max phases

The trend in differences for the first 67 years is 0.33C / 100 years, but there is a very small tendency for Acorn to be cooler than AWAP recently- and the range of differences has been increasing.

That’s an interesting find, but I want to examine in more detail the effect of the adjustments which cause those differences.  Here are annual maxima in AWAP compared with Acorn.

Fig. 4: Annual mean of monthly maximum anomalies: AWAP and Acorn

graph awap acorn max

Again we see that Acorn has increased the warming trend from +0.59C to +0.81C per 100 years, an increase of +0.22C, or 37.3%.

However, the difference appears more marked before the mid 1950s.  The next graph shows the trends from 1911 to 1955 compared with the trends from 1956 to 2013.

Fig. 5: Comparison of trends in maxima before and after the middle of the 20th Century.

graph awap acorn phases

Note that while the trends of AWAP and Acorn since the 1950s are very similar (+1.32 and +1.4C per 100 years)- a point the Bureau never tires of proclaiming- before then the plot tells a different story.  Acorn reduces the earlier cooling trend by 0.44C per 100 years, a reduction of 86%.

How was this achieved?

On page 44 of the technical paper CTR-050 we find this explanation:

Returning now to maximum temperature, the differences between the AWAP and ACORN analyses show a marked drop in the early 1930s, with a sudden decrease of about 0.15 °C. This is most likely attributable to substantial negative adjustments between 1929 and 1932 in the ACORN-SAT dataset, indicating substantial discontinuities (expressed as artificial cooling) at a number of individual locations with a large influence on national analyses, because of the sparsity of data in their regions in that period. These discontinuities are mostly related to site moves that are associated with concatenated records for single locations. These include Alice Springs, Kalgoorlie and Meekatharra. Alice Springs, where the adjustment is associated with a site move in late 1931 or early 1932 from the Telegraph Station to a climatologically cooler site in the town, has a notably large “footprint”; at that time there were only two other locations within 600 kilometres (Tennant Creek and Charlotte Waters) which were observing temperatures, while the nearest neighbours to the west (Marble Bar and Wiluna) were more than 1200 kilometres away.

This large change between AWAP and Acorn is shown in the next graph.

Fig. 6: 12 month mean difference in monthly maxima anomalies

graph awap acorn diff 1930 drop

As I explained in my post in September 2014, Acorn sites are homogenised by an algorithm which references up to 10 neighbouring sites.  A test for the validity of the adjustments is to compare the Acorn site’s raw and adjusted data with those of its neighbours, by finding the differences between them.  Ideally, a perfect station with perfect neighbours will show zero differences: the average of their differences will be a straight line at zero.  Importantly, even if the differences fluctuate, there should be zero trend.  Any trend indicates past temperatures appear to be either relatively too warm or too cool at the station being studied.  My aim is to check whether or not individual adjustments make the adjusted Acorn dataset compare with neighbours more closely.   If so, the trend in differences should be close to zero.

I have tested the three sites named above.  I use differences in anomalies calculated from the mean of maxima for the 30 year period centred on 1931, or for the period of overlap if the records are shorter.  The neighbours are those listed by the Bureau on their Adjustments page.
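The difference test described above can be sketched in code.  This is a minimal sketch under my own assumptions: `station` and `neighbours` are hypothetical arrays of annual maxima on a common set of years, and `base` is a mask marking the 30-year base period; a trend near zero means the site tracks its neighbours.

```python
import numpy as np

def neighbour_difference_trend(station, neighbours, base):
    """Trend (deg C per 100 years) of station-minus-neighbour anomaly differences.

    station    : 1-D array of annual maxima for the site under test
    neighbours : 2-D array (n_sites x n_years) of neighbour annual maxima
    base       : boolean mask selecting the base-period years for anomalies
    """
    anom = station - station[base].mean()                            # station anomalies
    nb_anoms = neighbours - neighbours[:, base].mean(axis=1, keepdims=True)
    diff = anom - np.nanmean(nb_anoms, axis=0)                       # difference series
    years = np.arange(len(station), dtype=float)
    ok = ~np.isnan(diff)
    slope = np.polyfit(years[ok], diff[ok], 1)[0]                    # deg C per year
    return slope * 100.0
```

Any residual trend from this function indicates that the site's past temperatures sit relatively too warm or too cool against its neighbours, which is the check applied to Meekatharra, Kalgoorlie and Alice Springs below.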

Fig. 7:  Meekatharra differences from neighbours (averaged)

Meek acorn v neighbours avg

Note that the Acorn adjustment (-0.77C at 1/1/1929; the adjustment of +0.54C at 1/1/1934 does not show up in the national signal) is indeed valid: the resultant trend in differences is close to zero, indicating good agreement with neighbours.  However, since Meekatharra’s record starts only in 1927, an adjustment affecting just two years of data cannot have had a large influence on the national trend as claimed.

Fig. 8:  Kalgoorlie differences from neighbours

Kalg acorn v neighbours avg

Kalgoorlie’s steep cooling compared with neighbours (from 170 km to 546 km away) has been reversed by the Acorn adjustment (-0.62C at 1/1/1930- the adjustment of -0.54C at 1/12/1936 does not show up in the national signal), so that Kalgoorlie now is warming too much (+1.02C / 100 years more than the neighbours).  Kalgoorlie’s adjustment is too great, affecting all previous years.

I now turn to Alice Springs, which ‘has a notably large “footprint”’.  Too right it does- its impact on the national climate signal is 7% to 10%, according to the 2011 Review Panel, p. 12.

Fig. 9:  Alice Springs differences from neighbours

Alice acorn v neighbours avg

Alice Springs, cooling slightly compared with neighbours, has been adjusted (-0.57C at 1/1/1932) so that the Acorn reconstruction is warming (+0.66C / 100 years) relative to its neighbours.  The adjustment is much too large.

And exactly where are these neighbours?

Tennant Creek (450 km away), Boulia (620 km), Old Halls Creek (880 km), Tibooburra (1030 km), Bourke (1390 km), and Cobar (1460 km)!

The site with the largest impact on Australia’s climate signal has been “homogenised” with neighbours from 450 km to 1460 km away- except the adjustment was too great, resulting in the reconstruction warming too much (+0.66C / 100 years) relative to these neighbours.  The same applies at Kalgoorlie.  Meekatharra’s record only starts in 1927 so its effect can be discounted.  These are the only remote Acorn sites that had large adjustments at this time.  All other remote Acorn sites open at this time either have similar trends in raw and Acorn or had no adjustments in this period.

The 37.3% increase in the trend of Australian maxima anomalies in the “world’s best practice” ACORN-SAT dataset compared with the “raw” AWAP dataset is largely due to just two adjustments- at Kalgoorlie and Alice Springs- and these adjustments are based on comparison with distant neighbours and are demonstrably too great.

If it wasn’t so serious it would be laughable.

Open Letter to Bob Baldwin

June 15, 2015

Dear Mr Baldwin

What does it take to get action following a formal complaint?

I draw your immediate personal attention to this matter.

It is now fully 11 weeks since I submitted four simple questions to Dr Vertessy’s office (Reference REF2015-089-17), nine weeks since my follow-up request with a copy to you, and four weeks since I made a formal complaint to you.  Sam Hussey-Smith of your office emailed me on Tuesday 19th May, saying he would “seek to get a response as soon as possible”.

Still nothing.

I may be a mere insignificant individual with a minor query, but surely I deserve to be treated with a little respect, and surely the Bureau of Meteorology, the Environment Department, and the office of its Parliamentary Secretary, all need to demonstrate transparency and public accountability.

Perhaps Dr Vertessy hopes I will get sick of waiting and will lose interest, saving him the embarrassment of an apology and a probable retraction.   He should not underestimate my determination.  The longer he delays, the more it looks as if he has something to hide.

I seek your urgent personal intervention to ensure an immediate response.

Yours sincerely

Ken Stewart

 

Here is my formal complaint, sent 4 weeks ago (18 May).

Dear Mr Baldwin

Formal Complaint re: Dr Bob Vertessy, Director and C.E.O. of the Bureau of Meteorology

It is seven weeks since I submitted four questions to Dr Bob Vertessy, Director and C.E.O. of the Bureau of Meteorology, through the Bureau’s feedback channels, and two full weeks since I followed this up with a complaint with a copy to your office.  The Bureau acknowledged receipt (Reference REF2015-089-17) and an officer of the Bureau has confirmed that my queries were indeed passed on to the Director’s office.  However, there has been no other response at all, either from the Bureau or from your own office.

Seven weeks, Mr Baldwin, seven weeks!  This is beyond simple negligence.  It is now in the realm of conscious breach of the Bureau’s own Service Charter for the Community, proudly displayed at http://www.bom.gov.au/inside/services_policy/serchart.shtml.

Dr Vertessy demonstrably fails to meet several elements of his own Charter, in that:

  • I have not been treated with respect and courtesy;
  • The Director has not been clear and helpful in his dealings with me, and has given no reason for delay;
  • My enquiries, which it appears the Director cannot answer, have not been referred to an appropriate source;
  • The Director has not dealt with my enquiries and subsequent complaints quickly and effectively;
  • The Charter claims the Bureau will “Reply to your letters, faxes and e-mails within two weeks – on more complex issues, our initial reply will give you an estimate of the time a full response will take, and the cost, if any.”  While lower level officers reply courteously well within this time (usually within hours or at most days), it seems the CEO is above this requirement.

It seems the Bureau has a long way to go in its aim to “Develop a more streamlined system of handling your enquiries and feedback on our services”.

I therefore request that you act to obtain for me an immediate reply to my queries from Dr Vertessy.  I also expect his apology and an explanation for not meeting “acceptable standards of quality, timeliness or accuracy”.

Until then, Dr Vertessy’s lack of response speaks volumes about his own credibility as a scientist, a communicator, and the Bureau head, as well as the credibility and accountability of the Bureau of Meteorology as a whole.

Yours sincerely

Ken Stewart