A discussion proposal for the future

Whether you believe in global warming or not, few can doubt that the system of climate monitoring is broken and that public trust in it is at an all-time low. Having observed the “system” from the outside for over a decade now, it is very clear to me that there are a number of huge problems that need addressing. So, in the hope that the US now has a president with the will to address these issues, I will try to get the conversation going as to what needs to change.

Please copy and re-use!!

I’m posting this as a way to try to kick-start the conversation, and you are free to copy, paste and edit it as you see fit. Also, whilst I’m absolutely convinced we need a “firewall” between modellers and those compiling climate metrics, there may be other issues I’ve not considered, and there may be better ways to achieve the aim than my current idea of an “institution”.


“Global warming” as an issue started after the 1970s cooling, during a period when there was a dramatic increase in monitoring and in climate data such as ice cores. These showed that climate had varied dramatically, not just over the ice-age cycle but also within those cycles and in the very recent past. It therefore seemed likely that significant climate change could occur within the lifetime of people then alive.

One of the earliest attempts to predict these potential changes used an apparent cycle seen in the Camp Century ice cores, and it predicted that we were heading toward cooling. As a result, international organisations began to develop to look at climate, and large-scale gathering of already available climate data began. But when the data did not show the predicted cooling, those in climate started looking for alternative explanations, leading to the theory that rising levels of CO2 had offset the predicted cooling. With ice-core data showing that CO2 levels varied over the ice-age cycle, this theory morphed into a theory that CO2 alone controlled global temperature and that rising CO2 would cause large-scale global warming.
International bodies like the IPCC and the COP conferences came into being, which saw their prime aim as informing international governments and the public about the dangers of climate change and (because many of those involved were environmentalists by name or nature) campaigning for the complete elimination of the fossil-fuel-powered global economy.


The main problems can be divided as follows: inadequate data; poor selection of personnel for the task at hand; the absence of any quality system; and organisational aims that tended either to ignore or to distort the poor data, and that ingrained other issues, such as poor and inappropriate selection of personnel, into the system.

Poor data

The simple truth is that the weather monitoring stations were never designed for the job in hand. Many are poorly situated, inadequately calibrated and poorly maintained, even for day-to-day weather monitoring. They were intended to measure day-to-day temperature differences of a few degrees, not to measure absolute temperature to within a fraction of a degree, reliably and repeatably, for decades or even centuries at a time. The sites are owned and maintained by a variety of organisations with very different standards. Some are undoubtedly world class for their intended purpose. But it is very clear that this high weather-monitoring standard is not the same as the standard required for long-term climate monitoring, AND it is clear that many sites are not even good enough for short-term weather monitoring.

Poor Selection of personnel

The first problem is that almost no one has ever been employed for the task of monitoring climate. Instead, those charged with collecting data are usually employed by weather monitoring agencies, whose work, although similar in many ways, is very different in terms of the quality of absolute long-term measurements required. The data is then collected (with apparently no quality control system) by climate researchers whose primary job is to understand the climate rather than monitor it. They either do not understand, or have no interest in, the importance of data quality and the necessity of ensuring that the calibration of equipment is suitable for the intended purpose.
And even those who took on the task of compiling the climate data were not chosen for their skill at data handling. For example, one of the Climategate emails revealed that Prof Phil Jones, who compiled HADCRUT, had problems using a simple spreadsheet.
Those involved were not employed for their data-handling abilities, nor for their understanding of the complexities of instrumentation engineering. Instead they tended to be academics or academic-like researchers, often from a computer modelling and/or environmentalist background, with no experience or understanding of meteorological instrumentation.

No Quality system & extremely poor quality control

Recognised international quality systems like ISO9000 require robust systems to ensure control over procedures and processes, particularly relating to the calibration of equipment, the handling and storage of data, and the selection of staff. Key to such a system is regular auditing by trained individuals to ensure that staff and processes comply with the documented quality procedures.
It seems unlikely that any of those compiling any metric of “global temperature” have even a rudimentary quality system, let alone one to a recognised international standard. This has created a number of issues:

  • No control of the quality of incoming climate data, and no system to track issues or remedy poor-quality data entering the system.
  • Apparently random, ad hoc and seemingly unexplainable changes which, even if made for good reasons, give the appearance of fiddling the data to produce a desired outcome.
  • Poor data retention, leading, for example, to the loss of historic data by the University of East Anglia, which makes it impossible to verify the quality of any records based on that data.

Inappropriate organisational aims

If an organisation has aims that are incompatible with the task it is intended to perform, then irrespective of the personnel and procedures, it will not perform that task as well as it should.
There is no doubt that almost all those involved (with perhaps the notable exception of Dr. Roy Spencer at UAH) see their role as proving to governments and the public that the world is already warming, and demonstrating that immediate action must be taken to “turn off” the fossil-fuel-powered global economy. These are the aims of a campaigning organisation set on changing the nature of the global economy, and as such many have adopted the standards and working methodologies of campaigning organisations.
For example, in the UK, those compiling “global temperature” were working hand in glove with those who worked with the press to ensure that any abnormally high temperature readings received maximum publicity. This resulted in a distinct relationship between the date of release of temperature data and the temperature, indicating that the release date was being manipulated to fit publicity requirements. Whilst this did not in itself show manipulation of the temperature data, when there is such a close relationship between those compiling data and those “manipulating the media” to push a particular viewpoint, it invariably means the “manipulators” will also put pressure on their own staff to “deliver the goods” to help with the publicity. At the very least, it is extremely likely that those compiling the data will be asked to “look again” at any “adverse” readings, whilst favourable readings will be welcomed. Within such an environment of praise for certain trends, and condemnation of those producing adverse trends, it is inevitable that the organisation will tend to favour the individuals and procedures that produce trends favourable to the organisational aims.
Another related issue occurred at NOAA and NASA. Having produced climate models that predicted warming, these organisations were then faced with over a decade without significant warming. Rather than admit their models were very likely wrong, and thereby improve the models and predictions, which would have been the scientific way to handle the situation, the political pressure on these organisations to show that warming was occurring meant they were compelled instead to find new ways to calculate “global temperature” that were compatible with their predictions (and with the then-current US administration).


Create a global monitoring network designed for the task in hand

As an instrumentation engineer who has designed laboratory temperature monitoring and control equipment, I know that even under laboratory conditions it is difficult to get a measurement to within 0.1C. As someone who has installed and monitored weather stations (ironically, for windmill sites), I knew the temperature readings would be affected by insects, rain and sun, and that realistically there was no point recording the value to better than 1C.
Even with a very big budget (£millions) and good staff, I would be hard pressed to create even one temperature station that I could, hand on heart, say could measure local temperature change to ~0.1C over decades. But if someone asked me to do it globally, I would honestly have to say that I am not up to the job (how Prof Jones, who had no relevant experience, thought he could do it is beyond me), because it is such a complex issue, involving world-beating precision instrumentation engineering and an understanding of site selection that draws on meteorology and ecology. And because it is global, it also needs negotiations with countries worldwide to set aside considerable quantities of land** for the sole purpose of producing long-term reliable metrics.
However, it needs doing. And whilst I am inclined toward a surface-based system, I might go for a combination of satellite and ground-based monitoring. But the design itself is a hugely complex issue (not something that can be done down the pub, as seems to be the current methodology).
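To illustrate why station quality matters more than station count, here is a sketch of my own (all numbers hypothetical, not from any real network): averaging readings from many stations does shrink random measurement noise, but any systematic bias shared across stations, such as poor siting or calibration drift, survives the averaging untouched.

```python
# Sketch with hypothetical numbers: averaging many stations reduces
# random noise roughly as 1/sqrt(N), but a shared systematic bias
# (bad siting, calibration drift) is not reduced at all.
import random

random.seed(1)

N_STATIONS = 1000
TRUE_TEMP = 15.0      # hypothetical true mean temperature, in C
RANDOM_NOISE = 1.0    # per-station random error, ~1C as discussed above
SHARED_BIAS = 0.3     # hypothetical bias common to every station

readings = [TRUE_TEMP + SHARED_BIAS + random.gauss(0, RANDOM_NOISE)
            for _ in range(N_STATIONS)]
mean = sum(readings) / N_STATIONS

# The random part of the error of the mean is only ~1/sqrt(1000) = 0.03C,
# but the residual error stays near the 0.3C shared bias, which is larger
# than the ~0.1C changes a climate network is supposed to resolve.
print(f"error of the network mean: {mean - TRUE_TEMP:+.2f} C")
```

The point of the sketch: no amount of averaging over stations removes a bias they all share, which is why the design, siting and calibration of each individual station is the binding constraint.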

Create firewall between climate modelling and climate measuring

One of the huge problems with climate is that, due to the slow rate of change, the subject is data poor, but, due to the huge global interest, it is rich in “research” interest. The result is that there is much more “research” being done than there is data available to support it, particularly when it comes to a single metric like “global temperature” (there are at best 12 data points a year). This has led to a plethora of groups creating various predictive models based on very much the same data. So the area is rich in speculation and poor in the data needed to confirm or refute that speculation, which inevitably takes an inordinate time to arrive.
With huge political and academic pressure to publish, speculation has run years, decades and in many cases centuries ahead of the available data.
And with new data taking years or even decades to become available, considerable effort has been put into finding new, “better” methods of analysing the current data. For obvious reasons, any researcher in control of that data has tended to favour methods that make the data and their models fit each other. The result has been huge pressure to find new methods of handling the data, and a selection for those methods that make the already available data better fit the predictions of the models (and the requirements of policy makers).
Therefore, we must create a “firewall”, both between climate modellers and those compiling climate metrics, and between those compiling the metrics and politicians. In other words, create entirely separate and apolitical organisations: apolitical in the party-political sense, but also in the sense of being removed from the academic politics of climate. With most data modellers being academics, and with academics not renowned for their adherence to recognised quality standards, this means that the compilation of climate data should not be undertaken by academics.

Create institutions with appropriate goals and quality system

We therefore need to create organisations with aims appropriate to the task in hand. Let me contrast two possible aims, as they might be framed by academics and by instrumentation engineers:

Academic: to work to understand climate and the effect of human change on it

Instrumentation engineer: to design and build a quality system capable of measuring changes in the global climate, reliably and repeatably, to within +/-0.2C (or whatever standard is appropriate, and similarly for rainfall, windspeed, pressure, humidity, insolation, etc.)

The appropriate organisation will be some kind of “institute”, or indeed multiple bodies, with staff selected for their data-handling and meteorological instrumentation skills. Its prime aim will be to produce quality climate data. It will not be involved in modelling the climate (except in so far as this is necessary for site selection) and it should not, in any shape or form, produce public statements predicting the future course of climate.

Future of IPCC

The IPCC is a mess. It is ostensibly a governmental body, but it relies on the free contribution of many researchers, which tends to select those who have the biggest axe to grind on the issue of climate, rather than those who are best at their job.
There is a need for an international scientific body to study the scientific evidence on climate and to present it in a form understandable to those without specialist qualifications. However, rather than do this “on the cheap”, with the result that those with a campaigning passion, or even those directly employed by campaigning organisations, tend to get involved, it needs funding to ensure the best researchers are willing to contribute.
There is also a separate need for an organisation, or organisations, to take those predictions of climate and work out what they mean in terms of government policy. No doubt, given the number of people who want to have a say, such bodies need little assistance to form. But again, there needs to be a firewall between the assessment of the science and the development of policy based on that science.
As such, the IPCC should be reformed. Anyone who has campaigned, or wishes in future to campaign, on environmental issues should be removed, leaving only those with a disposition to look at the evidence from an impartial scientific standpoint. These people also need to be paid for the time they contribute, to ensure the best researchers are willing to give it up.
One key role the IPCC (or a replacement organisation) should undertake is to assess the validity and predictive skill of any climate models. This is something that is currently not done. Indeed, I will go as far as to say it is something that has been intentionally hidden, so as to prevent the embarrassment of admitting the failure of previous predictions.
However, if the IPCC (or its replacement) is to learn to be more modest and sober in its speculation, it is critical that it assesses its own previous performance and takes remedial action.

**I would suggest each station needs at least 1 km² of land under direct control, with perhaps a total of 100 km² over which it has some say on land-use change, to avoid the problem of rural and urban heating caused by local changes in plant evaporation levels.
