Peak Global Warming

A bit of personal forward planning
As far as public interest is concerned, global warming peaked in 2007 (Google Trends). As far as academic (aka “scientific”) credibility is concerned, it peaked in 2009 with Climategate – not so much for the clearly dubious behaviour of those involved revealed in the emails, but because that behaviour was not merely condoned: a massive cover-up at the highest levels of “science” took place to keep the global warming scam going – and their continued politicking and denial of mother nature and her pause has only driven the stake home.
But from a data point of view – the peak is odd. Based on long-term cycles it is likely that global warming peaked around 2010. That is to say, if we average over 30 years, the maximum is likely to occur in the average from 1995–2025. But, obviously, we don’t have that data yet, nor will we have it for some time – so that is still in the “hypothesis testing” stage. So we are stuck with the shorter records, which appear to tell a very different story.
Because in the shorter yearly records the short-term El Nino spikes have a pronounced effect, it is almost certain that the “peak” of global warming will occur in an El Nino year, and so far the most likely contenders are 1998 and 2015/16. And based on current data, 2016 is not matching up to the record of 1998 (wouldn’t it be ironic if the peak were 18 years ago – it’ll really make the assurances by the idiot academics look all the more ridiculous in the history books!).
So, now I am trying to plan my year. I don’t want to have a lot of time available in the “muddle” of an El Nino when all we will get is moronic copy-n-paste commenters claiming the “pause has ended” just because it’s an El Nino year. Nor do I want to be tied up with other things at the very time the El Nino falls down the other side and we are wiping the smile off the faces of the copy-n-paste alarmists.
So, I have a real personal interest in trying to work out or predict the end of this particular scam.
When will we know it is over?
Checking the data on the UAH graph, the earliest indicator would be around a year of temperatures below “average” (pseudo 0C). Alternatively, any single month below “-0.3C” would be indicative of cooling, and below “-0.4C” has only occurred before the “pause” … but hang on!!! … I’m thinking like a sceptic. Sure, we’d need a whole year below 0C to start being convinced … but the poor alarmists are such sensitive souls!!! I’m sure even one month below 0C would have them quaking at the knees, throwing fits and throwing their rattles out of their pram.
Looking back at the graph, it appears that from the peak, it takes roughly a year for the global temperature to drop down.
Likely Future
Ok – here’s the likely scenario: at some point we should get our 2015/16 “peak”, and we’ll know it has occurred with moderate confidence when temps drop 0.2–0.3C below the peak. By my rough estimates that is a 3–4 month lagging indicator – and from that point onward, until around a year after the peak, there is (usually) a strong decline in temperatures. If I assume peak temperatures occur around March, that suggests June/July will be the start of “making hay”, with a peak-global-warming-eco-nutter-totally-pissed-off-with-mother-nature-not-doing-what-she-is-told toward the end of 2016 and throughout 2017.
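As a sanity check on that rule of thumb, here is a minimal sketch of such a lagging peak detector. The 0.25C threshold and the anomaly series are purely illustrative values of mine, not real UAH data:

```python
def confirm_peak(anomalies, drop=0.25):
    """Track the running maximum of a monthly anomaly series and report
    the peak's index once temperatures have fallen `drop` degC below it.
    Returns None if no peak has been confirmed yet."""
    peak_val, peak_idx = anomalies[0], 0
    for i, t in enumerate(anomalies):
        if t > peak_val:
            peak_val, peak_idx = t, i
        elif peak_val - t >= drop:
            return peak_idx  # confirmed, a few months after the fact
    return None

# Illustrative series: anomalies rise to a peak then fall away
series = [0.1, 0.3, 0.5, 0.8, 0.7, 0.6, 0.5, 0.4]
print(confirm_peak(series))  # peak at index 3, confirmed at index 6
```

The point of the lag is visible in the example: the peak (index 3) is only confirmed three samples later, once the decline exceeds the threshold.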
From 2017 to 2020 we’ll then get back to “natural variation”, but likely continuing the “19, 20, 21, 22 years of pause” with an increasing likelihood of someone shouting “but it’s actually cooling” …
Around a decade later than everyone else (2025?) the academics will start saying they have “discovered” a “60 year cycle” – which somehow explains the pause … they’ll award the idiot who “discovered” it a Nobel to claim it as “theirs” (it won’t be called a cycle – just as the pause became a “hiatus” – it’ll be something more grandiose: “modulation” – no, that’s not long enough … it needs to sound “scientific”, as if some 150-year-old Latin teacher spat it out … it needs to be very “anharmonious”) … but I digress.
Whatever bullshit name they call it, it will be claimed that “when this is taken into account that we are seeing ‘unprecedented warming’.” Then around 2030 we’ll start to see warming again and the idiot academics will claim they have been vindicated and that this is finally “proof that private industry is ‘evil’ – because they are causing people to not get quite so cold in winter … or if they can afford the fuel … not getting anywhere near as cold as the eco-nutters (with high paid public sector jobs) would like them to get”.
Note: by 2030 – will the university sector still have anything like the same authority it once had? With foreign Universities effectively taking over (largely because our Universities are now so filled with PC idiocy that they will not be able to compete) it may no longer matter what US and particularly UK university staff think.
Contingency futures
Even with unusual warming, it is unlikely that we’d get anything like the idiotic predictions of the eco-nutters in the IPCC (0.6C since 1996). Warming or cooling of around 0.5C in 30 years occurs 0–2 times a century. Warming like the 1690–1730 warming of 2C has only occurred once in 3.5 centuries in CET. Thus, it is likely in the next century we will see warming or cooling of around 0.5C in 30 years, and there is around a 30% chance of warming or cooling of up to 2C (over 40 years).
Unusual cooling: it is quite possible we could see an “unusual” cooling event (i.e. it looks unusual in short-run data). For the sake of argument, let’s assume this is 5 years of colder temperatures than I would expect. This will certainly cool the ardour of the global warming idiots. However, based on previous behaviour, this will merely turn into “cooling is just another thing we expect from man-made global warming”. And just as the same people who pushed the global cooling scare pushed the global warming scare, so the same evil machine pushing global warming will simply be turned into a global cooling scam.
However, I suspect that although it will be very easy for academics to convince themselves they have “found” something, I can’t see the public falling yet again for another global-bullshit scam in the same way they did. The (once) mainstream press will no doubt use any global-bullshit scam to go overboard in a vain attempt to boost their dwindling sales – but at least that might focus on real problems like winter deaths.
Unusual warming: just as unusual cooling is quite usual (yes, the irony is intended), so unusual warming is also quite likely. Unfortunately, with the fraudulent believers inventing surface “temperature”, unless there is a sane Republican in the US who has a general clear-out of the eco-nutters, it is almost certain they will just “get rid of the pause”, like they “got rid of the medieval warm period”, “got rid of the 1940s warm period” and “got rid of the 1970s global cooling scare”.
So, a long-term warming event would require a huge effort to prevent the “lunatics finally not only taking over the asylum – but knocking down the asylum with themselves and a lot of other people in it”.
It’s all down to the public’s common sense
The big question: at what point will the public have the sense to turf out the lunatics running the asylum? But I forget – most of the public are already treating global warming fanatics like deranged lunatics! The problem isn’t the public – who are rightfully sceptical of all academia tells us – it’s the deranged lunatic politicians who keep lapping up their insanity. And the problem (in the UK) is that there’s been effectively no choice (although will UKIP change that?)
So, here I must leave the subject because if there’s one thing I cannot predict – it is when deranged politicians will realise they’ve been conned.
 

Posted in Climate | 11 Comments

8 Reasons why global surface temperature is crap

There are many reasons to believe the real temperature trend is much cooler than the corrupt figure given by the surface data.
1. The data is massively adjusted (equivalent to all the warming since 1940)
2. The stations just happen to have been located conveniently on the edge of urban areas – the areas that were most subject to change in the 20th century as urban areas increased.
3. There is proven poor siting (which causes additional warming relative to rural sites). The problem here is that they give a nod and a wink to poor sites that help increase the warming trend, where a quality organisation would have dealt with them.
4. Then there are the multiple instances of site-specific tampering with data – which look like individuals changing sites to cause additional warming on top of everything above.
5. Then there are the frequent changes in methodology, selecting those that show the highest warming.
6. Then there is the way they intentionally remove sites with a cooling trend from the data (possibly as part of 5).
7. There is the fact the temperatures are far from global with e.g. most of central S.America and Africa being missed out.
8. Then there is the fraudulent misuse of ocean temperature data (changing good data without a warming trend to fit bad data with a warming trend).

Posted in Climate | 4 Comments

A complete explanation of the ice-age cycle.

This article is one of a number I’ve written developing my ideas on the ice-age cycle. I will try to summarise the key ideas and explain how they fit together. However, with such a complex subject and so many new ideas I cannot cover everything. Instead I will focus on the key ideas and ask you to read other articles where necessary.

Summary

The Caterpillar theory is just a statement of science – in that warming and cooling will cause expansion and contraction, and that must modulate volcanic activity. The only part needing additional evidence is to show that heating and cooling of just the top few kilometres (the most rigid part) dictates the plate movement. If so, the 40–100k year delay between ice-age cycles represents the time it takes for heating and cooling to penetrate the crust.
I have explained in previous articles how the Milancovitch orbital changes can trigger expansion. When crust contraction is slowing down as the temperature equalises at the end of the ice-age, a very modest induced warming from the change in orbital cycle can turn contraction into expansion. If this expansion then leads to the emission of volcanic gases, and that in turn leads to further warming, this explains how a very modest Milancovitch-cycle change in temperature can “trigger” a massive additional change.
However, I still needed to find a plausible explanation linking volcanic emissions to the runaway warming that takes us from an ice-age to an interglacial. For this, the emission of volcanic gases must trigger further warming – but it is also necessary to explain why this “runaway” warming stopped, because unless there is a mechanism that stops warming as we go into the interglacial, warming could not stop and the world would fry (which is why runaway global warming in an interglacial is nonsense).

Fig xx

Clouds


It is well known that the ice-age periods are not only colder but drier than the inter-glacials. Therefore a very likely candidate for stopping this runaway warming is increased water on the land surface. And the most likely mechanism for this is that increased water in the earth’s atmosphere leads to increased cloud cover – which in turn reduces the transmission of heat and light, blocking the sun’s heat. Thus a small increase in water near the “saturation” level of the atmosphere can rapidly lead to clouds that stop further warming (as on a sunny day). Thus water can create a very stable “end stop”, effectively preventing further warming.
But it has been problematic explaining how volcanic emissions led to warming. No single effect is enough to explain the ice-age cycle. CO2 has been suggested, but the direct effect of additional CO2 is very modest (and certainly not sufficient to explain the massive warming). Previously I had tentatively suggested CO2-induced changes in plant growth which in turn moderated H2O (a far more potent greenhouse gas). The suggestion here was that CO2 promoted plant growth (plants love CO2) and in turn, because plants take up water from the soil and release it to the atmosphere, the increase in plant cover would result in a large increase in atmospheric water vapour. Water vapour, being a potent greenhouse gas, would then massively increase the greenhouse effect from the very modest contribution of CO2.
However, given the small land surface relative to oceans, although it is possible to envisage a large change in atmospheric H2O across the land surface, it was not possible to envisage a similar change across the oceans. Thus the scale was still not large enough to explain the full change in ice-age temperature.
I have also suggested changes to the Hadley cell structure (re-enforced by changes in ocean currents). A change in the Hadley cells would lead to dramatic changes in climate, and the accompanying change in oceanic currents would tend to re-enforce any change (so it would tend to be bi-stable) – but the mechanism explaining why this change would occur in the first place was missing.
Now however, if we add a change in atmospheric pressure to the mix, we not only have a direct greenhouse warming effect due to the change in the pressure of the atmosphere, but also increased pressure substantially promotes plant growth, and also pressure has a direct impact on atmospheric physics providing a plausible explanation for a flip in the Hadley Cell structure.
Put together I feel that the scale of warming is now sufficient to explain the massive warming seen as we come out of an ice-age.

Introduction to (real) Atmosphere greenhouse model

(Note: this is not the so-called “skydragon” model)
The (real) atmosphere greenhouse theory is a model developed to explain & incorporate both the heat-trapping properties of so called “greenhouse gases” as well as the pressure induced greenhouse effect.
One of the main advantages of this model is that it avoids the non-scientific aspects of the “heat trapping” model, where CO2 (and other greenhouse gases) are only considered as trapping IR (an assumption that breaks the laws of physics, because real molecules like CO2 both emit and absorb IR – and this and other non-physical aspects of that model cause many people to dislike it passionately).
In the real atmosphere, greenhouse gases both emit and absorb IR, and the scale and sign of this effect is a function of the temperature of the CO2. This important science is ignored in the heat trapping theory. The result is that the heat trapping model glosses over very important aspects of heat transfer in the atmosphere, with the result that the implied “atmosphere” has no mechanism to explain or maintain its own temperature – a very serious problem when this is supposedly the main parameter it attempts to explain.
So this key parameter – the temperature – is undefined, unconstrained, unrealistic and so, whilst all models are just simplifications, the heat trapping model is particularly bad as it breaks some very basic physics. This does not mean there isn’t a greenhouse effect – it just means the heat trapping model distorts the reality of what greenhouse gases actually do and thereby leads to false conclusions.

Removing all the irrelevant components from the Penn State "opaque atmosphere" model, we are left with the above.

The (real) atmosphere greenhouse model treats the atmosphere as a real body with a temperature profile caused by the lapse rate (as shown schematically above). And the “greenhouse” effect is the difference between the temperature at the effective radiative layer and the ground.
The key is to understand that the key temperature of the earth is not the surface – but instead it is the temperature “as viewed from space”. This temperature is easily calculated from a simple black body model of the earth.
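As a rough check, that black-body temperature can be computed from the usual textbook energy balance; the solar constant (~1361 W/m²) and albedo (~0.3) below are the standard values, not figures from this article:

```python
# Effective radiating temperature of the earth "as viewed from space":
# absorbed sunlight S(1 - a)/4 balances emitted radiation sigma * T^4
S = 1361.0        # solar constant, W/m^2
albedo = 0.3      # fraction of sunlight reflected back to space
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

T_eff = (S * (1 - albedo) / (4 * sigma)) ** 0.25
print(round(T_eff))  # ~255 K (about -18 C), well below the ~288 K surface
```

The ~33 K gap between this figure and the surface temperature is exactly the “greenhouse” difference the model describes.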
The main concept to help explain how the greenhouse effect works is to view the atmosphere as having an “effective radiation level” – a virtual layer at the average height at which radiation leaves the earth across all radiation frequencies. Or in terms of the model: it is the height at which the atmospheric temperature is the same as the average temperature of the atmosphere viewed from space.
Because we know the temperature of this virtual layer – and we know that for most IR frequencies emissions occur from near the top of the cloud layer, it is easy to see how the temperature profile from this virtual layer down to earth creates the greenhouse warming effect.
But, because there is a great deal of variability of the opaqueness of the atmosphere at different radiative frequencies the effective height at any frequency can vary dramatically. So this “layer” is only a “virtual layer” representing an average across all wavelengths.
However, air movement and cloud formation are among the main heat flows in the bulk lower atmosphere, and clouds are one of the key vectors for emitting heat away from the earth. The atmosphere is largely opaque below this layer and transparent above it, so this “virtual layer” can be very loosely equated with the tropopause (the top of the main cloud layer).

Implications of (real) atmosphere greenhouse model

Thus the “greenhouse” effect is the difference in temperature between the “radiative layer” above the bulk of the atmosphere and the surface. This is like any insulator, such as a coat – there is a heat gradient creating a barrier between a cooler and a hotter region. But in the atmosphere the thermal gradient is fixed, because the temperature change is mainly due to a physical property of the atmosphere: the (largely) fixed temperature change with height.
This temperature change is due to the potential energy gained by rising air. To explain simply: for 1 kg of air to rise 1000 m, roughly 10,000 joules of energy are required. In the atmosphere, this energy comes from heat, so heat is lost as air rises and is turned into potential energy. If we know the heat capacity of air we can calculate the lapse rate. This figure is around 9.8K/km for dry air and ~6.5K/km for moist air.
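The arithmetic in that paragraph is easy to verify; the heat capacity value below is the standard one for dry air:

```python
g = 9.81     # gravitational acceleration, m/s^2
cp = 1004.0  # specific heat of dry air at constant pressure, J/(kg K)

# Potential energy gained by 1 kg of air rising 1000 m (paid for in heat)
energy = 1.0 * g * 1000.0
print(round(energy))  # ~9810 J, i.e. the "10,000 joules" quoted above

# Dry adiabatic lapse rate: g divided by the heat capacity of air
lapse_dry = g / cp * 1000.0  # K per km
print(round(lapse_dry, 1))   # ~9.8 K/km, as stated for dry air
```

The moist rate (~6.5 K/km) is lower because condensing water vapour releases latent heat as the air rises, partly offsetting the cooling.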
Thus, the greenhouse temperature can be changed in two ways:

  1. By a greater quantity of gas – so that the air is more dense and the atmosphere is thicker, which causes the effective radiation level to rise (hence a greater height, and so a greater potential-energy difference between the surface and this “layer” – thus more energy changes from heat to potential energy as air rises to this level)
  2. By changing the mix to include more radiatively opaque molecules like CO2 and H2O – which, being opaque, tend to raise the effective radiation height, thus increasing the temperature rise from this “layer” down to the ground.
    (The best analogy I have here is to imagine a wood of trees in winter – with a few evergreens like holly. If the wood is the “atmosphere”, then the hollies are greenhouse molecules. If there are no hollies we can see quite some distance into the wood, but even a few holly trees can dramatically reduce the distance we can see. Similarly, though few in number, a small change in CO2 can significantly change the effective radiation height.)

Implications of the real atmosphere model to long-term climate

Whilst all models are just simplifications, how we simplify a system can be a powerful scientific tool (and in turn also a political tool).
The problem with the heat trapping model is that it implies “an imbalance” and thus promotes the idea of “runaway warming”. The (real) atmosphere greenhouse model is just another simplification of the same far more complex climatic system. However, because it sees the addition of CO2 as a small step change in the effective radiative height, it rightly sees the effect as a small change from one equilibrium position to a slightly different one (about 100m–400m for a range of 0.6 to 2.5C).
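That 100m–400m figure can be cross-checked directly against the moist lapse rate given earlier (~6.5 K/km):

```python
lapse = 6.5  # moist-air lapse rate from earlier, K/km

# Surface warming implied by a rise in the effective radiation height
for dh_m in (100, 400):
    dT = lapse * (dh_m / 1000.0)  # K
    print(f"{dh_m} m rise -> {dT:.2f} C of surface warming")
# 100 m gives 0.65 C and 400 m gives 2.60 C, matching the 0.6-2.5C range
```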
However, sometimes models also provide insights that were not obvious: I now realise that a simple truth had been staring at me through this model. I had been focussing on the change in composition and its effect on temperature, but in doing so I was ignoring the other important effect – and whilst composition is important, volume is by far the bigger potential change. So it follows that when we see a temperature change of the earth, the first place we should look for any “culprit” (unless there is strong reason against it) is in the pressure (or volume) of the atmosphere, and not the gaseous composition.
So, when we look at the average temperature of the globe, such as the graph to the left which is intended to show temperature versus CO2, as a first approximation we should consider the average temperature to be a proxy for average pressure. That means a rise in temperature is most likely indicative of a rise in the pressure of the earth’s atmosphere (or, as the two are linked, its volume), and vice versa a drop in temperature is most likely due to a drop in pressure.
Obviously the composition, and particularly the relative composition, of the atmosphere is important, as are the mechanics of cloud formation. But generally we can describe the temperature as the combination of two effects, so we might find an equation of this form:

T = ε f(P)

Where f(P) is given by an approximation similar to the empirically derived formula shown below:

Click to see original WUWT article


And ε is an effective emissivity representing the change in relative composition in IR interactive gases (an effect which is itself a complex function of gas composition, pressure and other non-linear effects like cloud formation).

Implications to Ice-age cycle

Below is plotted a proxy for global temperature, showing that the global temperature has changed significantly in a cyclic way for at least the last 2.5 million years.
According to the (real) atmosphere greenhouse theory, there are two main factors (three if we count clouds separately) that can change:

  • Total quantity of the atmosphere
  • Relative amount of IR interactive (or Greenhouse) gases.

And, unless we have evidence to the contrary, not only must we consider both possibilities, but by far the stronger effect comes from total volume of gas in the atmosphere.

The leaky atmosphere hypothesis

For reasons which are obvious given the necessary escape velocity of ~40,000 km per hour, and the infinitesimal amount of the atmosphere that reaches this speed (even at the elevated temperatures in the ionosphere), the earth is usually considered a sealed container from which nothing can escape.
However, that is usually falsely assumed to mean that nothing can escape the atmosphere at all – which is then taken to mean that the atmospheric volume is constant. An easy mistake to make – but upon reflection, given the huge bulk of the earth and the thin layer of atmosphere, it is just implausible. If for no other reason than this:

if the atmosphere is a sealed container, how then can the composition of the atmosphere change?

The answer, of course, is that whilst the atmosphere is sealed against space, it is far from sealed against the ground.
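The ~40,000 km/h figure quoted above follows from the standard escape-velocity formula v = √(2GM/R), using textbook values for the earth:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24   # mass of the earth, kg
R = 6.371e6    # mean radius of the earth, m

v_escape = math.sqrt(2 * G * M / R)  # m/s
print(round(v_escape / 1000, 1), "km/s")  # ~11.2 km/s
print(round(v_escape * 3.6), "km/h")      # ~40,000 km/h, as quoted
```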

The inverted earth

The Inverted earth model (here height = 1/real height)

If this isn’t obvious, let’s have some fun! Models are important, and so in order to visualise the real earth, imagine we redraw the earth “inverted”. Now instead of the centre of the earth (as left) being a fiery ball, the centre is the cold of infinite space; and instead of the “outside” being the cold of space, the “outside” is the earth’s interior. Now the “sealed container” idea of the atmosphere starts to look ridiculous. Because whilst the atmosphere obviously cannot leak “down” into the impenetrable interior, it can and obviously does leak into and out of the earth’s crust.
OK, that is just some fun, but conceptually, when we start considering the atmosphere leaking down into the ground, the limit that atoms cannot easily escape into space has disappeared. Now the limit is how much atmosphere goes “out” into the ground and comes “in” from it. And now there are many mechanisms whereby gases can be and are lost:

  • Volcanic activity
  • Burning fuels
  • Decomposition
  • Nitrogen fixation by plants
  • (very slightly) Acid rain from CO2 and chemical reactions.
  • The pesky snails (and other shelled animals) stealing carbon and depositing it in rocks
  • Oxidation of minerals and carbon.
  • etc.

Hence there are very good reasons that over geological periods we may see dramatic changes in the earth’s pressure. And if the atmosphere were “leaking away”, this would explain the long cooling we see as we go from inter-glacial down to the coolest part of the ice-age just before it is triggered into warming. But what then causes this warming?

The caterpillar theory

Thanks Josh cartoonsbyjosh.com

Thanks Josh cartoonsbyjosh.com


For more depth see here
The caterpillar theory simply states that when the earth’s surface heats up the crust also heats up and expands – and that forces the top layer of rock at plate boundaries to be pushed together, so that one plate is forced down below the next. It also states, (drum roll) that when the earth cools down (trumpet call) the rocks ….. (can you guess) … yes they contract (sorry this is just so obvious). This in turn causes the rocks to be pulled away from each other. And this effect is most pronounced at mid-oceanic ridges.
And very strong evidence that this is happening has recently been found, in the form of a modulation of the rock types at the mid-oceanic ridges.
Volcanic activity is involved in both parts of the cycle. But because we have two types of volcanic activity – subduction (where rocks are pushed down and thermally decomposed) and simple contraction – the two parts of the cycle promote different types of rock formation, and (through thermal decomposition of surface rock with lots of carbonate rocks and hydrocarbons) the various parts of the cycle favour the release of CO2 and hydrocarbons (like oil and gas).
In addition the theory states that the time between cycles is determined by the time it takes for the “hard” surface rocks to expand/contract, and that the process of heating is part of a runaway warming effect initially triggered by a modest amount of warming from the Milancovitch cycle (but please see the more detailed explanation).
However, the problem with the theory as originally stated was that there was no plausible mechanism whereby the triggering of volcanic eruptions would lead to the necessary change in temperature. (The relative change in CO2 concentration being far far too small).
At the time I hypothesised two other mechanisms:

  1. A change in the Hadley cell structure (see explanation here) – something very hard to explain – because it required a dramatic change to the atmosphere
  2. A change to the H2O cycle whereby CO2 promoted plant growth, which in turn vastly increased transpiration and then led to a massive increase in cloud cover.

However, neither looked promising as a total explanation. The first required a massive change with no obvious mechanism to trigger that change. The second, whilst a very good explanation as to why rapid warming would hit a “hard stop” (so that the interglacials were always at roughly the same temperature), required a massive change in the greenhouse effect from H2O.

The changing atmospheric pressure hypothesis

When I originally introduced the idea that atmospheric pressure might have caused the ice-age cycle, I used the Nikolov-Zeller equation and calculated that the required pressure change was around 30%, based on the following graph (note normal day-to-day changes of pressure are around 10% – so this is large):
However, as no one else was even considering long-term changes in the atmosphere, I was very cautious and only offered it as a suggestion. But after reflection, and now that I understand the full implication of the (real) atmosphere greenhouse model – that any change in temperature (over geological time periods) is likely to also mean a change in pressure – I should not just speculate but formally include a change in atmospheric pressure in my ideas.
Now, the ice-ages would be periods of lower general pressure, and the inter-glacials periods of higher overall pressure. However, because pressure works together with other effects, the total pressure change required to cause the ice-age temperature change may not be anywhere near the 30% originally needed. Because:

  1. We now have the significant change in atmospheric properties which would be needed to change the Hadley cell structure.
  2. Lower atmospheric pressure in itself would tend to dry the world (the water vapour pressure is constant – so lower atmospheric pressure allows more evaporation).
  3. Higher pressure would also dramatically add to the CO2 and H2O greenhouse properties. So not only would the release of CO2 increase the effective emissivity (ε in the above equation), but by increasing the volume and pressure of the atmosphere, f(P) would IN ADDITION increase.
  4. Increasing atmospheric pressure + increased CO2 would both increase plant growth, and that in turn would increase transpiration and the release of H2O into the atmosphere. And as H2O is a far more potent greenhouse gas than CO2, this effect is very significant.

Brief Explanation of how pressure affects Hadley cells

To explain the Hadley cell we must understand that winds on the surface of the rotating earth are subject to the Coriolis effect. This effect tends to cause air flows to move to the right in the northern hemisphere and to the left in the south. The scale of that effect is given by the Rossby number:

R = U / (fL)
Where U is speed, L is a length scale, and f = 2ω sin(φ) (related to the angular velocity of the earth and the latitude)


The lower the R value, the more the atmosphere is affected by the Coriolis effect. Or to put it another way, for the same R value (and same latitude & day length):

L ∝ U

Thus higher wind speeds tend toward larger Hadley cells. And with a reduction in atmospheric pressure, not only is there less air resistance (zero in a vacuum), but there is less thermal mass to move heat through the atmosphere – so for the same heat flow, wind speed must be higher. As a result, reducing pressure tends to increase average wind speeds and therefore tends to push toward larger Hadley cells. And if we are on the edge between a 3/6 Hadley cell structure and 1/2 as I suggest, then even a modest change in the Rossby number might trigger a flip. (Note there can only be an odd number of Hadley cells in each hemisphere, due to warming and so rising air at the equator – and cooling at the poles.)
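The Rossby number argument can be made concrete with the formula above; the wind speeds, length scales and latitude here are illustrative values of mine, not measurements:

```python
import math

OMEGA = 7.292e-5  # angular velocity of the earth, rad/s

def rossby(U, L, lat_deg):
    """Rossby number R = U / (f L), with f = 2 * omega * sin(latitude)."""
    f = 2 * OMEGA * math.sin(math.radians(lat_deg))
    return U / (f * L)

# A mid-latitude flow: 10 m/s wind over a 1000 km cell at 45N
print(round(rossby(10.0, 1.0e6, 45.0), 3))  # ~0.097: rotation-dominated

# For the same R (and same latitude), doubling the wind speed U requires
# doubling the length scale L - faster flow, larger cells
print(round(rossby(20.0, 2.0e6, 45.0), 3))  # same ~0.097
```

The second call illustrates the L ∝ U relationship: scaling U and L together leaves R unchanged, which is why higher wind speeds push toward larger cells.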

Remaining issues

A few issues remain. There is no evidence for atmospheric pressure change – largely, I feel, because no one has even considered it plausible that we might be seeing such changes. (When I searched there were no papers on the subject.)
Also, CO2 tends to lag the temperature rise in the ice-cores. A possible explanation is that the initial volcanic release of gases is of non-CO2 gases, and/or plants very quickly absorb CO2, keeping the level low – so that pressure rises whilst CO2 levels remain constant – and perhaps only when enough CO2 has been absorbed do processes of decomposition start to take off and release it back into the environment.

Posted in Advanced Greenhouse Theory, Caterpillar, Climate, Ice age, My Best Articles | 4 Comments

Those approving wind lunacy to face personal liability

I’m sure that sooner or later the legal profession is going to realise that, with some $trillion in value being made from this scam, there are an awful lot of ways they can make a huge amount of money by arguing for the “rights” and “compensations” deserved by all the many groups and companies involved.
And because it was just one enormous “snouts in the trough” rush to make money, not many cared or even knew about the massive personal liability that they have taken on. There was not careful consideration of the risks they took on – most had no idea of their own personal liability.
For example, Phil Jones – who was ruled to have broken the law in Climategate – has been personally vouching that we can all rely on him (the bloke who can’t use Excel) and that, based on his many years of perfect forecasting (none), we should all spend billions on bird killers & other environmental destroyers – when even a casual look at the figures shows the money would be far, far better spent on hospitals or other facilities EVEN IF EVERYTHING HE WAS RESPONSIBLE FOR WAS RIGHT (which it is not – kids won’t know what snow is …).
So, the last and perhaps the most ugly and vicious phase of this scam was always likely to involve the lawyers (the copy-n-paste media no longer having the balls to do investigative reporting).
And now it seems that dam has burst:

Germany: City Council Members Approving Wind Parks May Face Personal Liability For Damage To Health!

The legal winds are shifting! Many city councils and wind park planners are going to have to clean up their acts when pushing their pet wind park projects.

Germany’s Fundamental Law specifically expresses that the State is obligated to protect the life and physical body of the individual, foremost from illegal attacks by others.

Consequently, according to German legal experts Prof. Michael Elicker and Andreas Langenbahn here, city councils approving the installation of wind parks may be held personally liable to damage to health of persons who live close to them.
– See more at: http://notrickszone.com/2016/02/08/germany-city-council-members-approving-wind-parks-may-face-personal-liability-for-damage-to-health/#sthash.gDcwHFbs.dpuf

The significance is not this individual case – it is that it marks a change in the attitude of the legal profession. And those are the ones the scamsters really need to fear!
How to put this … let me put it this way: every engineer knows it is relatively easy to defend your calculations when the proverbial bridge is standing. What is difficult is defending your calculations when the bridge has not only fallen down, but someone has died, and some corrupt material supplier is now hiring a very expensive lawyer to pin the blame on you – and all you have left in your defence is hastily prepared calculations written a decade ago, before you even knew it was going to be built.
Likewise, it was all too easy for Phil Jones to claim – fooled by a bit of bad data – that the world was falling apart, when he could still count on the support of every other academic to back him in saying it. What, however, isn’t going to look quite so compelling in court is defending statements that the world is going to fall apart and that a lot of people MUST put a LOT OF MONEY into projects based on his “expertise” … when the world clearly did not fall apart, when his predictions look totally stupid if not corrupt, when he has refused to come clean, when he personally has made a lot of money off the scam … it doesn’t take a very good lawyer to spin that into a very nasty, perhaps even politically inspired, fraud.
This is why I keep saying academics are like lambs being taken to the slaughter – and it is why most sceptics are engineers. Because no engineer ever says “the bridge will stand up” (or to put it in climatic terms – “it must fall down”). Instead they will only say: “based on our calculations, assuming the materials are as specified, our best judgement is …”. Caution, caution – data-defendable caution.
And what do academics engage in? … crying wolf, time and time and time again, never stopping, not even after the evidence clearly contradicts them.
And likewise – just as an engineer needs to be able to defend themselves when the bridge falls down – to put that in a legal context: the law on liability for pushing the “global warming scandal” will not be made when everything looks rosy in Phil Jones’ garden. Instead it will be made when the public mood has completely changed, when the evidence is pretty obviously contradicting Phil Jones … at a time, no doubt, when the public are baying for blood – baying for someone to pay back the billions of wasted money.
And for engineers – who are always the first people to be blamed when something goes wrong – their culture impels them to caution as a defence against the inevitable lawsuits when (usually through no fault of the engineer) things do go wrong.
But academia is a culture which has no real concept of caution or quality – where it is believed that getting a buddy to give a nod and wink to a paper is all that is needed to make something “true” – where, in short, they think the worst thing that can possibly happen is to have a paper withdrawn and a bit of ribbing from their in-the-same-boat colleagues. So for these poor lambs, who have no involvement outside academia – who thought that being an academic made them somehow immune to prosecution – there will be a very unwelcome shock waiting around the corner in the form of those all too vicious lawyers who will make mincemeat out of them.

The importance of this latest article

OK, this latest article is starting at the edges – it’s changing the perception of liability. But it is a very important event because it shows:

  1. People are seriously beginning to think about suing
  2. That means the tide has turned and people are beginning to believe they can win a court battle for compensation.
  3. It can only get worse – because once one set of lawyers realise that there’s a bountiful harvest in this area of law – the idea will quickly spread.

Now the only questions left are these: how quickly will the lawyer-sharks smell blood? Will they spot the easy prey floundering in the sea, and at what point will they move in for the kill? And last but not least – exactly who will they attack? Because, as this case shows, it is not necessarily the obvious victims who are attacked first – nor in the obvious places.

Posted in Climate | 3 Comments

Long-Term Climate Change: What Is A Reasonable Sample Size?

In WUWT Tim Ball asks a simple question:

Long-Term Climate Change: What Is A Reasonable Sample Size?

And the answer is fairly simple.
For a reasonable degree of certainty (~75% – but see the end), one needs around 10x as much data as the length of the period over which we are taking a trend – and all the data must be from one homogeneous source. So, e.g., in order to assess whether the last century was abnormal, we need around a millennium of data. In order to assess whether the 1970–2000 warming was abnormal, we must compare the 1970–2000 trend in CET with the last 300–350 years of CET. And the longest period in the raw instrumental dataset of ~160 years that can be assessed for abnormality would be around 16 years.
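The 10x rule can be sanity-checked with synthetic data. A rough sketch (the AR(1) “red noise” model and every parameter in it are my illustrative choices, not anybody’s climate model): generate a 300-year series and count how often a 30-year trend as large as the most recent one turns up anywhere in the record by pure chance:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, phi=0.6, sigma=0.1):
    """Synthetic 'climate-like' red noise (AR(1)); phi, sigma illustrative."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal(0, sigma)
    return x

def window_trends(series, window):
    """Least-squares slope over every overlapping window of the series."""
    t = np.arange(window)
    return np.array([np.polyfit(t, series[i:i + window], 1)[0]
                     for i in range(len(series) - window)])

# With 10x the data (300 yr of record vs a 30 yr trend) we can at least
# ask how often a trend of the 'observed' size arises by chance.
series = ar1_series(300)
slopes = window_trends(series, 30)
frac = np.mean(np.abs(slopes) >= abs(slopes[-1]))
print("fraction of 30-yr windows at least as steep as the latest:", frac)
```

With only 30 or 50 years of record there is simply nothing to count against – which is the whole point of the rule.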
So, if you hear anyone say “the pause is ‘abnormal'” – that might be supportable (but it is not). If, however, they say “the last 30 years is abnormal”, or even “the last 100 years”, they are either stupid, fraudulent or insane.
However, it all falls apart if we start comparing apples with cheese: tree rings for 1000 years with bogus, fraudulent surface data for 30 years. Because, for example, if we look at what kind of change is normal over the last 1000 years in the tree-ring data, then we must compare it with the same data for the last 100 years. But in contrast – the whole “hide the decline” scandal was that not only weren’t they comparing tree-ring data with tree rings, but they knew that the tree-ring data showed entirely the opposite trend from that which they were stating had been shown to be “abnormal”. With fraudulent behaviour there is no way to make their assertions credible by merely adding to the (bogus) data.
However, as 10x the data is problematic when we need quicker indications, I would suggest that we can get a “more likely than not” indication from 3x the period. But now it is critical that those doing the assessment come from the right background (which means tried-and-tested engineering, and not woolly-pc-panic-stricken-by-any-change academia).
So, e.g., with 160 years of data, we (engineers) can start saying with modest certainty that if the last 50 years showed warming that had not been seen before in the last 160 years, then something was odd. But, just to show how ridiculous that assertion would be: even using the bogus upjusted data, the 1970–2000 period shows the same warming as 1910–1940. So, there is no indication of any abnormality in the global temperature (despite the known upjusting – which in itself tells us just how normal the present period is, in that even fraudulent changes can’t change it enough to make it abnormal).

The rationale for long periods of data:

Until we know what is normal we cannot know what is abnormal

To take a simple example, we have two flight computers on a space craft – one says “full throttle”, the other “cease throttle”. How do we decide which is correct? The answer is that unless we have additional information there is no way to even guess which is correct.

If however we have three computers, two say “cease throttle” and one says “full throttle”, then, all other things being equal, if the chance of any one computer being wrong is p, the chance of two being wrong is p²

So, the chance of two computers being wrong, as opposed to just one, is p²/p = p. So as p < 1, then irrespective of the actual value of p, it is always more likely that the minority is the abnormal one.
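The arithmetic above can be written out in a couple of lines (a trivial sketch, just to make the ratio concrete):

```python
# If each computer is independently wrong with probability p, the two
# agreeing computers are both wrong with probability p**2, versus p for
# the lone dissenter.  The ratio p**2 / p = p is below 1 for any p < 1,
# so the minority verdict is always the one more likely to be wrong.
def odds_ratio(p):
    """P(both majority computers wrong) / P(single computer wrong)."""
    return (p ** 2) / p   # algebraically just p

for p in (0.5, 0.1, 0.01):
    print(p, odds_ratio(p))
```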

Likewise, at the very least, we need three centuries (or decades) of data to even start guessing which century (or decade) is “abnormal”.

So, why 10x the length of data? The reasons are many:

The rule of 10

In essence, this rule simply means we need an awful lot more data than “academic” statistics suggests we need – because the real world is full of real people who just don’t think in the way needed for “academic” statistics to be valid.
The biggest problem is that we usually start looking at data when something “odd” appears. (Or, to put it another way, we ignore data where nothing odd appears.) And, by pure chance, if we continue to monitor data for long enough, or from enough different sources, sooner or later, by pure fluke, we will see an odd “event”.
As such, when we start assessing the risk of something like “climate change” we are not just picking data at random. Instead we have already “cherry picked” a period which appears odd. So, by pure probability, if we monitored 100 metrics, one of those 100 should show a signal that only occurs 1/100 of the time. In that case, we would need 100x the data before that 1-in-100 signal would fall within a sample where it was likely to occur. (But even then another such event should have occurred, so there is twice the probability of this event than would occur by pure chance – just because we only focussed on something that appeared to be a problem!)
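A quick Monte Carlo sketch of the cherry-picker’s advantage (the 1-in-100 probability and the metric counts are illustrative): scan n independent metrics and ask how often at least one of them shows a “rare” event.

```python
import random

random.seed(1)  # illustrative seed, for repeatability

def chance_of_a_scare(n_metrics, p=0.01, trials=20000):
    """Monte Carlo estimate of the chance that at least one of
    n independent metrics shows a 'rare' (probability p) event."""
    hits = sum(
        any(random.random() < p for _ in range(n_metrics))
        for _ in range(trials)
    )
    return hits / trials

print("  1 metric :", chance_of_a_scare(1))    # ~1%
print(" 10 metrics:", chance_of_a_scare(10))   # ~10%
print("100 metrics:", chance_of_a_scare(100))  # ~63%, i.e. 1 - 0.99**100
```

In other words: watch enough metrics and the “1-in-100 event” is not merely likely – it is close to guaranteed.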

But the untrained human factor gets worse

But, even with simple data, there are so many ways to take the same metric and suggest “abnormality”. Taking temperature, we can for example look for “hottest” and “coldest” (2), “fastest warming” and “fastest cooling” (2), then we have the possibilities of turning points (2) and cycles (>2)** – all of which can be construed as “odd”. So, even with simple data, there are around 10 different ways to see something “odd”.
So, quite contrary to what the statistics supposedly suggest, it is actually “normal” to see a 1-in-100-year event in a decade of temperature data. It is also normal to see a 1-in-1000-year event in a century of data. So, if you are just looking for something “odd” in around 10 different metrics, the chances are you will see a 1-in-a-millennium “event” every decade!!

The human factor

So, if we are intent on finding something “odd” in even one dataset, the chances are quite high – and that is why we need long time series. If however we have a host of datasets (floods, droughts, snow, temperature, rain, hurricanes, peak-rain, peak-wind, etc. etc.), then if we are allowed to cherry pick as academics have done, we are guaranteed to find something abnormal.
This is why one needs to be properly trained in engineering practices to do risk assessments. Because the biggest quality failing of a risk assessment is the idiot doing the risk assessment – particularly if they don’t come from a culture used to doing risk assessments and living with the result of either overstating or understating the risk.
So, even with the best of intentions, and even with 10x the data of the period being assessed, the best we might say is that there is “more chance than not” of some data being abnormal – if (as has happened) you have politically motivated groups free to scour the data and, worse, free to channel resources, with the intention of finding “something wrong”.
If however you have people trained in risk assessment from a suitable culture, who know the temptation to cherry pick data and have the training, experience AND culture to resist it, then the certainty with 10x data can rise as high as perhaps 90% confidence. (Note the idiots at the IPCC have stated 95% confidence about a period equivalent to the length of their whole dataset – so whilst they have no idea & no data to say what is normal, they are 95% sure that what they have is abnormal.)

**To explain: if we accept that a turning point is simply a variant of a sine wave (with period twice the sample length), then a trend is a variant of a cosine wave, and a simple cycle (up-down-up-down) is just twice the frequency. However, if the probability of each is half as high (there being twice as much “info”), then if we sum the total series, the probability of the total is around 1. However, because we are looking for things “happening” … we can often accept cycles that only appear later in the data. So, there is quite a high chance of seeing something “odd” in the form of an apparent cycle.

Posted in Climate | Comments Off on Long-Term Climate Change: What Is A Reasonable Sample Size?

Review of possible mechanisms for long-term climate change

Two apparently contradictory headlines caught my attention today:

To which my response was “of course trees cause climate change” – that is to say real climate change (not what the copy-n-paste journalists call “climate change”).
So, I’m going to take the chance to summarise my current thinking about the causes of global climate change. So, I’ll start with my own response to the above article:

“of course trees cause climate change”

Continue reading

Posted in Climate | 6 Comments

Oh dear! The "science" is settled – so 350 jobs go.

When I joined industry, one of the first things I was told was “always look busy” – because any department that didn’t look busy was bound to be the target of any redundancy.
Obviously climate researchers in Australia never got that advice – because for years they’ve been saying “the science is settled” – which in short means they completely understand the climate and there is no need for the public to pay them to do further research.
We sceptics have frequently been heard to say “if the science is settled why can’t we just get rid of these annoying researchers” … and so it seems that Malcolm Turnbull, the Australian PM, has had just the same thought. Now 350 jobs are going, with nearly 70% of some departments being booted out.
… and how can they argue? After all, they’ve been saying (or condoning the statement) that the science is settled … which either means they are lying (and should go) or are really no longer of use (and so should go).
I doubt very many academics will ever again support the view that “the science is settled”. Indeed, I imagine most hearing this news will realise that what happens in Australia could easily happen anywhere else. And so unless they find some uncertainty left to investigate, they too will be out of a job.
My thoughts:

  • We actually need better research on the climate – and if we could have good research and researchers (not eco-idiots pushing their eco-political views dressed up as numbers) then I would argue for more spending.
  • You’ve got to laugh – I think the phrase is “hoisted by their own petard”.
  • There seems to be a bit of a trend to “downsize” climate-related work. There’s also growing scepticism. Not sure how much of this is because climate is an easy target – full of idiots who can be easily cut – and how much is because of the growing scepticism, or indeed a combination of easy target and not much interest.
  • Supposedly more effort is being put into “mitigation”. My experience is that the hotheads on climate are on the mitigation side (so no hard science, and full of left-leaning eco-fascists who want to dictate to the rest of society: “ZEE MUSTE do vat I say”). I suspect by “mitigation” what is really meant is “giving engineers money to build flood defences and dams”.
Posted in Climate | 8 Comments

The chances of there not being a climate scare right now!

After my last post, musing that I cannot hope to ever discern the human influence of CO2 on the climate, I started thinking about the chances of such stupid scares occurring by pure chance – because the climate always has such trends. My thoughts went thus:
In any civilisation that reaches the stage whereby it starts acquiring global data so that it can measure climate globally, there will of necessity be an enormous number of changes, any of which could (by the unscrupulous) be tied to climate – particularly soot, aerosols and chemicals from industry.
So, whereas the modern world is a lot, lot cleaner than a few generations ago, the environmental “consciousness” of a world-wide perspective means that groups would use these changes to frighten the public. So, it is almost certain that groups – like academia and fiends of the earth – would have been ready to abuse science and jump on any change in climate.
Therefore, what is the probability of such change? Let us suppose the climate has three states: pause, warming, cooling (and remember there are many other naturally changing variables that can be cherry picked!!). At the present time we have around 30 years of satellite record & 50 years of CO2 – or around 3–5 decades of real “data”. Because methods change, it is relatively easy to change the more distant past to re-enforce ideas that things are “changing”. So, what is the chance of seeing “something” in 3–5 decades? Let’s use 3 decades:

  • Trend: full cooling or full warming = 2/27
  • Paused trend: warm/cool – pause – warm/cool again = 2/27
  • Pause going to trend: pause–pause–trend, or pause–trend–same trend = 4/27
  • Turning: warming, then warming/pause/cooling, then cooling (or the mirror image) = 6/27

So, total ways of getting “scary” change = 14/27
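For anyone wanting to check the 14/27 by brute force, here is one reading of the four categories (the exact sequence sets are my interpretation of the wording above, so treat it as a sketch):

```python
from itertools import product

# P = pause, W = warming, C = cooling; 3**3 = 27 possible 3-decade runs.
scary = set()
for s in product("PWC", repeat=3):
    seq = "".join(s)
    if seq in ("WWW", "CCC"):                  # full trend
        scary.add(seq)
    elif seq in ("WPW", "CPC"):                # paused trend
        scary.add(seq)
    elif seq in ("PPW", "PPC", "PWW", "PCC"):  # pause going to trend
        scary.add(seq)
    elif "P" not in (seq[0], seq[2]) and seq[0] != seq[2]:
        scary.add(seq)                         # turning point
print(len(scary), "of 27 look 'scary'")
```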
As “scares” always use the last few decades – and as it seems to me only three decades would be enough to convince most people that the “experts” had predicted “the” trend (turning point) – it seems that the best estimate of the probability of a climate scare, in the first few periods of measurement, is around 50%.
On average, once this scare becomes engrained, the chance of it continuing each decade is around 66% (because, as we have seen, the group-think and associated bandwagon effects divert funding to the alarmists and diminish the authority and funding of sceptics). So, on average, the probability of such a scare ending in each decade after it starts (with the fraction still running in brackets) is:

  1. One decade 33% (leaving 67%)
  2. Two decades 22% (leaving 45%)
  3. Three decades 15% (leaving 30%)
  4. Four decades 10% (leaving 20%)

So, at a rough guess, 50% of societies inventing worldwide measurements will trigger a scare in any one measurement (but there are many such measurements, so perhaps the truer figure is closer to 100%). And, in general, it is likely that such scares/scams will last around two decades.
In other words, given that the global warming scam is now well on the way out, we are perhaps quite typical – no better, but no worse, than would be expected (given the stupidity of academics and others who falsely believe they somehow stop these scares because of their “expertise”).

Posted in Climate | 2 Comments

The chances of getting Climate extremists' predicted warming

I would just like to share a thought with you. Having calculated that there should have been around 0.6C of warming during the pause, I was just musing what the chance of getting this warming by coincidence would be. And I suddenly thought: “it warms or cools by about 0.1C per decade by chance … so what is the chance of six warming decades in a row, to get 0.6C?” It’s 1/2^6 = 1/64.
But this is the humorous bit – I then realised that in those six decades the target would have risen to 1.8C. So, then I thought: “what if I use the higher end of warming we’ve seen, of 0.2C?” … Then I thought: “what is the maximum multi-decadal warming we’ve seen?” It is of course from 1690–1730 in CET, when we saw about 0.5C/decade. So we’ve seen (at least) 4 decades of 0.5C warming over 35 decades, and the chance increases to 4/35 ≈ 11% (however, we might expect regional variation to be greater).
So, I think we can safely say the upper chance of the actual temperature (not the fraudulent surface data) showing warming of >0.5C per decade is 10%. As warming is as likely as cooling (give or take a few ice ages and CO2), the 50% probability is at 0C/decade. And as most years (from 1850 – in what I now know is fraudulent surface data) show around 0.1–0.2C warming or cooling, one standard deviation must be around 0.1–0.2C/decade.
The minimum warming the extremists need is 0.3C (to account for two decades of pause) – but if that occurs over a decade, they will need around 0.45C (as it should have warmed even more in the meantime). So they actually need >0.45C warming over the next decade.
At a rough guess, based on my figures, I would suggest around a 20% probability of the 0.3C minimum warming, and nearer 10% for >0.45C, occurring by pure fluke (with no CO2 involvement).
So, my best estimate is that, by 2025 – there’s only a 10-20% chance that mother nature will give the climate criminals what they need to defend their claptrap.
As for CO2 warming
My best estimate is 0.5C warming for a doubling of CO2 – which will take many decades/centuries to come through. CET shows (suggests?) a 1-in-10 chance of 2C warming by pure chance. So, even if the REAL climate showed 2C warming in a very specific period of 4 decades, the best we can say is that we are 90% confident that this short-term rise of 2C is not natural. But if we extend the period long enough it will always happen by chance – so the confidence falls dramatically over longer periods.
But even the massively adjusted data (where all warming since 1940 is due to adjustments) only shows 0.8C warming – and not over a short period of a few decades, but over the whole century (till the pause). That is the century of a massive change in measurement, from manual to automated. Even if we look at the most “obvious” trend, in the period 1970–2000 (also the period of the heaviest adjustments), it only shows 0.48C warming (which happens to be the same as 1910–1940!). As I stated above, based on CET we would expect around 10% of decades to show this warming. So in 150 years we expect around 1–2 decades of 0.5C warming!!!
Even if we now saw two decades of 0.5C warming or cooling, it would just be “normal”!! (Although I’d have to recheck the scale of variability between CET and global – all the more difficult when the surface data is fraudulent!)
So, to summarise – I cannot think of any test, when we’ve already seen change, which could demonstrate that the minuscule level of CO2 warming “expected” was man-made. There would need to be more than 1C warming in a few decades (about 2C total warming) to say, even with modest certainty, that the signal is human-caused and not natural. In contrast, rather than warming, we’ve seen “the Pause”. They needed fireworks – they got a cold shower!
Because there is a slight reduction in long-term variability, we probably need around 100–1000 years of future data before the scale of natural variability reduces sufficiently to “prove” that any change of 2C was “human causation”.
In other words – unless I am completely wrong and the climate is subject to massive positive feedbacks for warming (so >1C of warming – in which case I’m probably dreaming this whole world, as it would have burned up yonks ago) – in any real world (and not fraudulent data), there’s no chance of me ever knowing whether CO2 has had an effect within my lifetime.

Posted in Climate | 1 Comment

Crunch Year for global warming (= the scam)

2016 will undoubtedly be the crunch year for “global warming” – undoubtedly the biggest scam in the whole of history, pushed relentlessly by academia and the public sector against the interests of the western economies.
The satellites are the only credible measure of global temperature (the surface data being nothing short of fraudulent, with so many different warming adjustments that only a liar or an idiot would use it). And up until the start of the peak of the El Nino induced warming, the satellites continued to show no warming for 18 years. That is 0.6C of warming PREDICTED by the climate extremists of the IPCC that did not happen.
Not only do the eco-extremists need this El Nino warming – they also need the temperature to continue upwards after the El Nino. It is already teetering on the brink – they are desperate to stop us saying 18 (soon 19, 20) years without warming. If for a few months they silence us, the disappointment when it returns to cooling will surely be the end.
In contrast, the sane, sensible people of this world know there is a correlation between solar activity and temperature, and that there might be a 60-year cycle peaking in 2010, thus explaining the small real warming from 1970–2000 (as opposed to the fabricated warming added to the surface data). And as a consequence it is more likely than not (because what we know about the climate is much less than what goes on) that we will see cooling, not only straight after the El Nino peak, but for the next few decades.
So … here are the scenarios
1. Warming continues after El Nino (very unlikely)
This is the only scenario permitted by the Global Warming religion. However, because sceptics know we don’t know much about the climate, we can’t dismiss further natural warming as a possibility – albeit less likely than cooling.
2. Ambiguous “pause” (trend <0.05C/decade)
If the temperatures return to the pre-El Nino level and continue the “pause” – albeit with a small warming trend – then the climate extremists will be shattered, their hopes dashed, as the pause lengthens until it is soon the same length as the entire 1970–2000 “unprecedented” warming (the same warming occurred from 1910–1940, and there was far more warming in the Central England record from 1690–1730). However, the extremists – like all extremists – will never admit defeat, and the anti-science eco-journos (like the BBC) will continue to pour their evil into the ears of the populace. Yes, the science will have disproven the global warming bullshit – but when you’ve got 1000x the funding of sceptics, and lying, cheating academics ready to churn out paper after paper … they don’t need science to maintain their belief.
3. The unambiguous pause (trend <0.0C/decade)
If however the temperatures return to a lower level than before the El Nino, then the game is up for these extremists. Cooling (of any kind) disproves their religion. OK – like all religions they will continue believing “the end is nigh” – but the sane, sensible majority will stop listening.
However – the surface data will be continually upjusted by the fraudsters to “prove” it is warming. So, for real scientists the pause will be unequivocal – but to the fraudsters, liars and gullibles who follow the religion, the fact we are not seeing warming (nor any of the other predicted trends, like extreme weather, floods and droughts) will just be an inconvenient truth – easily explained away with fraudulent data and denier papers.
4. Unambiguous cooling (observable cooling)
When someone can produce a graph that most people will accept shows cooling, then the game will be completely up – even for the fraudsters. Unfortunately, because we know so little about what causes climate change, that could happen in as little as one year or as many as 100 years (or more). All we know is that cooling will occur and make the stupid episode of “man-made” warming look ridiculous. So, we can say with certainty that “history will not be kind to the climate extremists”, but we cannot say whether it will be us, our children, our grandchildren (or more) who will be reading that history.
So, within this scenario are several sub-scenarios to do with timing:

4i – warming then cooling (within a generation)
4ii – warming then cooling (over more than a generation)
4iii – cooling (within a generation – then warming/cooling/warming etc.)
4iv – cooling (over a period greater than a generation)

I’ve not put a scale on these – because they will be judgemental. There is always warming and cooling in the climate – that has not changed – it’s just that every so often the climate variabilities add together to create an apparent trend over a decade or more. So, the key will be when the general populace accept that the warming trend has turned into cooling.
The best possible scenario for us sceptics would be for the globe to show a sustained cooling trend for the next couple of decades (although not necessarily the best option for humanity, as it will undoubtedly increase human misery and death). If that happens, then within my (expected) lifetime I will be celebrating the collapse of global warming and of the idiots who pushed it in academia, the BBC and, e.g., the “Royal” Society.
That in itself might trigger a broader social revolution!
Ambiguity rules
However, the real nature of natural variation as seen in the climate (I have just realised) is that ambiguity rules (so this is almost an addendum). Unlike “scientific” (white) noise, which has no trends, cycles etc., real (1/f) noise is full of apparent trends, cycles and steps. As such, perhaps I should start calling it “ambiguous noise”. That is to say, 1/f noise is noise that appears to have a signal – but does not. It appears to have trends – but does not. It appears to have “meaning” – but does not.

Scientific (white) noise vs. Ambiguous (1/f) noise
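For anyone wanting to reproduce that comparison themselves, here is a sketch (the spectral-shaping construction of 1/f noise is one common recipe among several; the length and seed are illustrative) that generates both kinds of noise and fits a straight line to each – the 1/f series will typically show a far larger apparent “trend”:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1024

# White noise: flat spectrum, no memory, no real or apparent trends.
white = rng.standard_normal(N)

# Pink (1/f) noise via spectral shaping: scale each Fourier component
# by 1/sqrt(f) so power falls off as 1/f.
spectrum = np.fft.rfft(rng.standard_normal(N))
freqs = np.fft.rfftfreq(N)
freqs[0] = freqs[1]                # avoid divide-by-zero at DC
pink = np.fft.irfft(spectrum / np.sqrt(freqs), n=N)

def best_fit_slope(x):
    """Least-squares slope of a straight line through the series."""
    return np.polyfit(np.arange(len(x)), x, 1)[0]

# Normalise both to unit variance, then compare apparent linear 'trends'.
white = white / white.std()
pink = pink / pink.std()
print("white-noise slope:", best_fit_slope(white))
print("1/f-noise slope:  ", best_fit_slope(pink))
```

Neither series contains any signal at all – which is exactly the point: 1/f noise merely *looks* like it does.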

Posted in 1/f, Climate, science | Comments Off on Crunch Year for global warming (= the scam)