Blatant Fraud

I am sick to death of seeing blatant fraud in the global temperature figures, as constantly revealed by Real Science. What is more, in my experience, when an organisation is so blatant in its dishonesty, there will be far, far more to discover.
[Figure: GISS Figure A2 changes, March to September 2015]
The only consolation is that no one except the extremely gullible is now fooled by these graphs.

This entry was posted in Climate.

3 Responses to Blatant Fraud

  1. In other fields, where certain evidence is pivotal to a particular case, establishing quality-control standards, questioning the strength of that evidence, looking at alternative sources of evidence and seeking independent evaluation are the norm. For instance:
    1. Evidence presented in a court of law.
    2. Testing of pharmaceutical drugs to make sure they work, and evaluating adverse side effects.
    3. Analysis of mineral prospecting data, to make sure small samples are representative of the true state of affairs.
    In all of these areas different interpretations of the data are possible. Further, people can be biased towards certain interpretations more than others:
    – A police detective wanting to put away a terrorist or suspected rapist.
    – A researcher exploring a new drug that could save thousands of lives.
    – A climatologist wanting to save the world from an impending global warming cataclysm.
    But in climatology there is no recognition of these very human biases. In fact it is accepted as an a priori truth that a large amount of warming is in the pipeline.
    Leaving aside any deliberate shenanigans, the raw surface temperature data is of variable quality, is geographically of variable density, and has unknown measurement biases. There is also the issue that temperature trends across the Earth’s surface don’t just vary in magnitude (e.g. parts of the Arctic have many times the warming trend of areas near the Equator), but in time and even in direction. Further, there are local variations due to microclimates as well. So how to evaluate anomalous data? NASA GISS uses pair-wise comparisons to homogenise the data. The permutations are phenomenal, and the comparisons go through a number of iterations, requiring phenomenal computing power. With such poor and variable data, a small change in the data can produce quite large variations in the results. This means manual adjustments are required to eliminate outliers. People with a preconceived view of the world (which may be largely true) will view random outliers differently according to how they conform to that world view, as opposed to how they conform to the data.
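    The pair-wise screening idea described above can be sketched in miniature. This is a hypothetical illustration, not the actual GISTEMP algorithm: the station series, the neighbour-median comparison and the threshold are all invented for the example, and real homogenisation involves far more elaborate breakpoint detection across thousands of stations.

    ```python
    # Toy illustration of pair-wise outlier screening in temperature
    # homogenisation. NOT the real GISTEMP procedure -- all values
    # below are invented for the example.
    from statistics import median

    # Hypothetical annual temperature anomalies (degC) for one target
    # station and three nearby reference stations.
    target = [0.1, 0.2, 0.15, 1.9, 0.25]          # year 4 looks anomalous
    neighbours = [
        [0.12, 0.18, 0.14, 0.20, 0.24],
        [0.08, 0.22, 0.17, 0.19, 0.26],
        [0.11, 0.19, 0.13, 0.22, 0.23],
    ]

    def flag_outliers(target, neighbours, threshold=1.0):
        """Flag years where the target departs from the neighbour
        median by more than `threshold` degC."""
        flags = []
        for year, value in enumerate(target):
            ref = median(series[year] for series in neighbours)
            if abs(value - ref) > threshold:
                flags.append(year)
        return flags

    print(flag_outliers(target, neighbours))  # [3]
    ```

    Note that the choice of threshold is exactly the kind of judgement call the comment is pointing at: a tighter threshold discards more genuine local variation, a looser one lets more bad readings through.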

  2. I laid out my views on temperature homogenisation a few months ago.
    http://manicbeancounter.com/2015/06/27/defining-temperature-homogenisation/
    However, I did find something that no amount of bias from a competent data analyst can account for in the GISTEMP data set. As well as providing global temperature data, GISTEMP provides data for eight zones. The most southerly is for 64S-90S, covering 5% of the surface area of the globe and being slightly greater in area than the Antarctic, which begins at 66°34′S. The problem is with temperature data prior to the mid-1950s, when the first bases were established. It would appear that for the early twentieth century GISTEMP used Base Orcadas, located at 60.8 S 44.7 W, as a proxy for the region. This station had massive variability and general cooling in the early twentieth century, which neatly counters the massive warming in the Arctic.
    As an indicator of how far north 60.8 S lies, consider that Helsinki lies at 60.1N, Lerwick at 60.2N and Reykjavík at 64.1N. All of them would be considered inappropriate temperature-trend proxies for the Arctic Ocean.
    http://manicbeancounter.com/2015/05/24/base-orcadas-as-a-proxy-for-early-twentieth-century-antarctic-temperature-trends/

  3. TinyCO2 says:

    You do wonder why there has been no move towards turning climate science into something more professional. Nobody but sceptics is even suggesting it.

Comments are closed.