Saturday, January 23, 2010

The CRU Was Not Alone In Manipulating Data

In earlier posts we looked at and compared the various temperature data sets, and I noted a couple of things. Firstly, when we compared the two satellite-based data sets, they tracked each other closely, with very little divergence.



Yet when we compare the UAH satellite-based data set with the CRU-linked HadCRUT series, we immediately notice a divergence between the two, with the HadCRUT series showing more warming. This is because HadCRUT "adjust" their raw data.



The following article by Marc Sheppard of the American Thinker, entitled Climategate: CRU Was But The Tip Of The Iceberg, looks in detail at the data sets created by the National Climatic Data Center (which is part of NOAA) and the NASA GISS data set GISTEMP, which is administered by the grandfather of global warming, James Hansen. As can be seen in the following comparisons between the UAH satellite data set and firstly NCDC and secondly GISTEMP, there is considerable divergence between them because of the adjustments these organisations make to their raw data. More detail on these graphs can be obtained here.
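To make the idea of divergence concrete, the simplest measure is the difference between the least-squares trends of two anomaly series over the same period. Here is a minimal Python sketch of that comparison; the anomaly values are hypothetical placeholders, not the actual UAH, NCDC or GISTEMP numbers.

    # Quantify divergence between two temperature-anomaly series by comparing
    # their least-squares trends. Values are hypothetical placeholders.

    def trend_per_decade(series):
        # Least-squares slope of an annual anomaly series, degrees per decade.
        n = len(series)
        x_mean = (n - 1) / 2
        y_mean = sum(series) / n
        slope = sum((i - x_mean) * (y - y_mean) for i, y in enumerate(series)) \
                / sum((i - x_mean) ** 2 for i in range(n))
        return slope * 10  # per year -> per decade

    satellite = [-0.10, 0.00, 0.05, 0.00, 0.15, 0.10, 0.20]  # deg C anomalies
    surface = [-0.10, 0.05, 0.10, 0.15, 0.25, 0.30, 0.40]

    print(f"satellite:  {trend_per_decade(satellite):+.2f} C/decade")
    print(f"surface:    {trend_per_decade(surface):+.2f} C/decade")
    print(f"divergence: {trend_per_decade(surface) - trend_per_decade(satellite):+.2f} C/decade")

Two series can agree closely on their year-to-year wiggles and still diverge steadily in trend, which is the pattern being described here.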



Marc Sheppard explains:

Not surprisingly, the blatant corruption exposed at Britain’s premiere climate institute was not contained within the nation’s borders. Just months after the Climategate scandal broke, a new study has uncovered compelling evidence that our government’s principal climate centers have also been manipulating worldwide temperature data in order to fraudulently advance the global warming political agenda.

Not only does the preliminary report [PDF] indict a broader network of conspirators, but it also challenges the very mechanism by which global temperatures are measured, published, and historically ranked.

Last Thursday, Certified Consulting Meteorologist Joseph D’Aleo and computer expert E. Michael Smith appeared together on KUSI TV [Five Part Series Can Be Seen Via Here] to discuss the Climategate -- American Style scandal they had discovered. This time out, the alleged perpetrators are the National Oceanic and Atmospheric Administration (NOAA) and the NASA Goddard Institute for Space Studies (GISS).

NOAA stands accused by the two researchers of strategically deleting cherry-picked, cooler-reporting weather observation stations from the temperature data it provides the world through its National Climatic Data Center (NCDC). D’Aleo explained to show host and Weather Channel founder John Coleman that while the Hadley Center in the U.K. has been the subject of recent scrutiny, “[w]e think NOAA is complicit, if not the real ground zero for the issue.” And their primary accomplices are the scientists at GISS, who put the altered data through an even more biased regimen of alterations, including intentionally replacing the dropped NOAA readings with those of stations located in much warmer locales.

As you’ll soon see, the ultimate effects of these statistical transgressions on the reports which influence climate alarm and subsequently world energy policy are nothing short of staggering.

NOAA – Data In / Garbage Out

Although satellite temperature measurements have been available since 1978, most global temperature analyses still rely on data captured from land-based thermometers, scattered more or less about the planet. It is that data which NOAA receives and disseminates – although not before performing some sleight-of-hand on it.

Smith has done much of the heavy lifting involved in analyzing the NOAA/GISS data and software, and he chronicles his often frustrating experiences at his fascinating website. There, detail-seekers will find plenty to satisfy, divided into easily-navigated sections -- some designed specifically for us “geeks,” but most readily approachable to readers of all technical strata.

Perhaps the key point discovered by Smith was that by 1990, NOAA had deleted from its datasets all but 1,500 of the 6,000 thermometers in service around the globe.

Now, 75% represents quite a drop in sampling population, particularly considering that these stations provide the readings used to compile both the Global Historical Climatology Network (GHCN) and United States Historical Climatology Network (USHCN) datasets. These are the same datasets, incidentally, which serve as primary sources of temperature data not only for climate researchers and universities worldwide, but also for the many international agencies using the data to create analytical temperature anomaly maps and charts.

Yet as disturbing as the number of dropped stations was, it is the nature of NOAA’s “selection bias” that Smith found infinitely more troubling.

It seems that stations placed in historically cooler, rural areas of higher latitude and elevation were scrapped from the data series in favor of more urban locales at lower latitudes and elevations. Consequently, post-1990 readings have been biased to the warm side not only by selective geographic location, but also by the anthropogenic heating influence of a phenomenon known as the Urban Heat Island Effect (UHI).

For example, Canada’s reporting stations dropped from 496 in 1989 to 44 in 1991, with the percentage of stations at lower elevations tripling while the numbers of those at higher elevations dropped to one. That’s right: As Smith wrote in his blog, they left one thermometer for everything north of LAT 65. And that one resides in a place called Eureka, which has been described as “The Garden Spot of the Arctic” due to its unusually moderate summers.

Smith also discovered that in California, only four stations remain – one in San Francisco and three in Southern L.A. near the beach – and he rightly observed that:

It is certainly impossible to compare it with the past record that had thermometers in the snowy mountains. So we can have no idea if California is warming or cooling by looking at the USHCN data set or the GHCN data set.

That’s because the baseline temperatures to which current readings are compared were a true averaging of both warmer and cooler locations. And comparing these historic true averages to contemporary false averages – which have had the lower end of their numbers intentionally stripped out – will always yield a warming trend, even when temperatures have actually dropped.
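The arithmetic behind that claim is easy to demonstrate. Below is a minimal Python sketch, using made-up station values rather than actual GHCN readings, of how keeping cool stations in the baseline while dropping them from the current average manufactures a warming anomaly out of an unchanged climate.

    # Minimal sketch of the selection-bias effect described above.
    # All station values are hypothetical, not actual GHCN readings.

    # Baseline period: the average includes a cool mountain station.
    baseline_stations = {"mountain": 5.0, "coast": 15.0}  # long-term means, deg C
    baseline = sum(baseline_stations.values()) / len(baseline_stations)  # 10.0

    # Current period: the mountain station has been dropped from the network,
    # even though (by assumption) the true climate has not changed at all.
    current_stations = {"coast": 15.0}
    current = sum(current_stations.values()) / len(current_stations)  # 15.0

    # The published "anomaly" compares the new average against the old baseline.
    print(f"apparent warming: {current - baseline:+.1f} C")  # +5.0 C, all spurious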

Overall, U.S. online stations have dropped from a peak of 1,850 in 1963 to a low of 136 as of 2007. In his blog, Smith wittily observed that “the Thermometer Langoliers have eaten 9/10 of the thermometers in the USA[,] including all the cold ones in California.” But he was deadly serious after comparing current to previous versions of USHCN data and discovering that this “selection bias” creates a +0.6°C warming in U.S. temperature history. And no wonder -- imagine the accuracy of campaign tracking polls were Gallup to include only the replies of Democrats in their statistics.

But it gets worse.


Prior to publication, NOAA effects a number of “adjustments” to the cherry-picked stations’ data, supposedly to eliminate flagrant outliers, adjust for time of day heat variance, and “homogenize” stations with their neighbors in order to compensate for discontinuities. This last one, they state, is accomplished by essentially adjusting each to jibe closely with the mean of its five closest “neighbors.” But given the plummeting number of stations, and the likely disregard for the latitude, elevation, or UHI of such neighbors, it’s no surprise that such “homogenizing” seems to always result in warmer readings.
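To see why warmer neighbours would drag a station's record upward, here is a toy Python sketch of the adjust-toward-the-neighbour-mean idea described above. It is an illustration only, with invented values, and not NCDC's actual homogenization algorithm.

    # Toy "homogenization": nudge a station toward the mean of its five
    # closest neighbours. Illustrative only -- not NCDC's actual algorithm.

    def homogenize(station_temp, neighbor_temps, weight=0.5):
        # weight=0 keeps the raw value; weight=1 replaces it with the
        # neighbour mean outright.
        neighbor_mean = sum(neighbor_temps) / len(neighbor_temps)
        return station_temp + weight * (neighbor_mean - station_temp)

    # A cool rural station whose surviving neighbours are all warmer,
    # lower-elevation sites (hypothetical values, deg C):
    raw = 8.0
    neighbors = [10.2, 11.0, 10.8, 11.5, 10.5]
    print(homogenize(raw, neighbors))  # 9.4 -- adjusted upward

If the surviving neighbours skew warm, every such adjustment lands on the warm side, which is the bias the article alleges.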

The chart below is from Willis Eschenbach’s WUWT essay, The smoking gun at Darwin Zero, and it plots GHCN Raw versus homogeneity-adjusted temperature data at Darwin International Airport in Australia. The “adjustments” actually reversed the 20th-century trend from temperatures falling at 0.7°C per century to temperatures rising at 1.2°C per century.
Eschenbach isolated a single station and found that it was adjusted to the positive by 6.0°C per century, with no apparent reason, as all five stations at the airport more or less aligned for each period. His conclusion was that he had uncovered “indisputable evidence that the ‘homogenized’ data has been changed to fit someone’s preconceptions about whether the earth is warming.”


WUWT’s editor, Anthony Watts, has calculated the overall U.S. homogeneity bias to be 0.5°F to the positive, which alone accounts for almost one half of the 1.2°F warming over the last century. Add Smith’s selection bias to the mix and – poof! – actual warming completely disappears!

Yet believe it or not, the manipulation does not stop there.

GISS – Garbage In / Globaloney Out

The scientists at NASA’s GISS are widely considered to be the world’s leading researchers into atmospheric and climate changes. And their Surface Temperature (GISTemp) analysis system is undoubtedly the premiere source for global surface temperature anomaly reports. In creating its widely disseminated maps and charts, the program merges station readings collected from the Scientific Committee on Antarctic Research (SCAR) with GHCN and USHCN data from NOAA.

It then puts the merged data through a few “adjustments” of its own.

First, it further “homogenizes” stations, supposedly adjusting for UHI by (according to NASA) changing “the long term trend of any non-rural station to match the long term trend of their rural neighbors, while retaining the short term monthly and annual variations.” Of course, the reduced number of stations will have the same effect on GISS’s UHI correction as it did on NOAA’s discontinuity homogenization – the creation of artificial warming.
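NASA's description amounts to swapping one long-term trend for another while keeping the year-to-year residuals. Here is a sketch of that idea in Python, with made-up annual means; GISTemp's actual adjustment (a two-legged fit) is more elaborate than this.

    # Sketch of trend-matching: keep a non-rural station's short-term wiggles
    # but impose its rural neighbours' long-term trend. Numbers are invented;
    # GISTemp's actual adjustment uses a more elaborate two-legged fit.

    def slope(series):
        # Least-squares slope of an annual series, degrees per year.
        n = len(series)
        x_mean = (n - 1) / 2
        y_mean = sum(series) / n
        return sum((i - x_mean) * (y - y_mean) for i, y in enumerate(series)) \
               / sum((i - x_mean) ** 2 for i in range(n))

    def match_trend(urban, rural):
        # Subtract the urban slope and add the rural slope, year by year,
        # leaving the short-term variations untouched.
        delta = slope(urban) - slope(rural)
        return [round(y - delta * i, 2) for i, y in enumerate(urban)]

    urban = [14.0, 14.3, 14.1, 14.6, 14.8]  # hypothetical annual means, deg C
    rural = [12.0, 12.1, 12.0, 12.2, 12.1]
    print(match_trend(urban, rural))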

Furthermore, in his communications with me, Smith cited boatloads of problems and errors he found in the Fortran code written to accomplish this task, ranging from hot airport stations being mismarked as “rural” to the “correction” having the wrong sign (+/-) and therefore increasing when it meant to decrease or vice-versa.

And according to NASA, “If no such neighbors exist or the overlap of the rural combination and the non-rural record is less than 20 years, the station is completely dropped; if the rural records are shorter, part of the non-rural record is dropped.”

However, Smith points out that a dropped record may be “from a location that has existed for 100 years.” For instance, if an aging piece of equipment gets swapped out, thereby changing its identification number, the time horizon reinitializes to zero years. Even having a large enough temporal gap (e.g., during a world war) might cause the data to “just get tossed out.” But the real chicanery begins in the next phase, wherein the planet is flattened and stretched onto an 8,000-box grid, into which the time series are converted to a series of anomalies (degree variances from the baseline). Now, you might wonder just how one manages to fill 8,000 boxes using 1,500 stations.

Here’s NASA’s solution:
For each grid box, the stations within that grid box and also any station within 1200km of the center of that box are combined using the reference station method.
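Taken at face value, that rule is straightforward to sketch. The Python below fills one grid box from any stations within 1200 km of its centre, with a station's weight tapering linearly to zero at 1200 km – the weighting usually attributed to GISTemp's reference station method. Treat the weighting detail, and all coordinates and anomalies, as illustrative assumptions.

    # Fill one grid box from stations within 1200 km of its centre, weight
    # tapering linearly to zero at 1200 km. Illustrative sketch only.

    from math import radians, sin, cos, asin, sqrt

    EARTH_RADIUS_KM = 6371.0
    CUTOFF_KM = 1200.0

    def haversine(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points, in km.
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

    def grid_box_anomaly(box_lat, box_lon, stations):
        # stations: list of (lat, lon, anomaly). Weighted mean, or None.
        num = den = 0.0
        for lat, lon, anom in stations:
            d = haversine(box_lat, box_lon, lat, lon)
            if d < CUTOFF_KM:
                w = 1.0 - d / CUTOFF_KM
                num += w * anom
                den += w
        return num / den if den else None

    # An "empty" ocean box within reach of a single hot airport station
    # (hypothetical coordinates and anomaly):
    print(grid_box_anomaly(25.0, -165.0, [(21.3, -157.9, 1.2)]))  # 1.2

With only one station in range, the box simply inherits that station's anomaly, however unrepresentative it is – which is exactly the Hawaii complaint that follows.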

Even on paper, the design flaws inherent in such a process should be glaringly obvious. So it’s no surprise that Smith found many examples of problems surfacing in actual practice. He offered me Hawaii for starters. It seems that all of the Aloha State’s surviving stations reside in major airports. Nonetheless, this unrepresentative hot data is what’s used to “infill” the surrounding “empty” Grid Boxes up to 1200 km out to sea. So in effect, you have “jet airport tarmacs ‘standing in’ for temperature over water 1200 km closer to the North Pole.”

An isolated problem? Hardly, reports Smith.

From KUSI’s Global Warming: The Other Side:
“There’s a wonderful baseline for Bolivia -- a very high mountainous country -- right up until 1990 when the data ends. And if you look on the [GISS] November 2009 anomaly map, you’ll see a very red rosy hot Bolivia [boxed in blue]. But how do you get a hot Bolivia when you haven’t measured the temperature for 20 years?”

Of course, you already know the answer: GISS simply fills in the missing numbers – originally cool, as Bolivia contains proportionately more land above 10,000 feet than any other country in the world – with hot ones available in neighboring stations on a beach in Peru or somewhere in the Amazon jungle.

Remember that single station north of 65° latitude which they located in a warm section of northern Canada? Joe D’Aleo explained its purpose: “To estimate temperatures in the Northwest Territory [boxed in green above], they either have to rely on that location or look further south.”

Pretty slick, huh?

And those are but a few examples. In fact, throughout the entire grid, cooler station data are dropped and “filled in” by temperatures extrapolated from warmer stations in a manner obviously designed to overestimate warming...

...And convince you that it’s your fault.

Government and Intergovernmental Agencies -- Globaloney In / Green Gospel Out

Smith attributes up to 3°F (more in some places) of added “warming trend” between NOAA’s data adjustment and GIStemp processing.

That’s over twice last century’s reported warming.

And yet, not only are NOAA’s bogus data accepted as green gospel, but so are its equally bogus hysterical claims, like this one from the 2006 annual State of the Climate in 2005 [PDF]: “Globally averaged mean annual air temperature in 2005 slightly exceeded the previous record heat of 1998, making 2005 the warmest year on record.”

And as D’Aleo points out in the preliminary report, the recent NOAA proclamation that June 2009 was the second-warmest June in 130 years will go down in the history books, despite multiple satellite assessments ranking it as the 15th-coldest in 31 years.

Even when our own National Weather Service (NWS) makes its frequent announcements that a certain month or year was the hottest ever, or that five of the warmest years on record occurred last decade, they’re basing such hyperbole entirely on NOAA’s warm-biased data. And how can anyone possibly read GISS chief James Hansen’s Sunday claim that 2009 was tied with 2007 for second-warmest year overall, and the Southern Hemisphere’s absolute warmest in 130 years of global instrumental temperature records, without laughing hysterically? It's especially laughable when one considers that NOAA had just released a statement claiming that very same year (2009) to be tied with 2006 for the fifth-warmest year on record.

So how do alarmists reconcile one government center reporting 2009 as tied for second while another had it tied for fifth? If you’re WaPo’s Andrew Freedman, you simply chalk it up to “different data analysis methods” before adjudicating both NASA and NOAA innocent of any impropriety based solely on their pointless assertions that they didn’t do it.

Earth to Andrew: “Different data analysis methods”? Try replacing “analysis” with “manipulation,” and ye shall find enlightenment. More importantly, does the obvious fact that the drastically divergent results of both “methods” can’t both be right – and that both are therefore immediately suspect – somehow elude you?

But by far the most significant impact of this data fraud is that it ultimately bubbles up to the pages of the climate alarmists’ bible: The United Nations Intergovernmental Panel on Climate Change Assessment Report.

And wrong data begets wrong reports, which particularly in this case begets dreadfully wrong policy.

It’s High Time We Investigated the Investigators

The final report will be made public shortly, and it will be available at the websites of both report-supporter Science and Public Policy Institute and Joe D’Aleo’s own ICECAP. As they’ve both been tremendously helpful over the past few days, I’ll trust in the opinions I’ve received from the report’s architects to sum up.

This from the meteorologist:
The biggest gaps and greatest uncertainties are in high latitude areas where the data centers say they 'find' the greatest warming (and thus which contribute the most to their global anomalies). Add to that no adjustment for urban growth and land use changes (even as the world's population increased from 1.5 to 6.7 billion people) [in the NOAA data] and questionable methodology for computing the historical record that very often cools off the early record and you have surface based data sets so seriously flawed, they can no longer be trusted for climate trend or model forecast assessment or decision making by the administration, congress or the EPA.

Roger Pielke Sr. has suggested: “...that we move forward with an inclusive assessment of the surface temperature record of CRU, GISS and NCDC. We need to focus on the science issues. This necessarily should involve all research investigators who are working on this topic, with formal assessments chaired and paneled by mutually agreed to climate scientists who do not have a vested interest in the outcome of the evaluations.” I endorse that suggestion.
Certainly, all rational thinkers agree. Perhaps even the mainstream media, most of whom have hitherto mistakenly dismissed Climategate as a uniquely British problem, will now wake up and demand such an investigation.

And this from the computer expert:
That the bias exists is not denied. That the data are too sparse and with too many holes over time is not denied. Temperature series programs, like NASA GISS GIStemp, try, but fail, to fix the holes and the bias. What is claimed is that "the anomaly will fix it." But it cannot. Comparison of a cold baseline set to a hot present set must create a biased anomaly. It is simply overwhelmed by the task of taking out that much bias. And yet there is more. A whole zoo of adjustments are made to the data. These might be valid in some cases, but the end result is to put in a warming trend of up to several degrees. We are supposed to panic over a 1/10 degree change of "anomaly" but accept 3 degrees of "adjustment" with no worries at all. To accept that GISTemp is "a perfect filter". That is, simply, "nuts". It was a good enough answer at Bastogne, and applies here too.
Smith, who had a family member attached to the 101st Airborne at the time, refers to the famous line from the 101st's commander, U.S. Army General Anthony Clement McAuliffe, who replied to a German ultimatum to surrender during the December 1944 Battle of Bastogne, Belgium, with a single word: “Nuts.”

And that’s exactly what we’d be were we to surrender our freedoms, our economic growth, and even our simplest comforts to duplicitous zealots before checking and double-checking the work of the prophets predicting our doom should we refuse.

Further analysis of the Bolivia and Arctic GISTemp effects can be seen here.

So, as we can see, the tendency amongst the warmist data sets to "adjust" their data is commonplace. The adjustments are invariably upward. That explains the graphs.


No Peer Review Required At IPCC

In a follow-up to my posts on Climategate and Glaciergate (for which the IPCC has issued a (sort of) apology) is this article by Ben Pile on Dr. Roger Pielke Jr's website, entitled More Laundered Literature. In it he examines yet another example of the IPCC citing non-peer-reviewed, non-scientific research from a single source and presenting it as settled consensus science. The topic in question this time relates to a claim made by Oxfam in a published report:

According to the IPCC, climate change could halve yields from rain-fed crops in parts of Africa as early as 2020, and put 50 million more people worldwide at risk of hunger. [Pg. 2]

Upon further investigation they found the following in the IPCC report:

In other [African] countries, additional risks that could be exacerbated by climate change include greater erosion, deficiencies in yields from rain-fed agriculture of up to 50% during the 2000-2020 period, and reductions in crop growth period (Agoumi, 2003). [IPCC WGII, Page 448. 9.4.4]

Oxfam cite the IPCC, but the citation belongs to Agoumi. The IPCC reference his study properly:

Agoumi, A., 2003: Vulnerability of North African countries to climatic changes: adaptation and implementation strategies for climatic change. Developing Perspectives on Climate Change: Issues and Analysis from Developing Countries and Countries with Economies in Transition. IISD/Climate Change Knowledge Network, 14 pp. (PDF).

There is only limited discussion of “deficiencies in yields from rain-fed agriculture” in that paper, and its focus is not ‘some’ African countries, but just three: Morocco, Tunisia, and Algeria. It is not climate research. It is a discussion about the possible effects of climate change. All that the report actually says in relation to the IPCC quote is that,
Studies on the future of vital agriculture in the region have shown the following risks, which are linked to climate change:
• greater erosion, leading to widespread soil degradation;
• deficient yields from rain-based agriculture of up to 50 per cent during the 2000–2020 period;
• reduced crop growth period;
Most interestingly, the study was not simply produced by some academic working in some academic department, for publication in some peer-reviewed journal. Instead, it was published by The International Institute for Sustainable Development (IISD). According to the report itself,
The International Institute for Sustainable Development contributes to sustainable development by advancing policy recommendations on international trade and investment, economic policy, climate change, measurement and indicators, and natural resource management. By using Internet communications, we report on international negotiations and broker knowledge gained through collaborative projects with global partners, resulting in more rigorous research, capacity building in developing countries and better dialogue between North and South.
Oxfam takes its authority from the IPCC. The IPCC report seemingly takes its authority from a bullet point in a paper published by an organisation with a declared political interest in the sustainability agenda that was the brainchild of former Canadian Prime Minister Brian Mulroney in 1988.

That the IPCC is citing non-peer-reviewed, non-scientific research from quasi-governmental, semi-independent sustainability advocacy organisations must say something about the dearth of scientific or empirical research. The paper in question barely provides any references for its own claims, yet by virtue of merely appearing in the IPCC’s 2007 AR4 report, a single study, put together by a single researcher, becomes “consensus science”.

The situation is simply insane. The IPCC are cited as producers of official science, yet they often appear to take as many liberties with the sources they cite, as those who cite the IPCC – such as Oxfam – go on to do. To ask questions about this process is to stand against ‘the consensus’, to be a ‘denier’, and to be willingly jeopardising the future of millions of people, and inviting the end of the world.

The popular view of the climate debate and politics is that the IPCC and scientists produce the science, which politicians and policymakers respond to, encouraged by NGOs, all reported on by journalists. But as the seemingly unfounded claims about the Himalayan glaciers and the North African water shortages show, this is a misconception. Science, the media, government, NGOs and supra-national political organisations do not exist as sharply distinct institutions. They are nebulous and porous. They merge, and each influences the interpretation and substance of the next iteration of their own product. The distinction between science and politics breaks down in the miasma.

If this process could be mapped, it would be no surprise if it was discovered that the IPCC was found to be citing itself through citing NGOs and Quasi-NGOs, and other non-peer-reviewed, not scientific literature. This is the real climate feedback mechanism.

In addition to this is the latest revelation of IPCC misinformation, via the U.K.'s TimesOnline website, in an article entitled: UN wrongly linked global warming to natural disasters. It states:

The United Nations climate science panel faces new controversy for wrongly linking global warming to an increase in the number and severity of natural disasters such as hurricanes and floods.

It based the claims on an unpublished report that had not been subjected to routine scientific scrutiny — and ignored warnings from scientific advisers that the evidence supporting the link was too weak. The report's own authors later withdrew the claim because they felt the evidence was not strong enough.

The claim by the Intergovernmental Panel on Climate Change (IPCC), that global warming is already affecting the severity and frequency of global disasters, has since become embedded in political and public debate. It was central to discussions at last month's Copenhagen climate summit, including a demand by developing countries for compensation of $100 billion (£62 billion) from the rich nations blamed for creating the most emissions.

...The new controversy also goes back to the IPCC's 2007 report in which a separate section warned that the world had "suffered rapidly rising costs due to extreme weather-related events since the 1970s".

It suggested a part of this increase was due to global warming and cited the unpublished report, saying: "One study has found that while the dominant signal remains that of the significant increases in the values of exposure at risk, once losses are normalised for exposure, there still remains an underlying rising trend."

The Sunday Times has since found that the scientific paper on which the IPCC based its claim had not been peer reviewed, nor published, at the time the climate body issued its report.

When the paper was eventually published, in 2008, it had a new caveat. It said: "We find insufficient evidence to claim a statistical relationship between global temperature increase and catastrophe losses."

Despite this change the IPCC did not issue a clarification ahead of the Copenhagen climate summit last month. It has also emerged that at least two scientific reviewers who checked drafts of the IPCC report urged greater caution in proposing a link between climate change and disaster impacts — but were ignored.

The claim will now be re-examined and could be withdrawn.

...The academic paper at the centre of the latest questions was written in 2006 by Robert Muir-Wood, head of research at Risk Management Solutions, a London consultancy, who later became a contributing author to the section of the IPCC's 2007 report dealing with climate change impacts. He is widely respected as an expert on disaster impacts.

Muir-Wood wanted to find out if the 8% year-on-year increase in global losses caused by weather-related disasters since the 1960s was larger than could be explained by the impact of social changes like growth in population and infrastructure.

...Muir-Wood was, however, careful to point out that almost all this increase could be accounted for by the exceptionally strong hurricane seasons in 2004 and 2005. There were also other more technical factors that could cause bias, such as exchange rates which meant that disasters hitting the US would appear to cost proportionately more in insurance payouts.

Despite such caveats, the IPCC report used the study in its section on disasters and hazards, but cited only the 1970-2005 results.

The IPCC report said: "Once the data were normalised, a small statistically significant trend was found for an increase in annual catastrophe loss since 1970 of 2% a year." It added: "Once losses are normalised for exposure, there still remains an underlying rising trend."

Muir-Wood's paper was originally commissioned by Roger Pielke, professor of environmental studies at Colorado University, also an expert on disaster impacts, for a workshop on disaster losses in 2006. The researchers who attended that workshop published a statement agreeing that so far there was no evidence to link global warming with any increase in the severity or frequency of disasters. Pielke has also told the IPCC that citing one section of Muir-Wood's paper in preference to the rest of his work, and all the other peer-reviewed literature, was wrong.

He said: "All the literature published before and since the IPCC report shows that rising disaster losses can be explained entirely by social change. People have looked hard for evidence that global warming plays a part but can't find it. Muir-Wood's study actually confirmed that."

More information can be obtained directly from Dr. Roger Pielke Jr.'s blog. It would now appear that the lack of peer review uncovered by the Glaciergate disclosure is indeed widespread throughout the IPCC's 4th Assessment Report.

Despite this Professor Jean-Pascal van Ypersele, a climatologist at the Universite Catholique de Louvain in Belgium, who is vice-chair of the IPCC, said: "We are reassessing the evidence and will publish a report on natural disasters and extreme weather with the latest findings. Despite recent events the IPCC process is still very rigorous and scientific."

I'm not sure who the Professor is now trying to convince more, the increasingly sceptical public or himself.

Glaciergate And The TERI Link

It seems that Climategate wasn't the only scandal to develop over the last few months. The latest involves the rigorous peer-reviewed science (or complete lack thereof) the IPCC applied to its claim that Himalayan glaciers would completely disappear by 2035 or sooner if the current rate of melting – due, of course, to man-made global warming – was to continue. Here is the quote from chapter 10 of the IPCC's 4th Assessment Report:

Glaciers in the Himalaya are receding faster than in any other part of the world (see Table 10.9) and, if the present rate continues, the likelihood of them disappearing by the year 2035 and perhaps sooner is very high if the Earth keeps warming at the current rate. Its total area will likely shrink from the present 500,000 to 100,000 km2 by the year 2035 (WWF, 2005).

This was used at both governmental and non-governmental levels as a reason to justify introducing emissions trading, as can be seen here for both the Garnaut Report and Government white paper responses, and here for the Australian Religious Response to Climate Change as a non-governmental example.

In addition to this new scandal is another involving the relationship of the IPCC's top climate official, Dr Rajendra Pachauri (who I have written about before), with The Energy and Resources Institute (TERI), based in New Delhi – India's most influential private body involved in climate-change issues and renewable energy – and the millions of dollars he is apparently making by combining his role as a top climate official with the business opportunities he can direct TERI's way in that capacity.

Let's first look at the new scandal known as Glaciergate:

From an article (one of many that have now been written on this) in the U.K.'s Express newspaper comes this explanation:

FRESH doubts were cast over controversial global warming theories yesterday after a major climate change argument was discredited.

The International Panel on Climate Change was forced to admit its key claim that Himalayan glaciers would melt by 2035 was lifted from a 1999 magazine article. The report was based on an interview with a little-known Indian scientist who has since said his views were “speculation” and not backed up by research.

It was also revealed that the IPCC’s controversial chairman, Dr Rajendra Pachauri, described as “the world’s top climate scientist”, is a former railway engineer with a PhD in economics and no formal climate science qualifications.

Dr Pachauri was yesterday accused of a conflict of interest after it emerged he has a network of business interests that attract millions of pounds in funding thanks to IPCC policies. One of them, The Energy Research Institute, has a London office and is set to receive up to £10million from British taxpayers over the next five years in the form of grants from the Department for International Development.

Dr Pachauri denies any conflict of interest arising from his various roles.

Yesterday, critics accused the IPCC of boosting the man-made global warming theory to protect a multi-million pound industry.

Climate scientist Peter Taylor said: “I am not surprised by this news. A vast bureaucracy and industry has been built up around this theory. There is too much money in it for the IPCC to let it wither.”

Professor Julian Dowdeswell, a glacier specialist at Cambridge University, said: “The average glacier is 1,000ft thick so to melt one even at 15ft a year would take 60 years. That is a lot faster than anything we are seeing now so the idea of losing it all by 2035 is unrealistically high.”

The IPCC was set up by the UN to ensure world leaders had the best possible scientific advice on climate change. It issued the glacier warning in a benchmark report in 2007 that was allegedly based on the latest research into global warming.

The scientists behind the report now admit they relied on a news story published in the New Scientist journal in 1999. The article was based on a short telephone interview with scientist Syed Hasnain, then based in Delhi, who has since said his views were “speculation”.

The New Scientist report was picked up by the WWF and included in a 2005 paper.

It then became a key source for the IPCC which went further in suggesting the melting of the glaciers was “very likely”.

Yesterday, Professor Murari Lal who oversaw the chapter on glaciers in the IPCC report, said: “If Hasnain says officially that he never asserted this, or that it is a wrong presumption, then I will recommend that the assertion about Himalayan glaciers be removed from future IPCC assessments.”

Last year the Indian government issued its own scientific research rejecting the notion that glaciers were melting so rapidly.

Before the weakness in the IPCC’s research was exposed, Dr Pachauri dismissed the Indian government report as “voodoo science”.

The revelations are the latest crack to appear in the scientific consensus on climate change.

It follows the so-called climate-gate scandal in November last year when leaked emails from the University of East Anglia’s Climatic Research Unit appeared to show scientists fiddling the figures to strengthen the case for man-made climate change.

The scandal prompted critics to suggest that many scientists had a vested interest in promoting climate change because it helped secure more funding for research...

So, instead of a rigorous peer-review process, what we have is a journalist interviewing a scientist who gives a wildly speculative view on a topic. The journalist prints this speculation as fact in New Scientist magazine. This is then picked up by the WWF (note that they are the source cited by the IPCC in the quote given above from the 4th AR), who wrote:

glaciers in the Himalayas are receding faster than in any other part of the world and, if the present rate continues, the livelihood[sic] of them disappearing by the year 2035 is very high

From there we have the IPCC disregarding any rigorous peer-review process and incorporating it as "fact" and "settled science" into their 4th Assessment Report. Yet we were constantly told that the IPCC is the "gold standard" for climate science and that all of its science is peer reviewed. Apparently not! Dr. Joanne Nova has a good summary of this and other areas where the IPCC has missed the golden mark, as can be seen here.

Recall this quote from the original article I referenced in this section:

Dr Pachauri was yesterday accused of a conflict of interest after it emerged he has a network of business interests that attract millions of pounds in funding thanks to IPCC policies.

The original article by Christopher Booker and Richard North of the U.K. Telegraph says:

No one in the world exercised more influence on the events leading up to the Copenhagen conference on global warming than Dr Rajendra Pachauri, chairman of the UN’s Intergovernmental Panel on Climate Change (IPCC) and mastermind of its latest report in 2007.

Although Dr Pachauri is often presented as a scientist (he was even once described by the BBC as “the world’s top climate scientist”), as a former railway engineer with a PhD in economics he has no qualifications in climate science at all.

What has also almost entirely escaped attention, however, is how Dr Pachauri has established an astonishing worldwide portfolio of business interests with bodies which have been investing billions of dollars in organisations dependent on the IPCC’s policy recommendations.

These outfits include banks, oil and energy companies and investment funds heavily involved in ‘carbon trading’ and ‘sustainable technologies’, which together make up the fastest-growing commodity market in the world, estimated soon to be worth trillions of dollars a year.

Today, in addition to his role as chairman of the IPCC, Dr Pachauri occupies more than a score of such posts, acting as director or adviser to many of the bodies which play a leading role in what has become known as the international ‘climate industry’.

It is remarkable how only very recently has the staggering scale of Dr Pachauri’s links to so many of these concerns come to light, inevitably raising questions as to how the world’s leading ‘climate official’ can also be personally involved in so many organisations which stand to benefit from the IPCC’s recommendations.

...The original power base from which Dr Pachauri has built up his worldwide network of influence over the past decade is the Delhi-based Tata Energy Research Institute, of which he became director in 1981 and director-general in 2001. Now renamed The Energy Research Institute, TERI was set up in 1974 by India’s largest privately-owned business empire, the Tata Group, with interests ranging from steel, cars and energy to chemicals, telecommunications and insurance (and now best-known in the UK as the owner of Jaguar, Land Rover, Tetley Tea and Corus, Britain’s largest steel company).

Although TERI has extended its sponsorship since the name change, the two concerns are still closely linked.

In India, Tata exercises enormous political power, shown not least in the way that when it expressed its interests in developing land in the eastern states of Orissa and Jharkhand, it led to the Indian government displacing hundreds of thousands of poor tribal villagers to make way for large-scale iron mining and steelmaking projects.

Initially, when Dr Pachauri took over the running of TERI in the 1980s, his interests centred on the oil and coal industries, which may now seem odd for a man who has since become best known for his opposition to fossil fuels. He was, for instance, a director until 2003 of India Oil, the country’s largest commercial enterprise, and until this year remained as a director of the National Thermal Power Generating Corporation, its largest electricity producer.

In 2005, he set up GloriOil, a Texas firm specialising in technology which allows the last remaining reserves to be extracted from oilfields otherwise at the end of their useful life.

However, since Pachauri became a vice-chairman of the IPCC in 1997, TERI has vastly expanded its interest in every kind of renewable or sustainable technology, in many of which the various divisions of the Tata Group have also become heavily involved, such as its project to invest $1.5 billion (£930 million) in vast wind farms.

Dr Pachauri’s TERI empire has also extended worldwide, with branches in the US, the EU and several countries in Asia. TERI Europe, based in London, of which he is a trustee (along with Sir John Houghton, one of the key players in the early days of the IPCC and formerly head of the UK Met Office) is currently running a project on bio-energy, financed by the EU.

Another project, co-financed by our own Department of Environment, Food and Rural Affairs and the German insurance firm Munich Re, is studying how India’s insurance industry, including Tata, can benefit from exploiting the supposed risks of exposure to climate change. Quite why Defra and UK taxpayers should fund a project to increase the profits of Indian insurance firms is not explained.

Even odder is the role of TERI’s Washington-based North American offshoot, a non-profit organisation, of which Dr Pachauri is president. Conveniently sited on Pennsylvania Avenue, midway between the White House and the Capitol, this body unashamedly sets out its stall as a lobbying organisation, to “sensitise decision-makers in North America to developing countries’ concerns about energy and the environment”.

TERI-NA is funded by a galaxy of official and corporate sponsors, including four branches of the UN bureaucracy; four US government agencies; oil giants such as Amoco; two of the leading US defence contractors; Monsanto, the world’s largest GM producer; the WWF (the environmentalist campaigning group which derives much of its own funding from the EU) and two world leaders in the international ‘carbon market’, between them managing more than $1 trillion (£620 billion) worth of assets.

All of this is doubtless useful to the interests of Tata back in India, which is heavily involved not just in bio-energy, renewables and insurance but also in ‘carbon trading’, the worldwide market in buying and selling the right to emit CO2. Much of this is administered at a profit by the UN under the Clean Development Mechanism (CDM) set up under the Kyoto Protocol, which the Copenhagen treaty was designed to replace with an even more lucrative successor.

...It is one of these deals, reported in last week’s Sunday Telegraph, which is enabling Tata to "mothball" nearly three million tonnes of steel production at its Corus plant in Redcar, while opening a new plant in Orissa with a similar scale of production, gaining in the process a potential £1.2 billion in ‘carbon credits’ (while putting 1,700 people on Teesside out of work).

More than three-quarters of the world ‘carbon’ market benefits India and China in this way. India alone has 1,455 CDM projects in operation, worth $33 billion (£20 billion), many of them facilitated by Tata – and it is perhaps unsurprising that Dr Pachauri also serves on the advisory board of the Chicago Climate Exchange, the largest and most lucrative carbon-trading exchange in the world, which was also assisted by TERI in setting up India’s own carbon exchange.

But this is peanuts compared to the numerous other posts to which Dr Pachauri has been appointed in the years since the UN chose him to become the world’s top ‘climate-change official’.

In 2007, for instance, he was appointed to the advisory board of Siderian, a San Francisco-based venture capital firm specialising in ‘sustainable technologies’, where he was expected to provide the Fund with ‘access, standing and industrial exposure at the highest level’.

In 2008 he was made an adviser on renewable and sustainable energy to the Credit Suisse bank and the Rockefeller Foundation. He joined the board of the Nordic Glitnir Bank, as it launched its Sustainable Future Fund, looking to raise funding of £4 billion. He became chairman of the Indochina Sustainable Infrastructure Fund, whose CEO was confident it could soon raise £100 billion.

In the same year he became a director of the International Risk Governance Council in Geneva, set up by EDF and E.On, two of Europe’s largest electricity firms, to promote ‘bio-energy’. This year Dr Pachauri joined the New York investment fund Pegasus as a ‘strategic adviser’, and was made chairman of the advisory board to the Asian Development Bank, strongly supportive of CDM trading, whose CEO warned that failure to agree a treaty at Copenhagen would lead to a collapse of the carbon market.

The list of posts now held by Dr Pachauri as a result of his new-found world status goes on and on. He has become head of Yale University’s Climate and Energy Institute, which enjoys millions of dollars of US state and corporate funding. He is on the climate change advisory board of Deutsche Bank. He is Director of the Japanese Institute for Global Environmental Strategies and was until recently an adviser to Toyota Motors. Recalling his origins as a railway engineer, he is even a policy adviser to SNCF, France’s state-owned railway company.

Meanwhile, back home in India, he serves on an array of influential government bodies, including the Economic Advisory Committee to the prime minister, holds various academic posts and has somehow found time in his busy life to publish 22 books.

...But the real question mark over TERI’s director-general remains over the relationship between his highly lucrative commercial jobs and his role as chairman of the IPCC.

TERI have, for example, become a preferred bidder for Kuwaiti contracts to clean up the mess left by Saddam Hussein in their oilfields in 1991. The $3 billion (£1.9 billion) cost of the contracts has been provided by the UN. If successful, this would be the tenth time TERI have benefited from a contract financed by the UN.

Certainly no one values the services of TERI more than the EU, which has included Dr Pachauri’s institute as a partner in no fewer than 12 projects designed to assist in devising the EU’s policies on mitigating the effects of the global warming predicted by the IPCC.

In a very convenient and no doubt profitable arrangement, Dr. Pachauri and TERI are now "cleaning up" around the world. There are even reports that Dr. Pachauri used his TERI email account to conduct official IPCC business, furthering the calls of "conflict of interest."

Dr. Pachauri responded to the Booker/North article as can be seen in this follow up article by Christopher Booker:

In a series of press and television interviews, Dr Pachauri described our report as "a pack of lies". He accused us of being part of that same "powerful vested interest" responsible for "Climategate", the emails and other documents leaked from the East Anglia Climatic Research Unit, which revealed the methods used by the small group of scientists at the heart of the IPCC to manipulate temperature data to show that the earth has been warming further than is justified by the evidence.

When asked whether he intended to take legal action over our article, Dr Pachauri replied that he hadn't yet made up his mind.

In typical alarmist fashion, note Dr. Pachauri's instinctive reaction to criticism: to deny and abuse, but not to fight back with facts.

Sunday, January 17, 2010

Those Who Control The Information Try To Control The Debate.

The rise and exposure of Climategate did more than just show the email correspondence of a few climate scientists who were determined to shut down dissent, manipulate the peer-review process, hide or destroy information requested under FOI, hide their mistakes, etc. It also exposed the bias of information sources like Wikipedia (as we have seen before), as well as the bias of Google as a search engine. National Post journalist Lawrence Solomon investigated these phenomena, and this is what he found. In the first article, entitled Wikipedia's Climate Doctor, we find:

The Climategate Emails describe how a small band of climatologists cooked the books to make the last century seem dangerously warm.

...The Climategate Emails reveal something else, too: the enlistment of the most widely read source of information in the world — Wikipedia — in the wholesale rewriting of this history.

...But the UN’s official verdict that the Medieval Warm Period had not existed did not erase the countless schoolbooks, encyclopedias, and other scholarly sources that claimed it had. Rewriting those would take decades, time that the band members didn’t have if they were to save the globe from warming. Instead, the band members turned to their friends in the media and to the blogosphere, creating a website called RealClimate.org. “The idea is that we working climate scientists should have a place where we can mount a rapid response to supposedly ‘bombshell’ papers that are doing the rounds” in aid of “combating dis-information,” one email explained, referring to criticisms of the hockey stick and anything else suggesting that temperatures today were not the hottest in recorded time.

One person in the nine-member Realclimate.org team — U.K. scientist and Green Party activist William Connolley — would take on particularly crucial duties. Connolley took control of all things climate in the most used information source the world has ever known – Wikipedia. Starting in February 2003, just when opposition to the claims of the band members were beginning to gel, Connolley set to work on the Wikipedia site. He rewrote Wikipedia’s articles on global warming, on the greenhouse effect, on the instrumental temperature record, on the urban heat island, on climate models, on global cooling. On Feb. 14, he began to erase the Little Ice Age; on Aug. 11, the Medieval Warm Period. In October, he turned his attention to the hockey stick graph. He rewrote articles on the politics of global warming and on the scientists who were skeptical of the band. Richard Lindzen and Fred Singer, two of the world’s most distinguished climate scientists, were among his early targets, followed by others that the band especially hated, such as Willie Soon and Sallie Baliunas of the Harvard-Smithsonian Center for Astrophysics, authorities on the Medieval Warm Period.

All told, Connolley created or rewrote 5,428 unique Wikipedia articles. His control over Wikipedia was greater still, however, through the role he obtained at Wikipedia as a website administrator, which allowed him to act with virtual impunity. When Connolley didn’t like the subject of a certain article, he removed it — more than 500 articles of various descriptions disappeared at his hand. When he disapproved of the arguments that others were making, he often had them barred — over 2,000 Wikipedia contributors who ran afoul of him found themselves blocked from making further contributions. Acolytes whose writing conformed to Connolley’s global warming views, in contrast, were rewarded with Wikipedia’s blessings.

In these ways, Connolley turned Wikipedia into the missionary wing of the global warming movement. The Medieval Warm Period disappeared, as did criticism of the global warming orthodoxy. With the release of the Climategate Emails, the disappearing trick has been exposed. The glorious Medieval Warm Period will remain in the history books, perhaps with an asterisk to describe how a band of zealots once tried to make it disappear.

This was followed up by the article entitled Climategate at Wikipedia, in which he further highlights:

Since my Saturday column described how Wikipedia editors have been feverishly rewriting climate history over much of the decade, fair-minded Wikipedians have been doing their best to correct the record. No sooner do they remove gross distortions, however, than the distortions are replaced. William Connolley, a Climategate member and Wikipedia’s chief climate change propagandist, remains as active as ever.


How does Wikipedia work, and how do Connolley and his co-conspirators exercise control? Take Wikipedia’s page for the Medieval Warm Period as an example. In the three days following my column’s appearance, this page alone was changed some 50 times in battles between Connolley’s crew and those who want a fair presentation of history.

One of the battles concerns the so-called hockey stick graphs, which purport to show that temperatures over the last 2000 years were fairly stable until the last century, when temperatures rose rapidly to today’s supposedly dangerous and unprecedented levels. In these graphs, the Medieval Warm Period – a period of several centuries around the year 1000 – appears to be a modest bump along the way. Before the hockey stick graphs began to be published about a decade ago, scientists everywhere – including those associated with the UN itself – viewed the Medieval Warm Period as much hotter than today. Rather than appearing as a modest bump compared to today’s high temperatures, the Medieval Warm Period looked more like a mountain next to the molehill that is today’s temperature increase.

The hockey stick graphs led to an upheaval in scientific understanding when the UN reversed itself and declared them bona fide. Soon after, the hockey stick graphs were shown to be bogus by a blue-chip panel of experts assembled by the US Congress. The Climategate Emails confirm the blue-chip panel’s assessment – we now know that Climategate scientists themselves doubted the reliability of the hockey stick graphs.

With the hockey stick graphs so thoroughly discredited, you’d think they would become a footnote to a discussion of the Medieval Warm Period, or an object of amusement and curiosity. But no, on the Wikipedia page for the Medieval Warm Period, the hockey stick graph appears prominently at the top, as if it is settled science.

Because the hockey stick graph has become an icon of deceit and in no way an authority worthy of being cited, fair-minded Wikipedians tried to remove the graph from the page, as can be seen here. Exactly two minutes later, one of Connolley’s associates replaced the graph, restoring the page to Connolley’s original version, as seen here.

Battles like this occurred on numerous fronts, until just after midnight on Dec 22, when Connolley reimposed his version of events and, for good measure, froze the page to prevent others from making changes -- and to prevent the public, even in two-minute windows, from realizing that today’s temperatures look modest in comparison to those in the past. In the world of Wikipedia, as seen here, the hockey stick graph, and Connolley’s version of history, still rules.


This bias by Wikipedia was not isolated. In fact, the Google readjustment (read: hiding) of the number of articles referring to Climategate became known as "Googlegate," such was the level of interference. In Solomon's article Better off with Bing he writes:

This week, Google announced an end to its long-standing collaboration with the Chinese Communists — it will no longer censor users inside China.


That’s good of it. Maybe Google will now also stop using its search engine to censor the rest of us, in the Western countries.

Search for “Googlegate” on Google and you’ll get a paltry result (my result yesterday was 29,300). Search for “Googlegate” on Bing, Microsoft’s search engine competitor, and the result numbers an eye-popping 72.4 million. If you’re a regular Google user, as opposed to a Bing user, you might not even know that “Googlegate” has been a hot topic for years in the blogosphere — that’s the power that comes of being able to control information.

Saturday, January 16, 2010

Cloud Mystery Video Series

In an earlier post I made reference to a series of videos called The Cloud Mystery and showed you the preview. I have since discovered the full six-part series and will post it here for you to view:

Part 1



Part 2



Part 3



Part 4



Part 5



Part 6


They are thought-provoking and worth a watch. They are also a companion to this report by Dr. Roy Spencer of UAH, entitled: Clouds Dominate CO2 as a Climate Driver Since 2000. In it he says:

The main point I am making here is that, no matter whether you assume the climate system is sensitive or insensitive, our best satellite measurements suggest that the climate system is perfectly capable of causing internally-generated radiative forcing larger than the “external” forcing due to increasing atmospheric carbon dioxide concentrations. Low cloud variations are the most likely source of this internal radiative forcing.

... If one additionally entertains the possibility that there is still considerable “warming still in the pipeline” left from increasing CO2, as NASA’s Jim Hansen claims, then the need for some natural cooling mechanism to offset and thus produce no net warming becomes even stronger. Either that, or the climate system is so insensitive to increasing CO2 that there is essentially no warming left in the pipeline to be realized. (The less sensitive the climate system, the faster it reaches equilibrium when forced with a radiative imbalance.)

Any way you look at it, the evidence for internally-forced climate change is pretty clear. Based upon this satellite evidence alone, I do not see how the IPCC can continue to ignore internally-forced variations in the climate system. The evidence for its existence is there for all to see, and in my opinion, the IPCC’s lack of diagnostic skill in this matter verges on scientific malpractice.