I’m not surprised that the MSM is nowhere to be found on ClimateGate, but where are the scientists? Their journals and magazines? Why are they silent, especially after this latest bombshell of a discovery:
One can only imagine the angst suffered daily by the co-conspirators, who knew full well that the “Documents” sub-folder of the CRU FOI2009 file contained more than enough probative program source code to unmask CRU’s phantom methodology.
In fact, there are hundreds of IDL and FORTRAN source files buried in dozens of subordinate sub-folders. And many do properly analyze and chart maximum latewood density (MXD), the growth parameter commonly utilized by CRU scientists as a temperature proxy, from raw or legitimately normalized data. Ah, but many do so much more.
Skimming through the often spaghetti-like code, the number of programs which subject the data to a mixed-bag of transformative and filtering routines is simply staggering. Granted, many of these “alterations” run from benign smoothing algorithms (e.g. omitting rogue outliers) to moderate infilling mechanisms (e.g. estimating missing station data from that of those closely surrounding). But many others fall into the precarious range between highly questionable (removing MXD data which demonstrate poor correlations with local temperature) to downright fraudulent (replacing MXD data entirely with measured data to reverse a disorderly trend-line).
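To make that spectrum concrete, here is a minimal Python sketch of the three kinds of routine described above: benign outlier removal, moderate neighbor-based infilling, and the questionable correlation filter. This is my own invented illustration, not CRU’s code; every function name and threshold here is hypothetical.

```python
def drop_outliers(series, threshold=3.0):
    """Benign smoothing: omit values more than `threshold` standard
    deviations from the mean -- routine quality control."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    return [x for x in series if abs(x - mean) <= threshold * std]

def infill_missing(series):
    """Moderate infilling: estimate a missing (None) value from its
    immediate neighbours -- defensible when disclosed."""
    out = list(series)
    for i, x in enumerate(out):
        if x is None and 0 < i < len(out) - 1:
            out[i] = (out[i - 1] + out[i + 1]) / 2.0
    return out

def keep_if_correlated(proxy_sets, temps, min_r=0.5):
    """The questionable step: discard whole proxy series that correlate
    poorly with local temperature -- which pre-selects agreement."""
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a) ** 0.5
        vb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (va * vb)
    return [p for p in proxy_sets if pearson(p, temps) >= min_r]
```

The first two routines alter presentation without prejudging the answer; the third quietly removes any series that disagrees with the thermometer record, which is exactly why it sits at the questionable end of the range.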
In fact, workarounds for the post-1960 “divergence problem”, as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer’s comment I ran across warns that the particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
Clamoring alarmists can and will spin this until they’re dizzy. The ever-clueless mainstream media can and will ignore this until it’s forced upon them as front-page news, and then most will join the alarmists on the denial merry-go-round.
But here’s what’s undeniable: if a divergence exists between measured temperatures and those derived from dendrochronological data after (circa) 1960, then discarding only the post-1960 figures is disingenuous, to say the least. The very existence of a divergence betrays a potentially serious flaw in the process by which temperatures are reconstructed from tree-ring density. If the proxy is bogus beyond a set threshold, any honest man of science would instinctively question its integrity prior to that boundary. And only the lowliest would apply a hack in order to produce a desired result.
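The logic of that argument can be seen with a toy example (entirely made-up numbers, not CRU data): if a proxy tracks measured temperature until about 1960 and then diverges, plotting only the pre-1960 portion makes the proxy look like a faithful thermometer while concealing the very evidence that should call its earlier values into question.

```python
# Invented decadal data for illustration only.
years    = list(range(1900, 2000, 10))
measured = [0.0, 0.1, 0.1, 0.2, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
proxy    = [0.0, 0.1, 0.1, 0.2, 0.2, 0.3, 0.2, 0.1, 0.0, -0.1]  # diverges after 1960

# The divergence over the full record:
divergence = [abs(m - p) for m, p in zip(measured, proxy)]

# Truncating at 1960 hides it entirely:
max_div_shown = max(abs(m - p)
                    for y, m, p in zip(years, measured, proxy) if y < 1960)
max_div_real  = max(divergence)
# max_div_shown is 0.0 -- a "perfect" proxy -- while the real maximum
# divergence is 0.8, never seen by the reader of the truncated chart.
```

Nothing in the truncated chart tells the reader that the reconstruction method fails a known test; that is the omission the passage above objects to.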
And to do so without declaring as such in a footnote on every chart in every report in every study in every book in every classroom on every website that such a corrupt process is relied upon is not just a crime against science, it’s a crime against mankind.
Indeed, miners of the CRU folder have unearthed dozens of email threads and supporting documents revealing much to loathe about this cadre of hucksters and their vile intentions. This veritable goldmine has given us tales ranging from evidence destruction to spitting on the Freedom of Information Act on both sides of the Atlantic. But the now irrefutable evidence that alarmists have indeed been cooking the data for at least a decade may just be the most important strike in human history.
That’s not the only “correction” that was applied to the data… check this out:
In two other programs, briffa_Sep98_d.pro and briffa_Sep98_e.pro, the “correction” is bolder by far. The programmer (Keith Briffa?) entitled the “adjustment” routine “Apply a VERY ARTIFICAL correction for decline!!” And he or she wasn’t kidding. Now, IDL is not a native language of mine, but its syntax is similar enough to others I’m familiar with, so please bear with me while I get a tad techie on you.
Here’s the “fudge factor” (notice the brash SOB actually called it that in his inline comment):
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor
These two lines of code establish a 20-element array (yrloc) comprised of the year 1400 (a base year, though I’m not sure why it’s needed here) and the 19 years from 1904 to 1994 in half-decade increments. Then the corresponding “fudge factor” (from the valadj array) is applied to each interval. As you can see, not only are temperatures biased to the upside later in the century (though certainly prior to 1960), but a few mid-century intervals are biased slightly lower. That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline in their MXD after 1960 (or earlier), CRU’s “divergence problem” also includes a minor false incline after 1930.
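For readers who don’t speak IDL, here is a rough Python rendering of those two lines (my own translation, not the original program). IDL’s findgen(19) generates 0 through 18, so yrloc is the year 1400 followed by 1904 through 1994 in 5-year steps; as best I can tell, the original program then interpolates between these anchor points before adding the result to the MXD series.

```python
# Translation of the two IDL lines above -- not the original code.
yrloc = [1400] + [1904 + 5 * i for i in range(19)]   # findgen(19)*5.+1904

valadj = [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]
valadj = [v * 0.75 for v in valadj]   # the *0.75 scaling from the source

# Pair each anchor year with its adjustment in degrees.
adjustments = dict(zip(yrloc, valadj))
# e.g. 1929 is nudged down by 0.1875, while 1994 is nudged up by 1.95.
```

Laying the anchors out this way makes the shape of the “correction” obvious: zero through the early century, slightly negative around 1924–1944, then sharply positive from mid-century onward, plateauing at +1.95 for 1974–1994.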
Fudge factor indeed… a factor put in to show warming where none existed.
I have seen inklings of how bad the CRU code is and how it produces just garbage. It defies the garbage-in, garbage-out paradigm and moves to truth-in, garbage-out. I get the feeling you could feed this software random numbers and a hockey stick would come out the back end. There are no diurnal corrections for temperature readings; there are all sorts of corrupted, duplicated, and stale data; there are filters to keep out data that tell the wrong story; and there are create_fiction subroutines which create raw measurements out of thin air when needed. There are modules which cannot be run for the full temperature record because of special code used to ‘hide the decline’.
This is a smoking gun… but the media is nowhere to be found. Now that our media takes it upon itself to provide cover for a certain political party and to push an agenda instead of actually reporting the news, is it any wonder the MSM is dying a slow death? Look at how Ed Driscoll, writing about the ACORN story, compares the MSM’s treatment of the email release from these global warming hacks with other releases of sensitive information:
…Orrin Judd spots this staggering moment of hypocrisy from the New York Times’ Andrew C. Revkin of their “Dot Earth” blog on Friday:
The documents appear to have been acquired illegally and contain all manner of private information and statements that were never intended for the public eye, so they won’t be posted here.
And they don’t contain any obvious state military secrets either, unlike, say, the Pentagon Papers during the Vietnam War or, more recently, the secrets of the War on Terror, or any of a number of other leaked documents the Times has cheerfully rushed to print.
Back in 2006, when his paper disclosed the previously confidential details of the SWIFT program, which was designed to trace terrorists’ financial assets, New York Times executive editor Bill Keller said on CBS’s Face the Nation, “one man’s breach of security is another man’s public relations.” Of course, much like the rest of the media circling the wagons with ACORN, it’s not at all surprising that the Times circles the wagons when it’s necessary to save the public face of their fellow liberals.
Incidentally, Tom Maguire explains the perfect way to square the circle:
If Hannah Giles and James O’Keefe are done tormenting ACORN maybe they can figure out how to pose as underaged climate researchers…
If I could, I would only publish emails and documents that were never meant to see the light of day — though, unlike the New York Times, I draw the line at jeopardizing the lives of American troops rather than jeopardizing the contrived “consensus” on global warming.
And of course, the Times has those priorities exactly reversed. But then, for the Gray Lady, small government Republicans are “Stalinists”, but actual totalitarian governments are worthy of emulation and respect.
Curious how the release of sensitive documents is a-ok when it can fit the MSM’s agenda eh?