We’re All Neocons Now

I haven’t had time to dissect the speech in real time, but I think that’s the headline, even with the attempted slams at the Bush administration. I could say a lot of other things, like his continuing speech quirk about Pahkeestahn, versus Afghanistan (as in Laurel). The teleprompter apparently doesn’t do phonics…

But I think that’s the headline.

He seems to have finally learned that it’s a lot harder to govern than campaign.

[Update a few minutes later]

Links to more thoughts. I’m sure I’ll have some as well, after seeing the transcript. That’s always the best way to evaluate The One’s speeches. And politicians’ in general, of course…

It’s Really Quite Simple

I think I’ve found the pseudocode for Mann’s temperature charts:

input hockey_stick array
input year_data array
For each year (1000 - 2009) {
   while (year_data_of_year less than hockey_stick_of_year) {
      year_data_of_year += 0.1 degrees
   }
   plot year_data_of_year
}

See, nothing to it. Poor Harry wouldn’t have had so much frustration if he’d just stuck with the script.
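For the morbidly curious, the gag runs just fine as actual Python. The arrays here are made-up illustrative numbers, not real temperature records, and all the names are invented for the joke:

```python
def adjust(year_data, hockey_stick, step=0.5):
    """Nudge each year's datum upward until it matches the curve.

    Satirical sketch of the pseudocode above; the inputs are
    hypothetical, not actual climate data.
    """
    adjusted = []
    for datum, target in zip(year_data, hockey_stick):
        # Keep "correcting" the observation until it fits the narrative.
        while datum < target:
            datum += step
        adjusted.append(datum)
    return adjusted

# Two toy years: one below the curve, one already on it.
print(adjust([0.0, 1.0], [1.5, 1.0]))  # prints [1.5, 1.0]
```

Note that data at or above the target are left alone: the adjustment only ever goes in one direction, which is rather the point.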

Here We Go Again

Remember 2000, when everyone was partying in 1999 as though it were the end of the century and millennium, even though it had another year to go? Well, I was reading the American Airlines magazine this morning, and the monthly column from the airline president said that it was his last of the decade. And of course, Newsweek says that it was a “decade from Hell” (presumably because much of it was presided over by the BusHitler), also implying that it comes to a close at the end of the month. I’m not going to go through the explanation again, but the decade doesn’t end until a year from now.

A Working Scientist’s View

Thoughts on Climaquiddick from Derek Lowe:

I have deep sympathy for the fellow who tried to reconcile the various poorly documented and conflicting data sets and buggy, unannotated code that the CRU has apparently depended on. And I can easily see how this happens. I’ve been on long-running projects, especially some years ago, where people start to lose track of which numbers came from where (and when), where the underlying raw data are stored, and the history of various assumptions and corrections that were made along the way. That much is normal human behavior. But this goes beyond that.

Those of us who work in the drug industry know that we have to keep track of such things, because we’re making decisions that could eventually run into the tens and hundreds of millions of dollars of our own money. And eventually we’re going to be reviewed by regulatory agencies that are not staffed with our friends, and who are perfectly capable of telling us that they don’t like our numbers and want us to go spend another couple of years (and another fifty or hundred million dollars) generating better ones for them. The regulatory-level lab and manufacturing protocols (GLP and GMP) generate a blizzard of paperwork for just these reasons.

But the stakes for climate research are even higher. The economic decisions involved make drug research programs look like roundoff errors. The data involved have to be very damned good and convincing, given the potential impact on the world economy, through both the possible effects of global warming itself and the effects of trying to ameliorate it. Looking inside the CRU does not make me confident that their data come anywhere close to that standard…

But why should we pay any attention to him? He is, after all, one of those Evil Scientists™ in the pay of Big Business, not a noble one trying to save the planet (with millions of dollars in government and left-wing grants).

As a commenter notes, the biggest casualty of this episode is the credibility of science itself. But if it saves us from those trying to save the planet from us, perhaps it’s worth the cost, if it can be regained.

Too Busy To Blog

I’m heading back to LA in the morning from Denver, but in the meantime, if you were wondering who was responsible for Climaquiddick, wonder no more — it was the oil companies. So sayeth the head of the IPCC. But don’t worry, he has some reassuring words for us:

Dr Pachauri dismissed the suggestion that biased research had crept into the IPCC’s most recent report on the science of climate change. A complex system of checks and balances was in place to prevent bias being insinuated into the panel’s work, he said.

Well, that’s certainly a relief. It probably works like those “layers of fact checkers and editors” at the LA Times.

How Widespread Is The Damage?

I’m certainly not familiar with the literature, but I’m sure that a lot of people out there are, and I hope that they’re starting to survey just how far-reaching the destruction from the recent revelations out of East Anglia and Happy Valley is to the “settled science” of climate change. In theory, someone could put together a tree of citation dependencies, and see how many of the existing papers depend on what we now know to be bogus data and models, whether directly or at second or third generation. How much original research is out there that isn’t either derivative of this flawed analysis, or similarly “pushed” to match it through peer and other pressure?
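The citation-tree idea is straightforward to sketch in code. Here is a minimal version over a toy graph; every paper and source name below is invented for illustration, not drawn from any real bibliography:

```python
# Toy citation graph: each paper maps to the sources it cites.
# All names are hypothetical.
cites = {
    "paper_A": ["cru_dataset"],
    "paper_B": ["paper_A"],
    "paper_C": ["paper_B", "independent_data"],
    "paper_D": ["independent_data"],
}

def tainted(graph, bad_sources):
    """Return every paper that depends, directly or transitively,
    on a compromised source."""
    bad = set(bad_sources)
    # Iterate to a fixed point: keep sweeping until no new
    # dependents turn up.
    changed = True
    while changed:
        changed = False
        for paper, refs in graph.items():
            if paper not in bad and any(r in bad for r in refs):
                bad.add(paper)
                changed = True
    return bad - set(bad_sources)

print(sorted(tainted(cites, ["cru_dataset"])))
# prints ['paper_A', 'paper_B', 'paper_C']
```

In this toy example, paper_D survives because its only input is untainted; paper_C falls even though it also cites clean data, since one bad input is enough to put its conclusions in question.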

Until we have the answer to this question, I’m not going to take seriously people who tell me that the vast majority of the work continues to confirm climatic disaster if we don’t immediately put into effect measures to wreck the global economy. And I hope that we can have an answer before Copenhagen. Not that any of the scientific illiterates at that meeting, including Carol Browner, will care.

[Update a few minutes later]

There are a lot of similar thoughts in the comments on a related post by Jonathan Adler.

One other thought, per those comments. I agree with this:

My own sense from reading the emails and the code is perhaps not so much that there was active fraud but rather that there was just a strong pressure to conform results to the desired output combined with a poor understanding of statistical and software methodology.

People will invariably fool themselves if they can. Actually most of the scientific method arguably is designed to prevent people from fooling themselves — from seeing spurious patterns in noisy data. People inexpert in modeling large systems or in the dangers of statistical modeling not only will always find spurious patterns, but will actually believe the patterns exist.

Here, once a certain fairly small critical mass of scientists citing one another’s papers and voting one another grant money is reached, it’s not realistic to expect them to see the problems with their data. Their computer code shows they are desperately trying to get answers they want and need, but they just don’t have the software skills, or statistics skills, or knowledge of large-scale data modeling to do it reliably. And they don’t really want to know either.

Was there some fraud involved? I’m not so sure this is fraud in the classical sense. I think it is more a set of institutional incentives that force researchers to publish or perish, to win grant money or leave academia: the researchers remaining have a certain mercurial stance, combined with a love of the topic but poor statistical analysis and software skills. It’s very easy to understand how they could come to believe they are seeing patterns that are not there.

I’ve called them charlatans, but that’s too harsh. I think they’re true believers in their new religion. But what angers me is when they and their defenders accuse me of being “anti-science” (even sometimes to the degree of lumping me and others in with creationists) when it is they who abandoned science, even if they don’t realize it.

[Sunday morning update]

Mann is going to be investigated by Penn State. As the blogger notes, will it be a real investigation, or a whitewash?

[Sunday evening update]

When you’ve lost the geeks, you’ve lost the war:

Along with a hoard of emails, some source code for the computer climate models was also hacked and released to the public — and the source code is an unusable mess. It doesn’t take expertise in climatology to look at source code and determine that the code is garbage. There are many more geeks with software expertise than with climate expertise, and the geek community will go through every line of code and likely conclude that the computer models are so flawed that any conclusions drawn on them are without merit.

Despite the liberal tendencies of many geeks, I believe that the source code evidence will be insurmountable for most. Some will continue to cling to AGW because of a devotion to left-wing politics, but the majority of geeks will abandon their belief, and that abandonment by geeks will truly spell the end for AGW.

I wonder how long it will be before we reach the tipping point at which no one will admit to having been fooled by this nonsense? After the war, it was hard to find a Frenchman who wasn’t in the resistance.

Biting Commentary about Infinity…and Beyond!