Wednesday, July 11, 2007

Global Warming: Rutherford-Appleton vs. Duke

The BBC reports a new study disputing any link between the sun and modern climate change. The organization interviewed one of the researchers, Mike Lockwood of the UK's Rutherford-Appleton Laboratory, who uttered the dangerous words: "This should settle the debate."

Calling anything about this debate settled (beyond the plain facts that the earth is warming and that human activity has played a part in it to some degree) is just asking for trouble. And in looking at the early commentary on the story, I found, via Smoke If You Got 'Em, some 2005 research from Duke University that would seem to poke some holes in the study as described in the BBC story.

The major underpinning of the new study is that the sun's output has decreased over the past twenty years--since about 1985--and thus doesn't mesh with rising temperatures. However, in examining the report on the Duke study, we find some problems with that assumption:

According to Scafetta, records of sunspot activity suggest that solar output has been rising slightly for about 100 years. However, only measurements of what is known as total solar irradiance gathered by satellites orbiting since 1978 are considered scientifically reliable, he said.

But observations over those years were flawed by the space shuttle Challenger disaster, which prevented the launching of a new solar output detecting satellite called ACRIM 2 to replace a previous one called ACRIM 1.

That resulted in a two-year data gap that scientists had to rely on other satellites to try to bridge. "But those data were not as precise as those from ACRIM 1 and ACRIM 2," Scafetta said in an interview.

Nevertheless, several research groups used the combined satellite data to conclude that there was no increased heating from the Sun to contribute to the global surface warming observed between 1980 and 2002, the authors wrote in their paper.

Lacking a standardized, uninterrupted data stream measuring any rising solar influence, those groups thus surmised that all global temperature increases measured during those years had to be caused by heat-trapping "greenhouse" gases such as carbon dioxide, introduced into Earth's atmosphere by human activities, their paper added.

But a 2003 study by a group headed by Columbia's Richard Willson, principal investigator of the ACRIM experiments, challenged the previous satellite interpretations of solar output. Willson and his colleagues concluded, rather, that their analysis revealed a significant upward trend in average solar luminosity during the period.

If the new study is wrong about what the sun's been doing, then it's little more than a waste of paper, since its methodology is a direct comparison of solar output and cosmic ray intensity to the average global surface temperature. (It's probably also worth noting that if the researchers are working with measurements of solar output from the last 40 years, as the BBC story claims, then a little over a quarter of their data came from before 1978. Go back and reread the first paragraph of the above quote to see the significance of that.)
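To spell out that back-of-the-envelope point: if the record runs 40 years back from 2007, it begins around 1967, and everything before the 1978 satellite era falls into the stretch Scafetta calls unreliable. A quick sketch of the arithmetic (the years here follow the dates given above, not anything from the study itself):

```python
# Rough check: what fraction of a 40-year solar record predates
# the satellite measurements that began in 1978?
record_length = 40
record_start = 2007 - record_length        # roughly 1967
pre_satellite_years = 1978 - record_start  # years before reliable data

fraction = pre_satellite_years / record_length
print(f"{fraction:.1%} of the record predates the 1978 satellites")
```

That works out to a bit over a quarter of the data resting on the pre-satellite sunspot proxies.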

The Duke researchers, using the Columbia group's estimates of solar output as the base for their calculations, concluded that the sun was responsible for at least 10-30% of the increase in the planet's surface temperature between 1980 and 2002.

Who's right? I'm hardly in a position to say one way or the other. I am, however, in a position to say that the debate is far from settled.
