have put an end to the accusations once and for all. One would be wrong.
Barton Bites Back
The Wegman Report, commissioned by Joe Barton and published several weeks after the NAS report,
seemed a transparent effort to further spread the attacks against our work. It uncritically repeated the
old and tired McIntyre and McKitrick claim that the hockey stick was an artifact of the conventions
used in a statistical (PCA) analysis, while ignoring various already published peer reviewed papers
that refuted that claim. The more extensive and authoritative NAS review, for example, had
specifically dismissed the notion that PCA conventions had any substantial impact on our findings. As
Bloomfield had put it at the NAS press conference, “the committee, while finding that the issues are
real, [found] they had a minimal effect on the final reconstruction.”
The only thing the Wegman Report offered that was actually new was a dubious “network
analysis” of my publishing history with other scientists. Wegman used that analysis to attempt to
demonstrate that there was an ostensible conspiracy in climate science, a tight-knit cabal of coauthors
and peer reviewers, with me at the center. It was like “Six Degrees of Kevin Bacon” applied to
climate scientists. There is a related concept in mathematics, known as the “Erdős number,” which
has become an informal measure of one's academic status. Paul Erdős was an eccentric, extremely
productive mathematician who coauthored papers with a large number of other mathematicians.
The Erdős number represents an author's degree of separation from Erdős in his or her published
work. If you published with Erdős, you have an Erdős number of one. If you published with someone
who published with Erdős, it's two. And so on. (My Erdős number appears to be four.25) As a result
of the Wegman claims, there was now some discussion in the blogosphere of a new measure—the
“Mann number.” 26
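The Erdős number described above is simply a shortest-path distance on a coauthorship graph. As a rough sketch of the idea (the graph and names here are entirely made up for illustration), a breadth-first search yields the degree of separation:

```python
from collections import deque

def collaboration_number(coauthors, root, author):
    """Shortest coauthorship distance from `author` to `root`
    (e.g. Erdős), found by breadth-first search. Returns None
    if no chain of coauthorship connects the two."""
    if author == root:
        return 0
    seen = {root}
    queue = deque([(root, 0)])
    while queue:
        person, dist = queue.popleft()
        for peer in coauthors.get(person, ()):
            if peer == author:
                return dist + 1
            if peer not in seen:
                seen.add(peer)
                queue.append((peer, dist + 1))
    return None

# A toy, hypothetical coauthorship graph:
graph = {
    "Erdos": ["A"],
    "A": ["Erdos", "B"],
    "B": ["A", "C"],
    "C": ["B"],
}
print(collaboration_number(graph, "Erdos", "C"))  # 3: C -> B -> A -> Erdos
```

The same search, rooted at any well-connected author, would produce the “Mann number” style measure mentioned above; the point in the text is that a dense collaboration network is the normal result of multidisciplinary work, not evidence of a cabal.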
It might have seemed amusing, were it not for the thinly veiled accusation at the center of
Wegman's network analysis and the thorough lack of understanding of how modern science is
conducted embodied in the employment of that analysis. Climate science, like many multidisciplinary
fields, requires broad collaboration with researchers across many areas. Any well-published climate
scientist would show a wide-ranging pattern of connection with other researchers in the field. While I
was flattered that Wegman and company seemed to think that I was at the nexus of climate research,
the same type of analysis would have shown a very similar pattern for any of a large number of
climate scientists. Wegman's analysis erroneously conflated our coauthors (who were known) with
our peer reviewers (who, given that they are anonymous, were not). In fact, the two groups likely
overlap little if at all: Most editors avoid soliciting peer reviews from an author's collaborators.
Wegman's reasoning—that the network analysis revealed me to have been at the center of the climate
research agenda for the past decade, thus somehow undermining any independent review of our
findings—also defied the very principle of causality. The fact that I had worked with such a broad
group of other scientists was largely a product of my earlier work (MBH98/MBH99), which yielded,
among other things, useful benchmarks for later comparisons with the results of theoretical climate
model simulations. It was ridiculous for Wegman to argue that these subsequent collaborations could
somehow have influenced the assessment of earlier work that laid the groundwork for them.
Wegman used the social network analysis to support the bizarre claim that there is “too much