example, a 2004 paper from a team led by Colin O'Dowd in Ireland showed that
during plankton blooms organic matter contributed 63% of the sub-micrometre
aerosol mass over the North Atlantic (aerosols being a climate-forcing agent),
whereas in winter, when biological activity is at its lowest, the organic
fraction fell to just 15%. Further, their simulations suggest that
organic matter can enhance cloud-droplet concentration by between 15% and more
than 100% and so is an important component of the aerosol-cloud climate-feedback
system involving marine biota. Such cross-disciplinary research is common in climate
change science.
It must not be forgotten (if the earlier non-biotic example of isentropes was not
enough) that computer modellers also need to continually reappraise the physical
processes they are modelling, especially when new evidence and ideas come along.
For example, at the end of 2004 a team of US geologists (Ufnar et al., 2004) drew
attention to field evidence that mid- and high-latitude rainfall in the mid-Cretaceous
was far higher than today. They suggested that this might have warmed high latitudes,
as ocean evaporation causes cooling and precipitation results in latent heat release,
hence regional warming. In the warmer Cretaceous there would have been far more
evaporation from the tropical oceans and this could provide another mechanism for
transporting heat to high latitudes (in addition to atmospheric and ocean circulation
mechanisms).
Having said all this, while computer models still have a long way to go before
detailed regional forecasts can be made with sufficient confidence for local policy-
makers and planners, a rough-and-ready near-future or even past (historic) global
approximation is another matter. Modelling the past is particularly revealing (and
important in the climate change debate) because running a model over the historic
period, for which real-world meteorological measurements exist, means that
the model can be tested against reality. If this reality test works then it is possible
to run the model again but without the factors of anthropogenic climate change. As
we have already noted (see the end of section 5.2.2) this has been done. By 2005
the Hadley team in the UK had a model that successfully captured the vagaries of
the climate from the middle of the 19th century to the early 21st century. Its output
seemingly tracks with reasonable accuracy actual meteorological measurements of
global temperature. Both the Hadley model and real measurements show that the
planet from the middle of the 19th century through to the early 20th century had
a varying climate but the overall trend was more or less steady (an output not too
dissimilar from the relevant years portrayed in Figure 5.2). But after the early 20th
century both the Hadley computer model and real meteorological measurements
show a rising temperature. Clearly, the model works broadly. However, take out the
anthropogenic greenhouse component from the model and its output continues the
19th-century stable trend (albeit fluctuating a bit year to year) through to the present.
As noted, this is good corroborating evidence that human factors are behind the
warming of the climate in the 20th and 21st centuries.
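The logic of this attribution test can be sketched in a few lines. The numbers below are purely hypothetical stand-ins for decadal global temperature anomalies, not real observations or Hadley model output; the point is only the comparison: the run that includes anthropogenic forcing should track the observed record far more closely than the run without it.

```python
# Illustrative sketch only: synthetic decadal temperature anomalies (deg C).
# None of these values are real data; they merely mimic the qualitative
# pattern described in the text (flat 19th century, 20th-century rise).
obs                = [0.0, -0.1, 0.0, 0.1, 0.0, 0.2, 0.4, 0.6]   # "observed"
model_all_forcings = [0.0,  0.0, -0.1, 0.1, 0.1, 0.2, 0.5, 0.6]  # natural + anthropogenic
model_natural_only = [0.0, -0.1,  0.1, 0.0, -0.1, 0.0, 0.1, 0.0] # anthropogenic removed

def rmse(a, b):
    """Root-mean-square error between two equal-length series."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# Attribution test: the all-forcings run fits the observations much better
# than the natural-only run, which stays on the flat 19th-century trend.
print(rmse(obs, model_all_forcings))  # small misfit
print(rmse(obs, model_natural_only))  # large misfit
```

Real attribution studies apply the same comparison to full three-dimensional model output against the instrumental record, but the underlying test is this simple: remove the human component and see whether the observed warming disappears from the simulation.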
In short, while computer models are a work in progress (as is all science), they
still have considerable value and their outputs can be checked to a certain degree by
looking at the current situation as well as what happened in the past. If they get both
of these right then we can have some confidence in their modelling of the future. Indeed,