change? To address such questions, both new and existing techniques have been
used. In this chapter, we describe these techniques before considering answers to
the questions.
Downscaling
An essential first step in predicting the effects of future climates on aquatic
ecosystems is to forecast what these climates are likely to be. On a global scale,
future climate change is modelled using GCMs, which are mechanistic models of
the climate system built on physical principles (IPCC 2007). Assumptions about
greenhouse gas emissions, population growth and economic development must also
be made, and within Euro-limpacs a standardized set of assumptions is used,
based on the Special Report on Emissions Scenarios (SRES) of the IPCC (Nakićenović
et al. 2000). These scenarios are explained in Chapter 3. GCMs are currently too
coarse in resolution (~270 km × 270 km) for catchment-scale modelling, though
finer-scale models are close to release. Methods are therefore required to
'downscale' the outputs from the GCMs to the appropriate scale for modelling effects.
This is more problematic than might be imagined. There are two main approaches,
variously called dynamic or model-based and statistical or empirical (Fowler et al.
2007). Dynamic downscaling uses regional climate models (RCMs) nested within
the GCMs, which supply the input data and boundary conditions.
RCMs can simulate processes important at catchment scales and provide
outputs at resolutions down to about 5 km. They are computationally expensive,
however, and a more common approach is to use statistical downscaling methods.
These rely on observed quantitative relationships between the small-scale (local)
climates and the large-scale climates. These relationships are then used to generate
the small-scale, high-resolution climate from the GCM output, one major
assumption being that the empirical relationships will remain the same in all
projected climates, including those affected by enhanced greenhouse warming.
Tisseuil et al. (2009) discuss further problems and refinements of statistical
downscaling methods.
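The core of the statistical approach described above can be sketched as a regression fitted between large-scale (grid-cell) and local climate over an observed calibration period, then applied to future GCM output. The sketch below is purely illustrative: the numbers are hypothetical, and a single-predictor linear fit is far simpler than the multiple-regression methods (such as SDSM) actually used.

```python
# Illustrative statistical downscaling via simple linear regression.
# All values are hypothetical; real methods use many large-scale
# predictors and more elaborate statistical models.

def fit_linear(x, y):
    """Ordinary least-squares fit y = a + b*x over a calibration period."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Calibration: large-scale GCM grid-cell temperatures vs. observed
# site temperatures (degC) over the same historical period.
gcm_cal = [10.0, 12.0, 14.0, 16.0]
obs_cal = [8.5, 10.1, 11.9, 13.6]

a, b = fit_linear(gcm_cal, obs_cal)

# Projection: apply the same relationship to future GCM output.
# This embodies the stationarity assumption noted in the text --
# that the empirical relationship holds under a changed climate.
gcm_future = [15.0, 17.5]
site_future = [a + b * t for t in gcm_future]
```

The stationarity assumption enters at the final step: the coefficients fitted on 20th-century data are reused unchanged for the projection period.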
In Euro-limpacs, we standardized downscaling methods. Dynamically
downscaled data across Europe were available from the EU-funded PRUDENCE
(Prediction of Regional scenarios and Uncertainties for Defining EuropeaN
Climate change risks and Effects, 2001-04) website (http://prudence.dmi.dk), for
the periods 1961-90 and 2071-2100. The data were generated by nesting an
RCM within two GCMs, but the output cell size (0.5° × 0.5°) was still too coarse
for most catchment applications and required further downscaling using the
Statistical DownScaling Model (SDSM; Wilby et al. 2002) with refinements
based on 'local methods', as described by Wade et al. (2008). For instance, GCM
and RCM temperature predictions are for the average altitude of a grid cell. To
correct this to the altitude of a catchment, a correction based on the lapse rate
(the rate of change of temperature with altitude) was proposed, preferably using a
site-specific lapse rate or, failing that, a 'standard' lapse rate of −0.6 °C per 100 m.
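The lapse-rate correction itself is simple arithmetic; a minimal sketch, with hypothetical altitudes and using the 'standard' rate quoted above, might look like:

```python
# Sketch of the lapse-rate correction: adjust a grid-cell mean
# temperature to a catchment's altitude. Altitudes below are
# hypothetical; a site-specific lapse rate is preferred where known.

STANDARD_LAPSE_RATE = -0.6  # degC per 100 m (temperature falls with altitude)

def lapse_correct(t_cell, z_cell, z_site, lapse=STANDARD_LAPSE_RATE):
    """Correct the grid-cell temperature t_cell (degC), valid at the
    cell's mean altitude z_cell (m), to the site altitude z_site (m)."""
    return t_cell + lapse * (z_site - z_cell) / 100.0

# A catchment 400 m above the grid-cell mean altitude:
t_site = lapse_correct(t_cell=12.0, z_cell=600.0, z_site=1000.0)
# 12.0 + (-0.6) * 400/100 = 9.6 degC
```

A site below the cell's mean altitude is corrected upward by the same formula, since the altitude difference changes sign.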
In some instances, it was appropriate to use a GCM cell different from that in
which the site lies to build a relationship between GCM or RCM output and
local conditions. For example, if the site is in a mountainous region, then a