When more detailed farm-level data are available, a more comprehensive evaluation is possible. An example is the evaluation of the nutrient cycling components of IFSM (Rotz et al., 2006). For an experimental farm in the Netherlands called De Marke, comprehensive data were available that tracked the flows, transformations, losses and whole farm balances of N and P. By simulating this farm and comparing predicted and actual values for each aspect of the nutrient cycles, a more complete evaluation of the farm model was made. With this support, the model was then used to study the effect of mitigation strategies on farm performance, environmental impact and economics (Rotz et al., 2006).
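
As a simple illustration of this kind of comparison, the sketch below contrasts simulated and measured values for a few nutrient flows. The flow names, values and the per cent difference statistic are purely illustrative assumptions; they are not the data or the evaluation methods used in the De Marke study.

    # Illustrative comparison of measured and simulated nutrient flows for a farm.
    # All names and numbers are hypothetical; they do not come from De Marke or IFSM.
    measured = {"N fixation": 120.0, "N leaching": 35.0, "NH3 volatilization": 48.0}   # kg N/ha
    simulated = {"N fixation": 112.0, "N leaching": 39.0, "NH3 volatilization": 45.0}  # kg N/ha

    for flow, obs in measured.items():
        pred = simulated[flow]
        pct_diff = 100.0 * (pred - obs) / obs  # per cent difference of simulated from measured
        print(f"{flow}: measured {obs:.0f}, simulated {pred:.0f} kg N/ha, difference {pct_diff:+.1f}%")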
For some aspects of the farm, actual data may not be available or quantifiable. In this situation, the model may be best evaluated by comparing it with other models and analyses. An example is the carbon footprint of milk production. This type of footprint cannot be measured. By comparing the carbon footprints determined through a number of LCAs with those predicted by IFSM using assumptions similar to those of the original analyses, the accuracy of this new component of the farm model was supported (Rotz et al., 2010).

Sensitivity analysis
Sensitivity analysis determines how farm-scale predictions are affected by changes in input parameters or in specific functions used within the model. This is done by varying important parameters by a set amount, such as 10%, and quantifying how this change affects important output results. The inputs and outputs studied are defined by the application of the model and vary with the problem being addressed. Sensitivity is often quantified as a sensitivity index: the ratio of the per cent change in a given output to the per cent change in the input. An index near or greater than 1 indicates that an output is very sensitive to the given change in the model parameter or function; an index near zero indicates very low sensitivity.
changes that affect that part of the system. Farm
production system evaluations with the IFSM
often include a sensitivity analysis. Examples are
the evaluation of high moisture hay preservation
(Rotz et al ., 1992), greenhouse gas emissions
(Chianese et al ., 2009), manure application strat-
egies (Rotz et al ., 2011c), organic dairy produc-
tion (Rotz et al ., 2007) and automatic milking
systems (Rotz et al ., 2003).
Model uncertainty
There are a number of definitions, categorizations and frameworks for incorporating components of uncertainty and variation inherent in complex models (e.g. Refsgaard, 2000; de Rocquigny, 2010). Through model verification and evaluation, the model is confirmed to represent the natural system adequately, and through sensitivity analysis, the impacts on the system caused by intended management changes are determined. Thus, the remaining uncertainty or variation in the modelled output is due to changes in factors that may be known but uncontrollable, such as the weather; beyond the scope of the model; at a much finer or coarser level of detail than appropriate for the model; or simply unknown. For example, spatial variation in soil properties and plant health, individual preferences within a herd, and temporal variations in weather all impact upon crop yield, animal production and nutrient losses to the environment.

With the increased use of models in policy making, uncertainty analysis is becoming more desirable and even necessary, given the uncertain nature of models and the systems they represent. Following the procedure of the IPCC (2006a), the uncertainty of the whole farm emission is the square root of the sum of the squares of the uncertainty of each individual component. The difficulty becomes that of defining the uncertainty of the individual parameters or functions used within the model.
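
A minimal sketch of this combination rule is given below, assuming hypothetical component emissions and uncertainty fractions rather than values from IFSM or the IPCC guidelines.

    # Combine component uncertainties into a whole-farm value by the root of the
    # sum of squares, in the spirit of the IPCC (2006a) approach for summed terms.
    # Component names, emissions and uncertainty fractions are hypothetical.
    import math

    components = {
        "enteric CH4": (50000.0, 0.15),   # (emission in kg CO2e, relative uncertainty)
        "manure CH4":  (20000.0, 0.25),
        "soil N2O":    (15000.0, 0.40),
    }

    total = sum(e for e, _ in components.values())
    # Absolute uncertainty of each component, combined in quadrature
    combined = math.sqrt(sum((e * u) ** 2 for e, u in components.values()))
    print(f"Whole-farm emission: {total:.0f} +/- {combined:.0f} kg CO2e "
          f"({100 * combined / total:.0f}%)")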