More than a century later, the French mathematician
and physicist Laplace deeply influenced philosophy of
science with his thoughts about determinism, as detailed
(somewhat ironically) in his treatise on probability theory
(Laplace, 1820):
We ought to regard the present state of the universe as the effect of its antecedent state and as the cause of the state that is to follow. An intelligence knowing all the forces acting in nature at a given instant, as well as the momentary positions of all things in the universe, would be able to comprehend in one single formula the motions of the largest bodies as well as the lightest atoms in the world, provided that its intellect were sufficiently powerful to subject all data to analysis; to it nothing would be uncertain, the future as well as the past would be present to its eyes.
Reinterpreting the idea by Laplace, stochastic methods
can hence be seen as a complement to determinis-
tic modelling in the case where 'some' parameters are
unknown - epistemic uncertainty as opposed to aleatory
or natural uncertainty (Agarwal, 2008). Following the
development of statistical mechanics by Boltzmann at the
end of the nineteenth century, the rise of Planck and
Bohr's quantum physics (Bohr, 1961) has given a new
legitimacy to randomness in the natural sciences during
the twentieth century, illustrated in the first place by
Heisenberg's famous uncertainty principle (Reichenbach,
1944). Beyond epistemic uncertainty, it becomes sensible
to assume that there exists an irreducible randomness
in the behaviour of matter. To that, Einstein replied that 'God does not play dice with the universe' (Broglie, 1953); to him, there was no room for such irreducible uncertainty. We prefer
not to enter into this debate here and do not distinguish
what is unpredictable from what is unknown but could
be predicted with more information. Coming back to
the groundwater-flow example, it is now clear that even
with the finest mathematical and physical description
of the aquifer and the best computing facilities, mod-
ellers cannot predict the groundwater flow exactly unless perfect knowledge of the aquifer and its physical
parameters is available (which is, indeed, never the case
in practice). Some field (or laboratory) measurements
of the governing parameters are usually available and
some expert knowledge is always inherited from prior
studies. Thus, modellers can still use equation solvers,
even though some parameters are unfortunately not known with accuracy. These parameters have to be guessed or estimated. Plugging these estimated parameters in yields a unique solution. Yet, this solution may display a dramatic departure from reality if the parameter estimates are not accurate.

Probability theory helps to alleviate epistemic uncertainty. Instead of a single (approximate) solution, the probabilistic approach provides a manifold of equally probable solutions reflecting the many possible values (or distributions) of the unknown parameters. Of course, all but at most one of the drawn values or distributions are wrong, as is also, almost surely, the aforementioned deterministic estimate. Yet, the manifold of plausible solutions is not aimed at perfectly reflecting reality. Instead, it is the diversity of solutions that constitutes a richer composite information: a probability law over the set of plausible solutions. Statistical moments can then be extracted from that probability law, such as the mean value (the expectation), the most frequent value (the mode) or the quantification of the uncertainty associated with this expectation (the variance); in a general sense, a full probability distribution can be obtained. Hence, a probabilistic model aims at capturing both the average response of the system and the variability due to uncertainties of any kind. Producing a reliable picture of this law requires some information and a suitable probabilistic representation of the underlying unknown parameters. These are respectively problems of stochastic modelling and of statistical inference, both illustrated by the short sketch further below:
Stochastic modelling assumes that the unknown param-
eters have been generated by some random mechanism,
and strives to mathematically characterize or partially
describe this mechanism - see de Marsily (1994). The
latter can be achieved for instance by assuming some
parametric multivariate statistical distribution for the
set of unknown parameters (in a broad sense, of all input
variables including, for example, boundary and initial
conditions defining the mathematical model).
Statistical inference aims at estimating the parameters
of a stochastic model on the basis of observed data.
This phase, which is deeply interwoven with the cho-
sen stochastic model and the available measurements,
has inspired numerous research works of reference
(Fisher, 1990) and is still a controversial issue nowadays
(Berger, 1985) (see also Chapter 7).
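As a minimal, hypothetical sketch of these two steps (none of it taken from this chapter): assuming a lognormal law for the hydraulic conductivity K of the groundwater example, the maximum-likelihood estimates of its parameters are simply the mean and standard deviation of log K over the measurements. Propagating samples of the fitted law through the deterministic model, here a one-dimensional Darcy flux q = K i under a fixed hydraulic gradient i, yields the probability law of the output, from which the moments discussed above can be extracted. The measurement values and the gradient are purely illustrative.

# Minimal sketch: inference of a lognormal model for K, then Monte Carlo
# propagation through a toy Darcy relation q = K * i. All numbers are
# illustrative assumptions, not data from this chapter.
import numpy as np

rng = np.random.default_rng(42)

# -- Statistical inference: fit the assumed lognormal law to measurements --
measurements = np.array([3.1e-5, 1.2e-5, 7.8e-5,
                         2.4e-5, 5.0e-5, 1.9e-5])   # K in m/s (hypothetical)
log_k = np.log(measurements)
mu_hat = log_k.mean()       # MLE of the lognormal location parameter
sigma_hat = log_k.std()     # MLE of the lognormal scale parameter

# -- Stochastic modelling: draw a manifold of plausible solutions ----------
i_gradient = 0.01                                   # assumed hydraulic gradient
k_samples = rng.lognormal(mu_hat, sigma_hat, size=100_000)
q_samples = k_samples * i_gradient                  # Darcy flux for each draw

# -- Extract moments of the resulting probability law ----------------------
print(f"expectation (mean) : {q_samples.mean():.3e} m/s")
print(f"variance           : {q_samples.var():.3e}")
print(f"mode (analytical)  : {i_gradient * np.exp(mu_hat - sigma_hat**2):.3e} m/s")
print(f"90% interval       : {np.quantile(q_samples, [0.05, 0.95])} m/s")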
In Earth Sciences, Matheron (1989) pointed out the
difficulty of building a suitable model and making appro-
priate statistical inferences based only on some observa-
tions taken from a unique realization of the phenomenon
(assumed to be generated at random). Indeed, how could
one come back to Bernoulli's law of 'heads or tails'