Analogous to entropy in thermodynamics, the entropy of the information reflects the
relative frequencies of possible decision consequences (states) that could be realized if the
decision was implemented a large number of times. The entropy of information is maxi-
mized when the probabilities are equal for all possible consequences and it is minimized
when only one consequence is possible. For the example in Figure 13.10, the entropy of the
prior probability distribution (Figure 13.10a),

−[(0.95 × ln 0.95) + (0.05 × ln 0.05)] = 0.20,

is smaller than the entropy of the updated prior probability distribution (Figure 13.10c),

−[(0.76 × ln 0.76) + (0.24 × ln 0.24)] = 0.55.

This simple example therefore demonstrates
that uncertainty can increase as additional information is obtained. In other words, some-
times we learn we know less than we thought we did.
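As a quick check of these numbers, the sketch below (plain Python, using the two-outcome probabilities quoted above for Figures 13.10a and 13.10c) computes the entropy of a discrete probability distribution with natural logarithms; the uniform case anticipates the maximum-entropy prior discussed in the next paragraph.

import math

def entropy(probabilities):
    # Shannon entropy using natural logarithms, as in the calculations above.
    return -sum(p * math.log(p) for p in probabilities if p > 0)

prior = [0.95, 0.05]    # prior distribution (Figure 13.10a)
updated = [0.76, 0.24]  # updated distribution (Figure 13.10c)
uniform = [0.50, 0.50]  # equal probabilities for the two consequences

print(round(entropy(prior), 2))    # 0.20
print(round(entropy(updated), 2))  # 0.55, larger than the prior: uncertainty has increased
print(round(entropy(uniform), 2))  # 0.69 = ln(2), the maximum for two possible consequences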
The importance of the prior probabilities in Bayes' Theorem has motivated the develop-
ment of several approaches for establishing prior probabilities. * The most commonly applied
approach in practice is to rely on subjective information or experience to establish a reason-
able starting point (e.g., Luce and Raiffa 1957). The challenge with this approach is that it
is difficult to account for possibilities beyond our range of experience (e.g., “Black Swan”
events as coined by Taleb 2007). Another approach is to maximize the entropy of the prior
probability distribution in Bayes' Theorem (e.g., Jaynes 1968; Tribus 1969). For example,
the prior probability distribution with maximum entropy for the example in Figure 13.10
would be a uniform distribution with equal probabilities of 0.5 for the two possible conse-
quences. The challenge with this approach is that it is not necessarily rational or consistent.
Maximizing the entropy of the probability distribution for variables in a decision is not
rational because different variables affect a decision in different ways. Journel and Deutsch
(1993) demonstrate this lack of rationality with an example of spatial variability in the
permeability of an oil reservoir. They show that large entropy in the permeability field (i.e.,
little structure or spatial correlation between high and low permeability values) produces
small entropy in the probability distribution for well production, while small entropy in the
permeability field (e.g., high-permeability channels and low-permeability barriers) produces
large entropy in the probability distribution for well production. This approach is also not
consistent because the definition of an input variable is ambiguous. For example, a uniform
probability distribution for permeability does not give a uniform probability distribution for
the logarithm of permeability.
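This inconsistency is easy to verify numerically. The sketch below assumes an arbitrary permeability range of 1 to 1,000 (the chapter does not specify one); it samples permeability from a uniform distribution and shows that the corresponding log-permeability values are far from uniformly distributed.

import math
import random

random.seed(0)

# Assumed illustrative range only; the chapter does not give specific values.
k_min, k_max = 1.0, 1000.0

# Sample permeability uniformly, then take its base-10 logarithm.
samples = [random.uniform(k_min, k_max) for _ in range(100_000)]
log_samples = [math.log10(k) for k in samples]

# Count how many log-permeability values fall in each of three equal-width bins.
# A uniform distribution would put roughly one third of the samples in each bin.
bins = [0, 0, 0]
lo, hi = math.log10(k_min), math.log10(k_max)
width = (hi - lo) / 3
for x in log_samples:
    bins[min(int((x - lo) / width), 2)] += 1

print([round(b / len(log_samples), 2) for b in bins])  # roughly [0.01, 0.09, 0.90], clearly not uniform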
Practical guidance in establishing prior probabilities is to consider how sensitive the preferred
decision alternative and the value of information are to these prior probabilities.
Figures 13.8 and 13.9 demonstrate an example of implementing this guidance. A proposed
approach to formally implement this guidance is described later in this chapter in the case-
history application of test wells for an energy resource.
13.3.2 Likelihood functions
The likelihood function, P(Information | Decision Consequence) in Equation 13.7, is important
because it filters the prior probabilities for decision consequences, potentially amplifying
the probabilities for some possibilities and reducing the probabilities for other possibilities.
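To make this filtering concrete, the following sketch applies Bayes' Theorem to the two-consequence example from the previous subsection. The likelihood values of 0.10 and 0.60 are assumptions chosen only because they reproduce the updated probabilities of 0.76 and 0.24 quoted earlier; they are not taken from Figure 13.10.

# Prior probabilities for the two possible decision consequences (Figure 13.10a).
prior = [0.95, 0.05]

# Assumed likelihoods P(Information | Consequence), for illustration only;
# chosen so that the update reproduces the 0.76 / 0.24 values in the text.
likelihood = [0.10, 0.60]

# Bayes' Theorem: the updated probability is proportional to likelihood times prior.
unnormalized = [l * p for l, p in zip(likelihood, prior)]
total = sum(unnormalized)
posterior = [u / total for u in unnormalized]

print(posterior)  # [0.76, 0.24]: the likelihood damps one consequence and amplifies the other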
The no information and perfect information bounds are controlled by the shape of the like-
lihood function. A likelihood function that has the same probabilities for the information
* To be fair, one perspective is to not use Bayes' Theorem in practice. Classical statisticians such as Fisher (1935)
focus entirely on the likelihood function because of the difficulty they perceive in defensibly establishing prior
probabilities. Keynes (1921) also supports this perspective. While avoiding difficulties with prior probabilities,
this approach is of little practical value because it does not allow for the use of probabilities in making decisions
with uncertain outcomes.