the amount of gain in information [30] in reducing uncertainties. Entropy in information theory can be defined as a measure of the degree of uncertainty of random processes. The history of the entropy concept goes back to Boltzmann [7]. Later, Shannon [64] gave a probabilistic interpretation in information theory, elaborating the concept.
The entropy function clearly expresses the expected information content or uncertainty of a probability distribution, which can be described as follows. Let E_i stand for an event and p_i for the probability of event E_i occurring. Let there be n events E_1, …, E_n with probabilities p_1, …, p_n (the sum of these probabilities being one). Since the occurrence of events with smaller probabilities yields more information, a measure of information h is included which is a decreasing function of p_i. A logarithmic function can be used to express the information h(p_i) [64]:
h(p_i) = \log\left(\frac{1}{p_i}\right) \qquad (3.11)
which decreases from infinity to 0 as p_i ranges from 0 to 1. The function reflects the idea that the lower the probability of an event to occur, the higher the amount of information in a message stating that the event occurred. In the case of n information values h(p_i), the expected information (entropy) content of a probability distribution can be derived by weighting the information values h(p_i) by their respective probabilities:
H = \sum_{i=1}^{n} p_i \log\left(\frac{1}{p_i}\right) \qquad (3.12)
where H stands for entropy.
So,

p_i \log\left(\frac{1}{p_i}\right) = 0 \quad \text{for } p_i = 0 \qquad (3.13)
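As an illustration of Eqs. (3.11)–(3.13), the short Python sketch below (not part of the original text) computes the information value h(p_i) and the entropy H of a discrete probability distribution. The function names, the base-2 logarithm, and the example distribution are assumptions made for demonstration only; terms with p_i = 0 are simply skipped, following Eq. (3.13).

```python
import math

def information(p, base=2.0):
    """Information value h(p) = log(1/p), as in Eq. (3.11)."""
    return math.log(1.0 / p, base)

def entropy(probabilities, base=2.0):
    """Expected information H = sum_i p_i * log(1/p_i), as in Eq. (3.12).

    Terms with p_i = 0 contribute nothing, following Eq. (3.13).
    """
    return sum(p * math.log(1.0 / p, base) for p in probabilities if p > 0.0)

# Rarer events carry more information; the entropy weights each
# information value by its probability of occurrence.
probs = [0.5, 0.25, 0.125, 0.125]
print(information(0.125))   # 3.0  (the rarest events are most informative)
print(entropy(probs))       # 1.75 (expected information of the distribution)
```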
The entropy value H is non-negative and the minimum possible entropy value is
zero:
H_{\min} = 1 \cdot \log\left(\frac{1}{1}\right) = 0 \qquad (3.14)
The entropy value will be a maximum if all states are equally probable (i.e., p_i = 1/n):
H_{\max} = \sum_{i=1}^{n} \frac{1}{n} \log(n) = n \cdot \frac{1}{n} \log(n) = \log(n) \qquad (3.15)
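To make these bounds concrete, the following check (again an illustrative sketch with an assumed base-2 logarithm, not taken from the source) evaluates the entropy of a single certain event and of a uniform distribution over n = 8 events, reproducing H_min = 0 from Eq. (3.14) and H_max = log(n) from Eq. (3.15).

```python
import math

n = 8
degenerate = [1.0]             # a single certain event (p = 1)
uniform = [1.0 / n] * n        # all n events equally probable

# Entropy with the convention that p = 0 terms contribute nothing (Eq. 3.13).
H = lambda ps: sum(p * math.log2(1.0 / p) for p in ps if p > 0.0)

print(H(degenerate))              # 0.0, the minimum of Eq. (3.14)
print(H(uniform), math.log2(n))   # both 3.0: H_max = log(n), Eq. (3.15)
```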