One major feature is that entropy applies to qualitative rather than quantitative values and, as such, depends exclusively on the probabilities of possible events [73]. When prior probabilities $p_i$ are transformed into posterior probabilities $q_i$, the information can be expressed as

$$I(q:p) = \sum_{i=1}^{n} q_i \log \frac{q_i}{p_i} \qquad (3.16)$$

This value equals zero when posterior and prior probabilities are the same (no information) and is positive otherwise.
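These two properties of (3.16) are easy to check numerically; the following is a minimal Python sketch, in which the function name and the example probabilities are illustrative assumptions, not part of the original text:

```python
import math

def expected_information(q, p):
    """Expected information I(q:p) of a message transforming prior
    probabilities p into posterior probabilities q, as in Eq. (3.16).
    Uses the natural logarithm; terms with q_i = 0 contribute zero."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Identical prior and posterior probabilities: no information.
print(expected_information([0.5, 0.5], [0.5, 0.5]))  # 0.0

# Any genuine revision of the probabilities yields a positive value.
print(expected_information([0.9, 0.1], [0.5, 0.5]))
```

The guard `if qi > 0` reflects the usual convention that $0 \log 0 = 0$.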
Due to the additivity property of the entropy formula, entropy statistics can be used effectively to solve problems of aggregation and disaggregation. The entropy decomposition theorem can be explained as follows. Let $E_i$ again stand for an event, and let there be $n$ events $E_1, \ldots, E_n$ with probabilities $p_1, \ldots, p_n$. Assume that all events can be aggregated into a smaller number of sets of events $S_1, \ldots, S_G$ in such a way that each event falls under exactly one set $S_g$, where $g = 1, \ldots, G$. The probability that an event falling under $S_g$ occurs is obtained by summation:

$$P_g = \sum_{i \in S_g} p_i \qquad (3.17)$$
The entropy at the level of sets of events can be expressed as

$$H_0 = \sum_{g=1}^{G} P_g \log \frac{1}{P_g} \qquad (3.18)$$
where $H_0$ is called the between-group entropy. The entropy decomposition theorem specifies the relationship between the between-group entropy $H_0$ at the level of sets and the entropy $H$ at the level of events as defined in (3.12). Therefore,

$$H = H_0 + \sum_{g=1}^{G} P_g H_g \qquad (3.19)$$
where

$$H_g = \sum_{i \in S_g} \frac{p_i}{P_g} \log \frac{1}{p_i / P_g}, \qquad g = 1, \ldots, G \qquad (3.20)$$
In the above equation, the probability $p_i / P_g$, $i \in S_g$, is the conditional probability of $E_i$, knowing that one of the events falling under $S_g$ is bound to occur. $H_g$ thus stands for the entropy within the set $S_g$, and the term $\sum_{g=1}^{G} P_g H_g$ in (3.19) is the average within-group entropy [26].
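The decomposition theorem can be verified numerically. The sketch below uses hypothetical probabilities and an arbitrary grouping of five events into two sets; everything except Eqs. (3.17)–(3.20) is an illustrative assumption:

```python
import math

def entropy(probs):
    """Shannon entropy H = sum_i p_i log(1/p_i), natural logarithm."""
    return sum(p * math.log(1.0 / p) for p in probs if p > 0)

# Hypothetical example: n = 5 events aggregated into G = 2 sets.
p = [0.1, 0.2, 0.3, 0.25, 0.15]
groups = [[0, 1], [2, 3, 4]]          # indices i falling under S_1 and S_2

P = [sum(p[i] for i in g) for g in groups]             # Eq. (3.17)
H0 = entropy(P)                                        # Eq. (3.18)
Hg = [entropy([p[i] / Pg for i in g])                  # Eq. (3.20)
      for g, Pg in zip(groups, P)]

# Eq. (3.19): total entropy = between-group + average within-group entropy.
H = entropy(p)
decomposed = H0 + sum(Pg * hg for Pg, hg in zip(P, Hg))
print(abs(H - decomposed) < 1e-12)  # True
```

The identity holds exactly because each $p_i \log(1/p_i)$ term splits into $p_i \log(1/P_g)$ and $p_i \log(P_g/p_i)$, which regroup into the between-group and within-group sums.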