$$H(Y|X) = \sum_{j=1}^{n} p_{\cdot j}\, H(Y|X_j) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p_{ij} \log \frac{p_{ij}}{p_{\cdot j}} \qquad (3.28)$$

$$H(X|Y) = \sum_{i=1}^{m} p_{i\cdot}\, H(X|Y_i) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p_{ij} \log \frac{p_{ij}}{p_{i\cdot}} \qquad (3.29)$$
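As a minimal numerical sketch of (3.28) and (3.29), assuming a small, invented joint probability table p_ij and natural logarithms, the conditional entropies could be computed in Python as follows (array names and values are illustrative only):

```python
import numpy as np

# Invented joint probability table p_ij (i = 1..m rows, j = 1..n columns); entries sum to 1
p = np.array([[0.20, 0.10, 0.05],
              [0.05, 0.30, 0.30]])

p_i = p.sum(axis=1)   # marginal p_i. (sum over j)
p_j = p.sum(axis=0)   # marginal p_.j (sum over i)

# Eq. (3.28): H(Y|X) = -sum_i sum_j p_ij ln(p_ij / p_.j)
H_Y_given_X = -np.sum(p * np.log(p / p_j))

# Eq. (3.29): H(X|Y) = -sum_i sum_j p_ij ln(p_ij / p_i.)
H_X_given_Y = -np.sum(p * np.log(p / p_i[:, None]))

print(f"H(Y|X) = {H_Y_given_X:.4f} nats, H(X|Y) = {H_X_given_Y:.4f} nats")
```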
The above equations show that the average conditional entropy never exceeds the unconditional entropy, i.e., H(X|Y) ≤ H(X) and H(Y|X) ≤ H(Y) [26, 73]. In the same way, the mutual information of two events can be expressed as
$$J(X, Y) = \sum_{i=1}^{m}\sum_{j=1}^{n} p_{ij} \log \frac{p_{ij}}{p_{i\cdot}\, p_{\cdot j}} \qquad (3.30)$$
One can find different notations, such as M(X, Y) or T(X, Y), in the literature for (3.30). From the above equations, one can deduce that J(X, Y) ≥ 0 and J(X, Y) = H(X) + H(Y) - H(X, Y). After further analysis, one can say that the multidimensional entropy equals the sum of the marginal entropies minus the mutual information:
$$H(X, Y) = H(X) + H(Y) - J(X, Y) \qquad (3.31)$$
Some researchers interpret this to mean that, when the mutual information is zero, the variables are independent and their marginal entropies add up to the total entropy.
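As a complementary sketch, again with an invented 2 × 2 joint probability table and natural logarithms, the snippet below evaluates the mutual information of (3.30) and verifies the decomposition (3.31) numerically:

```python
import numpy as np

# Invented 2x2 joint probability table p_ij (entries sum to 1)
p = np.array([[0.30, 0.20],
              [0.10, 0.40]])

p_i = p.sum(axis=1)   # p_i.
p_j = p.sum(axis=0)   # p_.j

# Marginal entropies H(X) and H(Y); (3.31) is symmetric, so the row/column assignment does not matter
H_X = -np.sum(p_j * np.log(p_j))
H_Y = -np.sum(p_i * np.log(p_i))
H_XY = -np.sum(p * np.log(p))   # joint entropy H(X, Y)

# Eq. (3.30): J(X, Y) = sum_i sum_j p_ij ln(p_ij / (p_i. p_.j))
J = np.sum(p * np.log(p / np.outer(p_i, p_j)))

print(f"J(X, Y) = {J:.4f} >= 0")
print(f"H(X, Y) = {H_XY:.4f} vs H(X) + H(Y) - J(X, Y) = {H_X + H_Y - J:.4f}")
```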
The above definitions can be extended to the multivariate case with M variables [31]. The total entropy of independent variables X_m (m = 1, ..., M) is

$$H(X_1, X_2, \ldots, X_M) = \sum_{m=1}^{M} H(X_m) \qquad (3.32)$$
If the variables are dependent, their joint entropy can be expressed as
$$H(X_1, X_2, \ldots, X_M) = H(X_1) + \sum_{m=2}^{M} H(X_m \mid X_1, X_2, \ldots, X_{m-1}) \qquad (3.33)$$
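A quick numerical check of (3.32) and (3.33) is sketched below for the two-variable case (M = 2); the joint table, variable names, and helper function are hypothetical and only serve to illustrate the chain rule and the additivity under independence:

```python
import numpy as np

def entropy(q):
    """Shannon entropy in nats of a probability array (zero cells are ignored)."""
    q = q[q > 0]
    return -np.sum(q * np.log(q))

# Hypothetical joint table for two dependent discrete variables X1 (rows) and X2 (columns)
p12 = np.array([[0.25, 0.15],
                [0.05, 0.55]])
p1 = p12.sum(axis=1)   # marginal of X1
p2 = p12.sum(axis=0)   # marginal of X2

# Chain rule, Eq. (3.33) with M = 2: H(X1, X2) = H(X1) + H(X2 | X1)
H_X2_given_X1 = -np.sum(p12 * np.log(p12 / p1[:, None]))
print(np.isclose(entropy(p12), entropy(p1) + H_X2_given_X1))   # -> True

# Independence, Eq. (3.32): when p12 is the outer product of p1 and p2,
# the joint entropy equals the sum of the marginal entropies
p_indep = np.outer(p1, p2)
print(np.isclose(entropy(p_indep), entropy(p1) + entropy(p2)))  # -> True
```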
Finally, when the multivariate normal distribution is assumed for f(x_1, x_2, ..., x_M), the joint entropy of X, with X being the vector of M variables, can be expressed as

$$H(X) = \frac{M}{2}\ln(2\pi) + \frac{1}{2}\ln|C| + \frac{M}{2} - M \ln(\Delta x) \qquad (3.34)$$

where |C| = determinant of the covariance matrix C and Δx = class interval size.
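As a minimal sketch of (3.34), assuming an invented 3 × 3 covariance matrix and a class interval of 0.5, the joint entropy of a multivariate normal vector could be evaluated as follows:

```python
import numpy as np

# Invented covariance matrix C for M = 3 variables (symmetric, positive definite)
C = np.array([[1.0, 0.6, 0.2],
              [0.6, 2.0, 0.4],
              [0.2, 0.4, 1.5]])
M = C.shape[0]
dx = 0.5   # assumed class interval size (delta x)

# Eq. (3.34): H(X) = (M/2) ln(2*pi) + (1/2) ln|C| + M/2 - M ln(delta x)
H = 0.5 * M * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(C)) + 0.5 * M - M * np.log(dx)
print(f"H(X) = {H:.4f} nats")
```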