For random variables that can take on a continuum of values, the sum in the definition of entropy is replaced by an integral. For example, for realisations of the time sequence $z_0, z_1, \ldots, z_N$, we would use

$$ H = -\int f(z_0, z_1, \ldots, z_N)\,\log\bigl[a^{2N+2} f(z_0, z_1, \ldots, z_N)\bigr]\, dV \qquad (2.424) $$

as a measure of the entropy, where $a$ is a constant with the same units as a dimension of the sample space, $f(z_0, z_1, \ldots, z_N)$ is the joint probability density function, and $dV$ is an element of volume in sample space.
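To make the definition concrete, the following minimal sketch (an illustration added here, not part of the original text) estimates (2.424) by Monte Carlo for the simplest case $N = 0$ of a single circularly symmetric complex Gaussian sample, using arbitrary values for the variance $\sigma^2$ and the scale constant $a$, and compares the estimate with the standard closed-form value $\log(\pi e \sigma^2 / a^2)$.

```python
import numpy as np

# Monte Carlo check of the continuous-entropy definition (2.424) for N = 0:
# a single circular complex Gaussian sample z_0 with variance sigma2, for
# which f(z) = exp(-|z|^2/sigma2) / (pi*sigma2).  The constant a only shifts
# H by -2*log(a).  sigma2, a and the sample size are arbitrary choices.
rng = np.random.default_rng(0)
sigma2 = 2.5
a = 1.0
n = 200_000

z = np.sqrt(sigma2 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
f = np.exp(-np.abs(z) ** 2 / sigma2) / (np.pi * sigma2)

# H = -E[ log(a^{2N+2} f) ], estimated by averaging over the samples
H_mc = -np.mean(np.log(a ** 2 * f))

# Closed form for a circular complex Gaussian: H = log(pi * e * sigma2 / a^2)
H_exact = np.log(np.pi * np.e * sigma2 / a ** 2)
print(H_mc, H_exact)   # the two values should agree to roughly two decimals
```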
If the $N+1$ realisations of the time sequence are complex numbers with real and imaginary parts that jointly are normally distributed with zero mean, the joint probability density function is that of (2.368), given by Wooding (1956),

$$ f(z_0, z_1, \ldots, z_N) = \frac{1}{\pi^{N+1}\,|C|}\,\exp\Bigl(-\sum_{j,k}\bar{z}_j\, c^{-1}_{jk}\, z_k\Bigr), \qquad (2.425) $$
where $C$ is the Hermitian variance-covariance matrix, $|C|$ is its determinant and $c^{-1}_{jk}$ are the elements of its inverse. For a stationary process with zero mean, the elements of the variance-covariance matrix are given by

$$ c_{ij} = E\bigl[z_i \bar{z}_j\bigr] = \varphi(i-j), \qquad (2.426) $$
with $\varphi(i-j)$ being the autocorrelation at lag $i-j$. Then, $C = T_N$, where

$$ T_N = \begin{pmatrix} \varphi(0) & \cdots & \varphi(-N) \\ \vdots & \ddots & \vdots \\ \varphi(N) & \cdots & \varphi(0) \end{pmatrix} \qquad (2.427) $$

is the Toeplitz coefficient matrix of the prediction error equations (2.80).
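The structure of (2.425)–(2.427) is easy to check numerically. The sketch below is an added illustration, not from the text: it builds a Hermitian Toeplitz matrix $T_N$ from an assumed complex AR(1)-type autocorrelation $\varphi(k)$ (an arbitrary choice made here only so that the matrix is positive definite) and evaluates the Wooding density (2.425) for one realisation drawn with that covariance.

```python
import numpy as np
from scipy.linalg import toeplitz

# Illustration (not from the text): build the Hermitian Toeplitz matrix T_N of
# (2.427) from an assumed autocorrelation phi(k), then evaluate the Wooding
# density (2.425) for one realisation of the sequence z_0, ..., z_N.
N = 4
lags = np.arange(N + 1)
phi = 2.0 * 0.6 ** lags * np.exp(1j * 0.3 * lags)   # assumed phi(k), k >= 0
phi[0] = phi[0].real                                # phi(0) must be real

# c_ij = phi(i - j): column built from phi(k), row from phi(-k) = conj(phi(k))
C = toeplitz(phi, np.conj(phi))
assert np.allclose(C, C.conj().T)                   # Hermitian, as required

Cinv = np.linalg.inv(C)
detC = np.linalg.det(C).real

def wooding_pdf(z, Cinv, detC):
    """Joint density (2.425) of N+1 zero-mean complex Gaussian samples."""
    quad = np.real(np.conj(z) @ Cinv @ z)           # sum_jk conj(z_j) c^-1_jk z_k
    return np.exp(-quad) / (np.pi ** len(z) * detC)

# Draw one realisation with covariance C via its Cholesky factor
rng = np.random.default_rng(1)
L = np.linalg.cholesky(C)
w = (rng.standard_normal(N + 1) + 1j * rng.standard_normal(N + 1)) / np.sqrt(2)
z = L @ w
print(wooding_pdf(z, Cinv, detC))
```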
Substitution of the joint probability density function (2.425) for the Gaussian process $\mathbf{z}_N = (z_0, z_1, \ldots, z_N)$ into expression (2.424) for the entropy gives

$$ \begin{aligned} H &= -\int f(\mathbf{z}_N)\,\log\Bigl[\frac{a^{2N+2}}{\pi^{N+1}|C|}\exp\Bigl(-\sum_{j,k}\bar{z}_j\, c^{-1}_{jk}\, z_k\Bigr)\Bigr]\, dV \\ &= -\int f(\mathbf{z}_N)\,\log\Bigl[\frac{a^{2N+2}}{\pi^{N+1}|C|}\Bigr]\, dV + \int f(\mathbf{z}_N)\sum_{j,k}\bar{z}_j\, c^{-1}_{jk}\, z_k\, dV \\ &= \log\Bigl[\frac{\pi^{N+1}|C|}{a^{2N+2}}\Bigr] + \sum_{j,k} c^{-1}_{jk}\int f(\mathbf{z}_N)\,\bar{z}_j z_k\, dV, \end{aligned} \qquad (2.428) $$
since $\int f(\mathbf{z}_N)\, dV = 1$ by definition. Further,

$$ \int f(\mathbf{z}_N)\,\bar{z}_j z_k\, dV = E\bigl[z_k \bar{z}_j\bigr] = c_{kj}. \qquad (2.429) $$
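Relation (2.429) can also be checked by simulation. The short sketch below is again an added illustration (with an arbitrary real exponential autocorrelation, chosen only for the example): it draws many realisations of $\mathbf{z}_N$ with covariance $C$ and confirms that the sample average of $z_j\bar{z}_k$ reproduces $c_{jk}$.

```python
import numpy as np
from scipy.linalg import toeplitz

# Sketch (added check, not from the text): verify (2.429) by Monte Carlo,
# i.e. that the sample average of z_j * conj(z_k) reproduces c_jk = phi(j - k).
rng = np.random.default_rng(2)
N, n_draws = 3, 200_000
phi = 1.5 * 0.5 ** np.arange(N + 1)       # assumed real autocorrelation phi(k)
C = toeplitz(phi)                          # c_jk = phi(|j - k|), real symmetric here
L = np.linalg.cholesky(C)

# n_draws realisations of the zero-mean complex Gaussian vector z_N = (z_0, ..., z_N)
w = (rng.standard_normal((n_draws, N + 1))
     + 1j * rng.standard_normal((n_draws, N + 1))) / np.sqrt(2)
z = w @ L.T                                # each row has covariance E[z_j conj(z_k)] = c_jk

C_hat = z.T @ z.conj() / n_draws           # Monte Carlo estimate of E[z_j conj(z_k)]
print(np.max(np.abs(C_hat - C)))           # should be of order 1e-2 or smaller
```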
 