2.6 Maximum entropy spectral analysis
In traditional power spectral density estimation, the signal is assumed to vanish
outside the finite record for which it is available for sampling. This limits the fre-
quency resolution to the order of the reciprocal of the record length. The discrete
Fourier transform representation of the finite record (2.147), as shown previously, is
periodic in the record length, giving a periodic extension of the time sequence out-
side the finite record. This introduces spurious periodicities into the spectrum, which can only be
suppressed by windowing, at the cost of a further reduction in frequency resolution.
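As a concrete illustration of this resolution limit, the short NumPy sketch below places two sinusoids closer together than the reciprocal of the record length; the record length, sampling interval, and line frequencies are assumptions chosen for the example, not values from the text. Zero padding the transform interpolates the spectrum but cannot separate the two lines.

```python
import numpy as np

dt = 1.0                          # sampling interval (assumed)
N = 64                            # record length: resolution ~ 1/(N*dt) ~ 0.016
t = np.arange(N) * dt
x = np.cos(2 * np.pi * 0.200 * t) + np.cos(2 * np.pi * 0.208 * t)

# Conventional estimate on a finely zero-padded frequency grid
X = np.fft.rfft(x, n=16 * N)
f = np.fft.rfftfreq(16 * N, dt)
power = np.abs(X) ** 2 * dt / N

# The two lines, 0.008 apart in frequency, lie within ~1/(N*dt) of each
# other and appear as a single broadened peak near f ~ 0.204 rather than
# as two resolved peaks.
```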
A way out of this dilemma was pointed out by John Parker Burg, who suggested
that, using entropy as a measure of information, the entropy within the measured
finite record be maximised, with no entropy added from any assumption about the
information outside the known record (Burg, 1967, 1968). In this case, the sig-
nal is neither assumed to vanish outside the measured record nor to be a periodic
extension of the signal within the measured record. This has led to the Burg max-
imum entropy algorithm for data adaptive spectral analysis of short records, prom-
ising improved resolution compared to conventional methods of spectral analysis
(Lacoss, 1971; Smylie et al., 1973).
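Burg's recursion itself is developed in what follows; as a preview, here is a minimal Python sketch of a Burg-type autoregressive fit and the maximum entropy power spectral density it implies. The function names, model order, and test signal are illustrative assumptions, and the sketch is not a substitute for the algorithm as derived in the text.

```python
import numpy as np

def burg_ar(x, order):
    """Fit an AR(order) model to x by a Burg-type recursion.

    Returns (a, E): AR polynomial coefficients a[0..order] with a[0] = 1,
    and the final prediction-error power E.
    """
    x = np.asarray(x, dtype=float)
    N = x.size
    f = x.copy()                      # forward prediction errors
    b = x.copy()                      # backward prediction errors
    a = np.array([1.0])               # AR polynomial, a_0 = 1
    E = np.dot(x, x) / N              # zero-order prediction-error power
    for m in range(1, order + 1):
        fm = f[m:].copy()             # f_{m-1}[n]   for n = m .. N-1
        bm = b[m - 1:-1].copy()       # b_{m-1}[n-1] for n = m .. N-1
        # Reflection coefficient minimising the summed forward and
        # backward prediction-error power over the measured record only
        k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
        f[m:] = fm + k * bm           # update forward errors
        b[m:] = bm + k * fm           # update backward errors
        # Levinson-type update of the AR polynomial and error power
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        E *= 1.0 - k * k
    return a, E

def mem_psd(a, E, freqs, dt):
    """Maximum entropy PSD: P(f) = E*dt / |sum_k a_k exp(-2*pi*i*f*k*dt)|^2."""
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(a.size)) * dt)
    return E * dt / np.abs(z @ a) ** 2

# Illustrative use on a short, noisy sinusoid sampled at dt = 1
dt = 1.0
n = np.arange(64)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 0.12 * n * dt) + 0.3 * rng.standard_normal(n.size)
a, E = burg_ar(x, order=8)
freqs = np.linspace(0.0, 0.5, 512)
psd = mem_psd(a, E, freqs, dt)        # spectrum peaks near f = 0.12
```

Note that the reflection coefficient at each stage is chosen from the measured samples alone, which is how the method avoids any assumption about the signal outside the known record.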
2.6.1 Information and entropy
A measure of information is the length of the message required to convey it. For
two events, a_1 and a_2, having equal probability of occurrence, the outcome of a
trial requires only the binary digits 0 and 1 to convey it. For four equally probable
events, a_1, a_2, a_3 and a_4, the outcome of a trial is conveyed by 00, 01, 10 or 11,
requiring two binary digits. For eight equally probable events the outcome of a
trial requires three binary digits, and so on. The probabilities of the occurrence of
a particular event in these cases are 1/2, 1/4, 1/8, and so on. The number of binary
digits required to convey the outcome of a trial follows the rule log_2(1/P), with P
the probability of occurrence of a particular event.
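A quick numerical check of this counting rule, using only the equiprobable cases listed above:

```python
import math

# P = 1/2, 1/4, 1/8 require log2(1/P) = 1, 2, 3 binary digits respectively
for P in (1/2, 1/4, 1/8):
    print(f"P = {P}:  {math.log2(1 / P):.0f} binary digits")
```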
When not all events are equally probable, the amount of information is measured
by

H = \sum_j P_j \log_2 \frac{1}{P_j} = -\sum_j P_j \log_2 P_j ,                (2.422)
a quantity introduced by Shannon (1948), which he called the entropy. The base of
the logarithm used in the definition of entropy depends on the encoding scheme.
With logarithms taken to base r, the entropy rescales in terms of natural logarithms as

H = -\sum_j P_j \log_r P_j = -\frac{1}{\log r} \sum_j P_j \log P_j ,          (2.423)

where log denotes the natural logarithm.
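As an illustration of (2.422) and (2.423), the short computation below evaluates the entropy of a non-uniform distribution in bits and checks the change-of-base rescaling; the particular probabilities are chosen only for the example.

```python
import math

P = [0.5, 0.25, 0.125, 0.125]     # example probability distribution

# Entropy in bits, equation (2.422): H = -sum_j P_j log2 P_j
H_bits = -sum(p * math.log2(p) for p in P)          # = 1.75 bits

# Equation (2.423) with r = 2: the base-2 entropy equals the
# natural-log entropy divided by log 2
H_nats = -sum(p * math.log(p) for p in P)
assert abs(H_bits - H_nats / math.log(2.0)) < 1e-12
```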