2.3.1.2 Entropy and information measure
The concept of information measure, in statistics called entropy, was introduced by Shannon [Shannon, 1948]. It is connected with the probability of occurrence of a given effect. Let us assume that an event has M possible outcomes and that the i-th outcome occurs with probability p_i; then the information connected with the occurrence of the i-th outcome is:

I_i = −log p_i    (2.27)
The expected value of the information is the entropy:

En = −∑_{i=1}^{M} p_i log p_i    (2.28)
Entropy is a measure of uncertainty associated with the outcome of an event. The
higher the entropy, the higher the uncertainty as to which possibility will occur. The
highest value of entropy is obtained when all possibilities are equally likely.
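As a quick check of these properties, the sketch below computes (2.27) and (2.28) for small discrete distributions (Python is an illustrative choice, since the text gives no code; the base-2 logarithm, giving entropy in bits, is an assumption):

```python
import math

def information(p):
    """Information of an outcome with probability p (eq. 2.27): I = -log p."""
    return -math.log2(p)

def entropy(probs):
    """Expected information, i.e. entropy (eq. 2.28): En = -sum_i p_i log p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# M = 4 equally likely outcomes: entropy reaches its maximum, log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A skewed distribution is less uncertain, so its entropy is lower.
print(entropy([0.7, 0.1, 0.1, 0.1]))
```

A certain event (a single outcome with p = 1) gives zero entropy, the other extreme of the uncertainty scale.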
In practice, for time series the entropy is calculated from the amplitude distribution. The amplitude range A of a raw sampled signal is divided into K disjoint intervals I_i, for i = 1,...,K. The probability distribution can be obtained from the ratio of the number of samples N_i falling into each bin I_i to the total sample number N:

p_i = N_i / N    (2.29)

The distribution {p_i} of the sampled signal amplitude is then used to calculate the entropy measure according to (2.28).
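A minimal sketch of this binning procedure, assuming K equal-width bins spanning the amplitude range (the text does not fix how the bins are chosen):

```python
import math

def amplitude_entropy(signal, K=16):
    """Entropy of the amplitude distribution: split the amplitude range into
    K disjoint equal-width bins, estimate p_i = N_i / N (eq. 2.29), and
    apply eq. (2.28)."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / K or 1.0            # guard against a constant signal
    counts = [0] * K
    for x in signal:
        i = min(int((x - lo) / width), K - 1)  # clamp x == hi into the last bin
        counts[i] += 1
    N = len(signal)
    return -sum(c / N * math.log2(c / N) for c in counts if c)

# Samples spread evenly over 4 bins give the maximal entropy log2(4) = 2 bits.
print(amplitude_entropy([0, 1, 2, 3] * 25, K=4))  # 2.0
```

Note that the result depends on K: too few bins hide structure in the distribution, while too many leave most bins nearly empty.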
2.3.1.3 Autocorrelation function
The correlation function was introduced in Sect. 1.1 by equation (1.11) in terms of ensemble averaging. Under the assumption of ergodicity, the autocorrelation R_x(τ) and autocovariance C_x(τ) functions are defined by:

R_x(τ) = ∫_{−∞}^{+∞} x(t) x(t + τ) dt    (2.30)
and
C_x(τ) = ∫_{−∞}^{+∞} (x(t) − μ_x)(x(t + τ) − μ_x) dt    (2.31)
where τ is the delay and μ_x is the mean of the signal (equation 1.2). The autocorrelation function R_x(τ) is always real and symmetric: R_x(τ) = R_x(−τ). It takes its maximal value for τ = 0; it follows from equations (1.5) and (1.11) that R_x(τ = 0) = Ψ² (the mean square value of the signal). The variance σ_x² is equal to the autocovariance function for time 0: C_x(τ = 0) = σ_x². The autocorrelation of a periodic function is also periodic. The autocorrelation of noise decreases rapidly with the delay τ (Figure 2.4). We can see from Figure 2.4c that autocorrelation can help in the extraction of a periodic signal from noise, even when the noise amplitude is higher than that of the signal.
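These properties can be checked numerically. In the sketch below, the discrete, biased estimator R_x(τ) ≈ (1/N) Σ_t x(t)x(t+τ) stands in for the integral in (2.30), and the sine period and noise level are illustrative choices:

```python
import math
import random

def autocorr(x, max_lag):
    """Biased sample autocorrelation, a discrete stand-in for eq. (2.30)."""
    N = len(x)
    return [sum(x[t] * x[t + tau] for t in range(N - tau)) / N
            for tau in range(max_lag + 1)]

random.seed(0)
period = 20
# A sine buried in Gaussian noise whose amplitude exceeds the signal's.
x = [math.sin(2 * math.pi * t / period) + random.gauss(0.0, 1.5)
     for t in range(2000)]

R = autocorr(x, 2 * period)
# R is maximal at tau = 0 (the mean square value); the noise contribution
# dies out at nonzero lags, so peaks at multiples of the period reveal
# the hidden sine.
print(R[0] == max(R), R[period] > R[period // 2])
```

Because uncorrelated noise contributes mainly at τ = 0, the periodic component re-emerges at nonzero lags even though it is invisible in the raw trace.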