score low on entropy measures. Reactive EEG patterns occurring during the recovery periods render the entropy measures more sensitive for detecting improvements in EEG patterns after CA. Therefore, we expect entropy to decrease soon after injury and to remain low at least during the early recovery periods. Entropy should then increase with recovery following resuscitation, approaching the high baseline levels upon full recovery.
7.3 Entropy and Information Measures of EEG
The classical entropy measure is the Shannon entropy (SE), which provides useful criteria for analyzing and comparing probability distributions and gives a good measure of information content. Calculating the distribution of the amplitudes of an EEG segment begins with the sampled signal. One approach to creating the time series for entropy analysis is to partition the sampled waveform amplitudes into M segments.
Let us define the raw sampled signal as {x(k), k = 1, ..., N}. The amplitude range A is therefore divided into M disjoint intervals {I_i, i = 1, ..., M}. The probability distribution of the sampled data can be obtained from the ratio of the number of samples N_i falling into each bin I_i to the total sample number N:

\[ p_i = \frac{N_i}{N} \tag{7.1} \]
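As a concrete illustration of (7.1), the distribution {p_i} can be estimated by histogram binning of the sampled amplitudes. The following is a minimal sketch; the function name and the default of M = 32 bins are illustrative choices, not values from the text.

```python
import numpy as np

def amplitude_distribution(x, M=32):
    """Estimate {p_i} of Eq. (7.1): partition the amplitude range of the
    sampled signal x into M disjoint bins and normalize the bin counts."""
    counts, _ = np.histogram(x, bins=M)   # N_i: samples falling in each I_i
    return counts / len(x)                # p_i = N_i / N
```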
The distribution {p_i} of the sampled signal amplitude is then used to calculate one of the many entropy measures developed [16]. Entropy can then be defined as

\[ \mathrm{SE} = -\sum_{i=1}^{M} p_i \log\left(p_i\right) \tag{7.2} \]
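Equation (7.2) translates directly into code. In this sketch the natural logarithm is assumed, since the text does not fix the base, and empty bins are skipped because p log p tends to 0 as p tends to 0.

```python
def shannon_entropy(p):
    """Shannon entropy SE of Eq. (7.2) for a distribution {p_i}."""
    p = p[p > 0]                   # empty bins contribute nothing to the sum
    return -np.sum(p * np.log(p))
```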
Equation (7.2) is the definition of the traditional Shannon entropy [11]. Another form of entropy, postulated by Tsallis in 1988 in a nonlogarithmic form and hence called Tsallis entropy (TE) [17, 18], is
\[ \mathrm{TE} = \frac{1}{q-1}\left(1 - \sum_{i=1}^{M} p_i^{q}\right) \tag{7.3} \]
where q is the entropic index defined by Tsallis, which empirically allows us to scale the signal by varying the q parameter. This method can be quite useful for calculating entropy in the presence of transients or long-range interactions, as shown in [18].
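A corresponding sketch of (7.3) follows; the default q = 3 is only a placeholder, since the appropriate entropic index depends on the signal. As q approaches 1, TE converges to the Shannon entropy.

```python
def tsallis_entropy(p, q=3.0):
    """Tsallis entropy TE of Eq. (7.3) with entropic index q (q != 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```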
Shannon and Tsallis entropy calculations for different synthetic and real signals are shown in Figure 7.2. Entropy analysis clearly helps discriminate among the different noise signals and EEG brain injury states.
To analyze nonstationary signals such as EEGs after brain injury, the temporal
evolution of SE must be determined from digitized signals (Figure 7.3). So, an alter-
native time-dependent SE measure based on a sliding temporal window technique is
applied [15, 18]. Let {s(i), i = 1, ..., N} denote the raw sampled signal and set the sliding temporal window as

\[ W(n; w, \Delta) = \left\{\, s(i),\ i = 1 + n\Delta, \ldots, w + n\Delta \,\right\}, \quad \text{of length } w \le N \tag{7.4} \]
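Combining the pieces above gives a hedged sketch of the time-dependent measure: the entropy of each length-w window, advanced by Δ samples, traces the temporal evolution of SE. The helper names reuse the illustrative functions defined earlier; window length and step are caller-supplied.

```python
def time_dependent_entropy(s, w, delta, M=32):
    """Temporal evolution of SE: Shannon entropy of each sliding window
    W(n; w, Delta) of Eq. (7.4) over the signal s."""
    starts = range(0, len(s) - w + 1, delta)
    return [shannon_entropy(amplitude_distribution(s[n:n + w], M))
            for n in starts]
```

For a Tsallis-based time-dependent measure, shannon_entropy can simply be replaced by tsallis_entropy in the comprehension.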
 