technology as a marker of pharmacological effects in studies of certain drugs, but
there were no reports of the Lifescan having an impact on patient outcome.
9.4.2 Entropy
Most commonly, entropy is considered in the context of physics and thermodynam-
ics where it connotes the energy lost in a system due to disordering. In 1948 Claude
Shannon of Bell Labs developed a theory of information concerned with the efficiency of information transfer [38]. He coined the term information entropy, now known as Shannon entropy, which in our limited context can simply be considered the amount of information (i.e., bits) per transmitted symbol.
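A minimal sketch of this bits-per-symbol measure (the function name and example strings are illustrative assumptions, not from the text):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy H = -sum p_i * log2(p_i), i.e., average bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("HTHTHTHT"))  # 1.0 -- a fair coin carries 1 bit/symbol
print(shannon_entropy("HHHHHHHH"))  # a constant stream carries no information
```

Two equiprobable symbols yield exactly 1 bit per symbol; a perfectly predictable stream yields zero.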
Many different specific algorithms have been applied to calculate various per-
mutations of the entropy concept in biological data (Table 9.1). Recently, a com-
mercial EEG monitor based on the concept of entropy has become available. The
specific entropy algorithm used in the GE Healthcare EEG monitoring system is
described as “time-frequency balanced spectral entropy,” which is nicely described
in an article by Viertiö-Oja et al. [39]. This particular entropy uses both time- and
frequency-domain components. In brief, the algorithm samples EEG data at 400 Hz and computes FFT-derived power spectra from sampling epochs of several different lengths, ranging from about 2 to 60 seconds. The spectral entropy,
S, for any desired frequency band (f_1, f_2), is the sum:

\[
S[f_1, f_2] = \sum_{f_i = f_1}^{f_2} P_n(f_i) \, \log\!\left(\frac{1}{P_n(f_i)}\right)
\]
where P_n(f_i) is a normalized power value at frequency f_i. The spectral entropy of the band is then itself normalized, S_N, to a range of 0 to 1.0 via:
\[
S_N[f_1, f_2] = \frac{S[f_1, f_2]}{\log\big(n[f_1, f_2]\big)}
\]
where n[f_1, f_2] is the number of spectral data points in the range f_1 to f_2. This system calculates two separate but related entropy values, the state entropy (SE)
and the response entropy (RE). The SE value is derived from the 0.8- to 34-Hz fre-
quency range and uses epoch lengths from 15 to 60 seconds to attempt to emphasize
the relatively stationary cortical EEG components of the scalp signal. The RE, on
the other hand, attempts to emphasize shorter term, higher frequency components
of the scalp signal, generally the EMG and faster cortical components, which rise
and fall faster than the more stationary cortical signals. To accomplish this, the RE
Table 9.1 Entropy Algorithms Applied to EEG Data
Approximate entropy [76-78]
Kolmogorov entropy [79]
Spectral entropy [80, 81]
Lempel-Ziv entropy [80, 82, 83]
Shannon entropy [82, 84]
Maximum entropy [85]
Tsallis entropy [86]
Sample entropy [87]
Wavelet entropy [88, 89]
Time-frequency balanced spectral entropy [39, 40]
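The banded spectral-entropy calculation given by the two equations above can be sketched as follows. This is an illustrative approximation, not the commercial GE implementation: the 400-Hz sampling rate and the 0.8- to 34-Hz SE band come from the text, while the 2-second epoch, synthetic signal, and function name are our own assumptions.

```python
import numpy as np

def spectral_entropy(x, fs, f1, f2):
    """Normalized spectral entropy S_N[f1, f2] of one epoch, per the
    equations above: S = sum P_n(f_i) * log(1/P_n(f_i)), divided by log(n)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= f1) & (freqs <= f2)
    p_n = power[band] / power[band].sum()   # normalized power P_n(f_i)
    s = np.sum(p_n * np.log(1.0 / p_n))     # S[f1, f2]
    return s / np.log(p_n.size)             # S_N, in the range 0 to 1.0

fs = 400.0                                  # sampling rate from the text
rng = np.random.default_rng(0)
epoch = rng.standard_normal(int(2 * fs))    # synthetic 2-s "EEG" epoch
se = spectral_entropy(epoch, fs, 0.8, 34.0) # SE frequency band from the text
print(0.0 <= se <= 1.0)                     # True: entropy cannot exceed log(n)
```

Because white noise has a nearly flat spectrum, this synthetic epoch yields an S_N close to 1; a dominant single-frequency component would drive it toward 0.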
 
 