system by the minimal number of bits required to transfer the data. Mathematically, the information quantity of a random event A is the negative logarithm of its occurrence probability P_A, that is, -\log_2 P_A. Therefore, the number of bits needed for transferring N-symbol data (A_i) with probability distribution {P_i, i = 1, ..., N} is the averaged information of each symbol:
SE = -\sum_{i=1}^{N} P_i \log_2 P_i    (3.61)
A straightforward conclusion from (3.61) is that SE reaches its global maximum, SE_max = \log_2(N), under the uniform distribution, that is, when P_1 = P_2 = ... = P_N = 1/N. Therefore, SE measures the extent to which the probability distribution of a random variable diverges from a uniform one, and can be applied to analyze the distribution of variations in physiological signals, such as the EEG and electromyogram (EMG).
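A minimal sketch of (3.61) follows, assuming Python with NumPy; the function name and the two example distributions are illustrative and not taken from the text. It confirms that a uniform distribution over N = 8 symbols attains the maximum of \log_2(8) = 3 bits, while a skewed distribution falls below it:

```python
import numpy as np

def shannon_entropy(p):
    """SE = -sum(p_i * log2(p_i)) in bits; zero-probability symbols are skipped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

N = 8
uniform = np.full(N, 1.0 / N)
skewed = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])

print(shannon_entropy(uniform))  # 3.0 bits = log2(8), the global maximum
print(shannon_entropy(skewed))   # roughly 2.1 bits, below the maximum
```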
3.3.1.1 Formalisms of Entropy Implementation in EEG Signal Processing
Entropy has been used in EEG signal analysis in different formalisms, including: (1) approximate entropy (ApEn), a descriptor of changing complexity in the embedding space [92, 93]; (2) Kolmogorov entropy (K_2), another nonlinear measure capturing the dynamic properties of the system orbiting within the EEG attractor [94]; (3) spectral entropy, evaluating the energy distribution in wavelet subspace [95] or the uniformity of spectral components [96]; and (4) amplitude entropy, a direct uncertainty measure of the EEG signals in the time domain [97-99]. In applications, entropy has been used to analyze spontaneous regular EEG [95, 96], epileptic seizures [100], and EEG from people with Alzheimer's disease [101] and Parkinson's disease [102]. Compared with other nonlinear methods, such as the fractal dimension and Lyapunov exponents, entropy does not require a large dataset and, more importantly, it can be used to investigate the interdependence across the cerebral cortex [103, 104].
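The exact definitions of spectral and amplitude entropy differ across the cited studies; the sketch below (Python with NumPy) gives only one plausible reading, in which the power spectrum or the amplitude histogram is normalized to a probability distribution and then passed through (3.61). The bin count, the synthetic test signal, and the function names are assumptions for illustration:

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalized power spectrum of x."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def amplitude_entropy(x, n_bins=64):
    """Shannon entropy (bits) of the normalized amplitude histogram of x."""
    counts, _ = np.histogram(x, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Example: a noisy 10-Hz sinusoid stands in for one EEG channel sampled at 256 Hz.
fs = 256
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(spectral_entropy(x), amplitude_entropy(x))
```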
3.3.1.2 Beyond the Formalism of Shannon Entropy
The classic formalism in (3.61) has been shown to be restricted to the domain of
validity of Boltzmann-Gibbs statistics (BGS), which describes a system in which the
effective microscopic interactions and the microscopic memory are of short range.
Such a BGS-based entropy is generally applicable to extensive or additive systems.
For two independent subsystems A and B, the joint probability distribution is equal to the product of their individual probabilities, that is,
P_{i,j}(A \cup B) = P_i(A) \, P_j(B)    (3.62)
where P_{i,j}(A \cup B) is the probability distribution of the combined system A \cup B, and P_i(A) and P_j(B) are the probability distributions of systems A and B, respectively. Combining (3.62) and (3.61), we can easily conclude additivity in such a combined system:
SE(A \cup B) = SE(A) + SE(B)    (3.63)
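As a small numerical check of (3.62) and (3.63), the sketch below (again Python/NumPy, with illustrative marginal distributions) builds the joint distribution of two independent subsystems as the outer product of their marginals and verifies that its entropy equals the sum of the subsystem entropies:

```python
import numpy as np

def shannon_entropy(p):
    """SE in bits for a discrete distribution; zero-probability entries are skipped."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_a = np.array([0.5, 0.3, 0.2])   # P_i(A)
p_b = np.array([0.6, 0.4])        # P_j(B)
p_joint = np.outer(p_a, p_b)      # P_{i,j}(A U B) = P_i(A) * P_j(B) for independence

print(shannon_entropy(p_joint))                     # SE(A U B)
print(shannon_entropy(p_a) + shannon_entropy(p_b))  # SE(A) + SE(B), identical value
```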