The Shannon entropy is always positive and measures the information content of X, in bits, if the logarithm is taken with base 2.
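As a minimal numerical sketch, the entropy in bits can be computed directly from a vector of probabilities (the function name and example values below are only illustrative):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of a discrete distribution p, in bits (base-2 logarithm)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # terms with p = 0 contribute nothing
    return -np.sum(p * np.log2(p))

print(entropy_bits([0.5, 0.5]))        # a fair coin carries exactly 1 bit
print(entropy_bits([0.9, 0.1]))        # a biased coin carries less
```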
Next, suppose we have a second discrete random variable Y and that we want to
measure its degree of synchronization with X . We can define the joint entropy as
I(X,Y) = -\sum_{i,j} p_{ij}^{XY} \log p_{ij}^{XY}     (4.5)
in which p_ij^XY is the joint probability of obtaining an outcome X = X_i and Y = Y_j. For independent systems, one has p_ij^XY = p_i^X p_j^Y and therefore I(X,Y) = I(X) + I(Y). Then, the mutual information between X and Y is defined as
MI(X,Y) = I(X) + I(Y) - I(X,Y)     (4.6)
The mutual information gives the amount of information about X one obtains by knowing Y, and vice versa. For independent signals, MI(X,Y) = 0; otherwise, it takes positive values, with a maximum of MI(X,Y) = I(X) = I(Y) for identical signals.
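A minimal sketch of how this definition can be applied to two sampled signals, assuming a histogram-based estimate of the probabilities on an M x M partition (all names and parameter values below are only illustrative):

```python
import numpy as np

def entropy_bits(counts):
    """Entropy, in bits, of the empirical distribution given by an array of counts."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, m=16):
    """MI(X,Y) = I(X) + I(Y) - I(X,Y), cf. (4.6), estimated on an m x m partition."""
    joint, _, _ = np.histogram2d(x, y, bins=m)   # counts estimating p_ij^XY
    i_x = entropy_bits(joint.sum(axis=1))        # marginal entropy I(X)
    i_y = entropy_bits(joint.sum(axis=0))        # marginal entropy I(Y)
    i_xy = entropy_bits(joint.ravel())           # joint entropy, cf. (4.5)
    return i_x + i_y - i_xy

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.5 * rng.normal(size=10_000)            # Y shares information with X
print(mutual_information(x, y))                  # clearly positive
print(mutual_information(x, rng.normal(size=10_000)))  # close to zero
```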
Alternatively, the mutual information can be seen as a Kullback-Leibler entropy, which is an entropy measure of the similarity of two distributions [13, 14]. Indeed, (4.6) can be written in the form
MI(X,Y) = \sum_{i,j} p_{ij}^{XY} \log \frac{p_{ij}^{XY}}{p_i^X p_j^Y}     (4.7)
Then, considering a probability distribution q_ij^XY = p_i^X p_j^Y, (4.7) is a Kullback-Leibler entropy that measures the difference between the probability distributions p_ij^XY and q_ij^XY. Note that q_ij^XY is the correct probability distribution if the systems are independent and, consequently, the mutual information measures how different the true probability distribution p_ij^XY is from another one in which independence between X and Y is assumed.
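Under the same illustrative assumptions as in the sketch above, the equivalence of (4.6) and (4.7) is easy to check numerically:

```python
import numpy as np

def mi_kl_form(x, y, m=16):
    """MI as the Kullback-Leibler entropy between p_ij^XY and q_ij^XY = p_i^X p_j^Y, cf. (4.7)."""
    joint, _, _ = np.histogram2d(x, y, bins=m)
    p = joint / joint.sum()                       # estimate of p_ij^XY
    q = np.outer(p.sum(axis=1), p.sum(axis=0))    # q_ij^XY = p_i^X p_j^Y
    nz = p > 0                                    # empty bins contribute nothing
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.5 * rng.normal(size=10_000)
print(mi_kl_form(x, y))   # matches I(X) + I(Y) - I(X,Y) computed from the same histogram
```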
Note that it is not always straightforward to estimate MI from real recordings, especially since an accurate estimation requires a large number of samples and small partition bins (a large M). In particular, for the joint probability densities p_ij^XY there will usually be a large number of bins that are not filled by the data, which may produce an underestimation of the value of MI. Several proposals have been made to overcome these estimation biases; their description is outside the scope of this chapter, and the reader is referred to [15] for a recent review. In the particular case of the examples of Figure 4.1, the estimation of mutual information depended largely on the partition of the stimulus space used [8].
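As a rough illustration of this sensitivity, consider two independent signals, for which the true MI is zero; a histogram ("plug-in") estimate, sketched below with arbitrary sample sizes and partition sizes, varies markedly with both:

```python
import numpy as np

def mi_hist(x, y, m):
    """Histogram (plug-in) estimate of MI(X,Y), in bits, on an m x m partition."""
    joint, _, _ = np.histogram2d(x, y, bins=m)
    p = joint / joint.sum()
    q = np.outer(p.sum(axis=1), p.sum(axis=0))
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

rng = np.random.default_rng(1)
for n in (100, 1_000, 100_000):                      # number of samples
    x, y = rng.normal(size=n), rng.normal(size=n)    # independent: true MI = 0
    estimates = [round(float(mi_hist(x, y, m)), 3) for m in (4, 16, 64)]
    print(n, estimates)                              # estimates drift with m and n
```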
4.4 Phase Synchronization
All the measures described earlier are sensitive to relationships both in the amplitudes and the phases of the signals. However, in some cases the phases of the signals may be related but the amplitudes may not. Phase synchronization measures are particularly suited for these cases because they measure any phase relationship between
 