1.4.2 Mutual information
For a random vector $X$, let $f_X(x)$ be its probability density. For two random vectors $X$, $Y$, denote $H_X(Y)$ as a measure of the information content of $Y$ which is not contained in $X$. In mathematical terms it is

$$H_X(Y) = -\int p(y \mid x)\,\log p(y \mid x)\, dy$$
where $p(y \mid x)$ is the conditional density of $Y$ given $X$. The mutual information between $X$ and $Y$ is

$$I(X, Y) = H(Y) - H_X(Y) = \int f_{(X,Y)}(x, y)\,\log \frac{f_{(X,Y)}(x, y)}{f_X(x)\, f_Y(y)}\, dx\, dy$$
which measures the information content of $Y$ that is also contained in $X$. In other words, the mutual information is the Kullback-Leibler distance (relative entropy): the distance between $(X, Y)$ and $X$, $Y$, where $X$ and $Y$ are treated as independent variables. The mutual information thus measures the distance between the possibly correlated random vector $(X, Y)$ and the independent random vectors $X$, $Y$.
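As an illustration of this definition (added here; the function name and the small joint probability table are only illustrative), the discrete analogue replaces the integral by a sum over a joint probability table, $I(X, Y) = \sum_{x,y} p(x, y)\,\log\frac{p(x, y)}{p(x)\,p(y)}$. A minimal sketch in Python:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information (in nats) of a discrete joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    # Sum p(x, y) * log( p(x, y) / (p(x) p(y)) ) over cells with positive mass.
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

# Example: a 2 x 2 joint pmf with positive dependence between X and Y.
p = [[0.4, 0.1],
     [0.1, 0.4]]
print(mutual_information(p))   # > 0, since X and Y are dependent
```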
From the definition of mutual information, we would expect that there is a close relationship between mutual information and correlation. In fact we have the following conclusion. If $X$ and $Y$ are normally distributed random variables, then

$$I(X, Y) = -\frac{1}{2}\,\log\!\left(1 - r^2\right)$$

where $r$ is the correlation coefficient between $X$ and $Y$.
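This closed form can be checked numerically (a sketch added for illustration, assuming SciPy is available; the correlation value, grid limits and step are arbitrary choices): evaluate the mutual-information integral for a bivariate normal on a grid and compare it with $-\frac{1}{2}\log(1 - r^2)$.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

r = 0.6                                        # correlation coefficient
closed_form = -0.5 * np.log(1.0 - r**2)

# Evaluate the integrand f(x, y) * log( f(x, y) / (f_X(x) f_Y(y)) ) on a grid.
grid = np.linspace(-6, 6, 601)
dx = grid[1] - grid[0]
X, Y = np.meshgrid(grid, grid)
joint = multivariate_normal(mean=[0, 0], cov=[[1, r], [r, 1]])
f_xy = joint.pdf(np.dstack([X, Y]))
f_x = norm.pdf(X)
f_y = norm.pdf(Y)
integrand = np.where(f_xy > 0, f_xy * np.log(f_xy / (f_x * f_y)), 0.0)
numerical = integrand.sum() * dx * dx

print(closed_form, numerical)                  # the two values should agree closely
```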
From recorded neuronal data, calculating the mutual information between two random vectors $X$ and $Y$ is usually not an easy task when one of them is a random vector in a high-dimensional space. Estimating the joint distribution of $X$, $Y$ from data is already a formidable task in a high-dimensional space. See Chapter 13 for a detailed account of how to overcome these difficulties.
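To make the estimation problem concrete, a minimal plug-in approach (sketched below; this is not the method of Chapter 13, and the surrogate data and bin count are arbitrary) bins samples into a joint histogram and applies the discrete mutual-information formula. Even for scalar $X$ and $Y$ it needs many samples per bin, and the number of bins grows exponentially with the dimension of the vectors, which is exactly the difficulty noted above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate "recorded" data: correlated Gaussian pairs standing in for two signals.
r, n, bins = 0.6, 200_000, 30
cov = [[1.0, r], [r, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Plug-in estimate: joint histogram -> empirical pmf -> discrete MI formula.
counts, _, _ = np.histogram2d(x, y, bins=bins)
p_xy = counts / counts.sum()
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)
mask = p_xy > 0
mi_hat = np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask]))

print(mi_hat, -0.5 * np.log(1 - r**2))   # plug-in estimate vs. Gaussian closed form
```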
1.4.3 Fisher information
The Fisher information is introduced from an angle totally different from the Shannon information. For a random variable with distribution density $p(x; \theta)$, the Fisher information is

$$I(\theta) = \int \frac{\left[\partial p(x; \theta) / \partial \theta\right]^2}{p(x; \theta)}\, dx$$

where $\theta$ is the parameter, which could be multi-dimensional.
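As a quick check of this definition (an added sketch; the exponential model, function names and integration grid are arbitrary choices, not taken from the text), the integral can be evaluated numerically for a density whose Fisher information is known in closed form, e.g. $p(x; \theta) = \theta e^{-\theta x}$ with $I(\theta) = 1/\theta^2$.

```python
import numpy as np

def fisher_information(pdf, theta, x, d_theta=1e-5):
    """Numerically evaluate I(theta) = integral of [dp(x;theta)/dtheta]^2 / p(x;theta) dx
    on a fine uniform grid x."""
    p = pdf(x, theta)
    # Central difference approximation of the derivative with respect to theta.
    dp = (pdf(x, theta + d_theta) - pdf(x, theta - d_theta)) / (2 * d_theta)
    return np.sum(dp**2 / p) * (x[1] - x[0])   # Riemann sum over the grid

# Exponential density p(x; theta) = theta * exp(-theta * x), x >= 0.
exp_pdf = lambda x, theta: theta * np.exp(-theta * x)

theta = 2.0
x = np.linspace(1e-6, 50.0, 200_000)          # grid covering the effective support
print(fisher_information(exp_pdf, theta, x))  # should be close to 1 / theta**2 = 0.25
```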
Example 1
Let us assume that

$$T \sim \frac{1}{E[T]}\,\exp\!\left(-t / E[T]\right), \qquad t \ge 0$$
where $T$ is the interspike interval and $E[T]$ is the expectation of $T$. Suppose that $E[T]$ depends on a parameter $\lambda$. The Fisher information with respect to $\lambda$ [45] is