The mutual information between x and y is defined as
$$
I(x, y) = H(x) + H(y) - H(x, y).
\tag{C.46}
$$
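As a concrete illustration of this definition, the short Python sketch below evaluates $I(x, y)$ for a small discrete pair directly from the three entropies; the 2×2 joint probability table is an assumed toy example, not part of the text.

```python
import numpy as np

# Assumed toy joint probability table for discrete x and y (illustrative only).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in nats, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p_x = p_xy.sum(axis=1)  # marginal distribution of x
p_y = p_xy.sum(axis=0)  # marginal distribution of y

# Eq. (C.46): I(x, y) = H(x) + H(y) - H(x, y).
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())
print(mi)  # positive, since x and y are dependent in this table
```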
When x and y are independent, we have $I(x, y) = 0$ according to Eq. (C.43). Let us define $z$ as $z = [x^T, y^T]^T$. Assuming Gaussian processes for x and y, the mutual information is expressed as
$$
I(x, y) = H(x) + H(y) - H(x, y)
= \frac{1}{2} \log |\Sigma_{xx}| + \frac{1}{2} \log |\Sigma_{yy}| - \frac{1}{2} \log |\Sigma_{zz}|.
\tag{C.47}
$$
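Equation (C.47) can be checked numerically. The following sketch is a minimal illustration, assuming an arbitrary positive-definite joint covariance $\Sigma_{zz}$ (toy values, not from the text) partitioned into the blocks $\Sigma_{xx}$ and $\Sigma_{yy}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions and a random positive-definite joint covariance (toy data).
p, q = 3, 2
A = rng.standard_normal((p + q, p + q))
Sigma_zz = A @ A.T + (p + q) * np.eye(p + q)

Sigma_xx = Sigma_zz[:p, :p]   # covariance of x
Sigma_yy = Sigma_zz[p:, p:]   # covariance of y

def logdet(S):
    """Log-determinant via Cholesky (stabler than taking log of the determinant)."""
    return 2.0 * np.sum(np.log(np.diag(np.linalg.cholesky(S))))

# Eq. (C.47): I(x, y) = 1/2 log|Sigma_xx| + 1/2 log|Sigma_yy| - 1/2 log|Sigma_zz|.
mi = 0.5 * (logdet(Sigma_xx) + logdet(Sigma_yy) - logdet(Sigma_zz))
print(mi)  # non-negative for any valid joint covariance
```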
Here, $|\Sigma_{zz}|$ is rewritten as
$$
|\Sigma_{zz}| =
\begin{vmatrix}
\Sigma_{xx} & \Sigma_{xy} \\
\Sigma_{xy}^T & \Sigma_{yy}
\end{vmatrix}
= |\Sigma_{yy}| \left| \Sigma_{xx} - \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^T \right|,
\tag{C.48}
$$
where the determinant identity in Eq. (C.94) is used. Substituting Eq. (C.48) into Eq. (C.47), we have
$$
I(x, y) = \frac{1}{2} \log \frac{|\Sigma_{xx}|}{\left| \Sigma_{xx} - \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^T \right|}
= \frac{1}{2} \log \frac{1}{\left| I - \Sigma_{xx}^{-1} \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^T \right|}.
\tag{C.49}
$$
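Both the determinant identity of Eq. (C.48) and the equivalence of the two forms in Eq. (C.49) can be verified numerically; the sketch below uses an assumed random joint covariance (toy data) and standard NumPy routines.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed random positive-definite joint covariance, partitioned into blocks (toy data).
p, q = 3, 2
A = rng.standard_normal((p + q, p + q))
Sigma_zz = A @ A.T + (p + q) * np.eye(p + q)
Sigma_xx, Sigma_yy, Sigma_xy = Sigma_zz[:p, :p], Sigma_zz[p:, p:], Sigma_zz[:p, p:]

det, inv = np.linalg.det, np.linalg.inv

# Eq. (C.48): |Sigma_zz| = |Sigma_yy| |Sigma_xx - Sigma_xy Sigma_yy^{-1} Sigma_xy^T|.
schur = Sigma_xx - Sigma_xy @ inv(Sigma_yy) @ Sigma_xy.T
print(np.isclose(det(Sigma_zz), det(Sigma_yy) * det(schur)))  # True

# Eq. (C.49): the ratio form and the |I - ...| form give the same mutual information.
mi_ratio = 0.5 * np.log(det(Sigma_xx) / det(schur))
mi_eye = 0.5 * np.log(1.0 / det(np.eye(p) - inv(Sigma_xx) @ Sigma_xy @ inv(Sigma_yy) @ Sigma_xy.T))
print(np.isclose(mi_ratio, mi_eye))  # True
```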
Let us define the eigenvalues of $\Sigma_{xx}^{-1} \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^T$ as $\gamma_j$ where $j = 1, \ldots, d$ and $d = \min\{p, q\}$.$^6$ Using these eigenvalues, we can derive
$$
I(x, y) = \frac{1}{2} \log \frac{1}{\left| I - \Sigma_{xx}^{-1} \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^T \right|}
= \frac{1}{2} \log \frac{1}{\prod_{j=1}^{d} (1 - \gamma_j)}
= \frac{1}{2} \sum_{j=1}^{d} \log \frac{1}{1 - \gamma_j}.
\tag{C.50}
$$
Note that $\gamma_1$ is equal to the canonical squared correlation $\rho_c$, according to the arguments in Sect. C.3.1.
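Equation (C.50) and the relation between $\gamma_1$ and the canonical correlation can be illustrated as follows; the covariance blocks are again assumed toy values, and the $\gamma_j$ are obtained as the eigenvalues of $\Sigma_{xx}^{-1} \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^T$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed random positive-definite joint covariance, partitioned into blocks (toy data).
p, q = 3, 2
A = rng.standard_normal((p + q, p + q))
Sigma_zz = A @ A.T + (p + q) * np.eye(p + q)
Sigma_xx, Sigma_yy, Sigma_xy = Sigma_zz[:p, :p], Sigma_zz[p:, p:], Sigma_zz[:p, p:]

inv = np.linalg.inv
M = inv(Sigma_xx) @ Sigma_xy @ inv(Sigma_yy) @ Sigma_xy.T

# The d = min(p, q) largest eigenvalues gamma_j of M; the remaining ones are zero.
d = min(p, q)
gamma = np.sort(np.linalg.eigvals(M).real)[::-1][:d]

# Eq. (C.50): I(x, y) = 1/2 sum_j log(1 / (1 - gamma_j)) equals -1/2 log|I - M|.
mi_eig = 0.5 * np.sum(np.log(1.0 / (1.0 - gamma)))
mi_det = -0.5 * np.log(np.linalg.det(np.eye(p) - M))
print(np.isclose(mi_eig, mi_det))  # True

# gamma_1 is the squared first canonical correlation between x and y.
print(np.sqrt(gamma[0]))
```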
When the random vectors x and y are complex-valued, the mutual information for
the complex random vectors is expressed as
$$
I(x, y) = H(x) + H(y) - H(x, y)
= \log |\Sigma_{xx}| + \log |\Sigma_{yy}| - \log |\Sigma_{zz}|.
\tag{C.51}
$$
$^6$ Here, remember that $p$ and $q$ are the sizes of the column vectors $x$ and $y$.
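For the complex-valued case, the entropy of a complex Gaussian vector involves $\log|\Sigma|$ without the factor $1/2$, which is why Eq. (C.51) carries no such factor. A minimal sketch under an assumed Hermitian positive-definite joint covariance (toy data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed Hermitian positive-definite joint covariance for complex x and y (toy data).
p, q = 3, 2
A = rng.standard_normal((p + q, p + q)) + 1j * rng.standard_normal((p + q, p + q))
Sigma_zz = A @ A.conj().T + (p + q) * np.eye(p + q)
Sigma_xx, Sigma_yy = Sigma_zz[:p, :p], Sigma_zz[p:, p:]

def logdet(S):
    """Real-valued log-determinant of a Hermitian positive-definite matrix."""
    _, ld = np.linalg.slogdet(S)
    return ld

# Eq. (C.51): I(x, y) = log|Sigma_xx| + log|Sigma_yy| - log|Sigma_zz| (no 1/2 factor).
mi = logdet(Sigma_xx) + logdet(Sigma_yy) - logdet(Sigma_zz)
print(mi)  # real and non-negative
```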
 