have the same eigenvalues, according to Property No. 9 in Sect. C.8. Thus, the
canonical squared correlation can be obtained as the largest eigenvalue of the matrix
$\Sigma_{xx}^{-1} \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^{T}$.
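As a quick numerical illustration (the data-generating setup, dimensions, and coefficients below are assumed for this sketch, not taken from the text), the following Python code draws correlated Gaussian samples, partitions the sample covariance into blocks, and checks that the largest eigenvalue of $\Sigma_{xx}^{-1} \Sigma_{xy} \Sigma_{yy}^{-1} \Sigma_{xy}^{T}$ reproduces the squared canonical correlation built into the data.

```python
import numpy as np

# Assumed toy setup: x is 3-dimensional, y is 2-dimensional, and each
# y_i = 0.6*x_i + 0.8*noise_i, so the true canonical correlation is 0.6.
rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal((n, 3))
y = 0.6 * x[:, :2] + 0.8 * rng.standard_normal((n, 2))

# Sample covariance, partitioned into blocks Sxx, Sxy, Syx, Syy.
S = np.cov(np.hstack([x, y]).T)
Sxx, Sxy = S[:3, :3], S[:3, 3:]
Syx, Syy = S[3:, :3], S[3:, 3:]

# Largest eigenvalue of Sxx^{-1} Sxy Syy^{-1} Syx gives the
# squared canonical correlation.
M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Syx)
rho_sq = np.max(np.linalg.eigvals(M).real)
print(rho_sq)  # close to 0.6**2 = 0.36
```

Solving linear systems with the covariance blocks, rather than forming explicit inverses, is simply the usual numerical choice for evaluating this matrix product.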
C.3.2 Mutual Information
Next we introduce mutual information, which has a close relationship with the canonical
correlation under the Gaussianity assumption. We again assume real-valued random
vectors x and y, and denote their probability distributions by p(x) and p(y). The entropies
of x and y are defined as
$H(x) = -\int p(x) \log p(x) \, dx ,$  (C.40)
$H(y) = -\int p(y) \log p(y) \, dy .$  (C.41)
The entropy is a measure of uncertainty; H(x) represents the uncertainty when x is
unknown and H(y) represents the uncertainty when y is unknown.
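For a concrete feel of (C.40) (a sketch under assumed parameters, not an example from the text), one can compare a Monte Carlo estimate of $-\mathrm{E}[\log p(x)]$ with the standard closed-form differential entropy of a one-dimensional Gaussian, $\tfrac{1}{2}\log(2\pi e \sigma^2)$:

```python
import numpy as np

# Assumed example: a 1-D Gaussian with standard deviation sigma = 2.
# A Monte Carlo estimate of -E[log p(x)] should reproduce the closed form.
rng = np.random.default_rng(1)
sigma = 2.0
x = rng.normal(0.0, sigma, size=1_000_000)

# log of the Gaussian density evaluated at the samples
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)

h_mc = -np.mean(log_p)                           # Monte Carlo estimate of H(x)
h_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(h_mc, h_exact)                             # both about 2.112 nats
```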
The joint entropy is defined as
$H(x, y) = -\iint p(x, y) \log p(x, y) \, dx \, dy ,$  (C.42)
which represents the uncertainty when both x and y are unknown. When x and y are
independent, we have the relationship
$H(x, y) = -\iint p(x, y) \log p(x, y) \, dx \, dy$
$= -\iint p(x) p(y) \left( \log p(x) + \log p(y) \right) dx \, dy$
$= H(x) + H(y) .$  (C.43)
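In the Gaussian case this additivity can be checked numerically, since a d-dimensional Gaussian has the standard closed-form entropy $\tfrac{1}{2}\log\{(2\pi e)^d \det \Sigma\}$. The covariance values and the helper gauss_entropy below are assumptions made only for this sketch:

```python
import numpy as np

def gauss_entropy(cov):
    """Closed-form differential entropy of a zero-mean Gaussian with covariance cov."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

# Assumed covariances of x and y (each 2-dimensional).
Sxx = np.array([[2.0, 0.3], [0.3, 1.0]])
Syy = np.array([[1.5, -0.4], [-0.4, 0.8]])

# Independence of x and y makes the joint covariance block-diagonal (Sxy = 0).
Sjoint = np.block([[Sxx, np.zeros((2, 2))],
                   [np.zeros((2, 2)), Syy]])

print(gauss_entropy(Sjoint))                     # H(x, y)
print(gauss_entropy(Sxx) + gauss_entropy(Syy))   # H(x) + H(y), same value
```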
The conditional entropy is defined as
$H(x \mid y) = -\iint p(x, y) \log p(x \mid y) \, dx \, dy ,$  (C.44)
which represents the uncertainty when x is unknown, once y is given. We then have
the relationship,
$H(x, y) = H(x \mid y) + H(y) .$  (C.45)
The above indicates that the uncertainty when both x and y are unknown is equal to
the uncertainty about x when y is given plus the uncertainty when y is unknown.
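Again restricting to jointly Gaussian x and y (an assumed special case, chosen only because the entropies then have closed forms), (C.45) can be verified numerically using the conditional covariance $\Sigma_{xx} - \Sigma_{xy}\Sigma_{yy}^{-1}\Sigma_{yx}$; the numerical values below are made up for the sketch:

```python
import numpy as np

def gauss_entropy(cov):
    """Closed-form differential entropy of a zero-mean Gaussian with covariance cov."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

# Assumed covariance blocks of the joint Gaussian distribution of (x, y).
Sxx = np.array([[2.0, 0.3], [0.3, 1.0]])
Syy = np.array([[1.5, -0.4], [-0.4, 0.8]])
Sxy = np.array([[0.5, 0.1], [0.2, -0.3]])

Sjoint = np.block([[Sxx, Sxy], [Sxy.T, Syy]])
Scond = Sxx - Sxy @ np.linalg.solve(Syy, Sxy.T)   # covariance of x given y

print(gauss_entropy(Sjoint))                      # H(x, y)
print(gauss_entropy(Scond) + gauss_entropy(Syy))  # H(x|y) + H(y), same value
```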