$$
\begin{aligned}
\Sigma_{ee} &= E\left[(\boldsymbol{y} - \boldsymbol{A}\boldsymbol{x})(\boldsymbol{y} - \boldsymbol{A}\boldsymbol{x})^T\right] \\
&= E\left(\boldsymbol{y}\boldsymbol{y}^T\right) - E\left(\boldsymbol{y}\boldsymbol{x}^T\right)\boldsymbol{A}^T - \boldsymbol{A}\,E\left(\boldsymbol{x}\boldsymbol{y}^T\right) + \boldsymbol{A}\,E\left(\boldsymbol{x}\boldsymbol{x}^T\right)\boldsymbol{A}^T \\
&= \Sigma_{yy} - \Sigma_{yx}\boldsymbol{A}^T - \boldsymbol{A}\,\Sigma_{xy} + \boldsymbol{A}\,\Sigma_{xx}\boldsymbol{A}^T,
\end{aligned} \tag{C.56}
$$
where $\Sigma_{yy} = E\left(\boldsymbol{y}\boldsymbol{y}^T\right)$ and $\Sigma_{xy} = E\left(\boldsymbol{x}\boldsymbol{y}^T\right)$. Substituting Eq. (C.55) into the equation above gives,
$$
\Sigma_{ee} = \Sigma_{yy} - \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{yx}^T. \tag{C.57}
$$
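As a quick numerical sketch of Eqs. (C.56) and (C.57), assuming that Eq. (C.55), which is not shown in this excerpt, defines the least-squares regression matrix $\boldsymbol{A} = \Sigma_{yx}\Sigma_{xx}^{-1}$ (consistent with the result above), the NumPy snippet below draws correlated Gaussian samples and compares the empirical residual covariance with the closed form. All variable names are illustrative, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
dx, dy, n = 3, 2, 200_000

# Random well-conditioned joint covariance of (x, y), then samples.
M = rng.standard_normal((dx + dy, dx + dy))
S = M @ M.T + (dx + dy) * np.eye(dx + dy)
z = rng.multivariate_normal(np.zeros(dx + dy), S, size=n)
x, y = z[:, :dx], z[:, dx:]

Sxx, Syx, Syy = S[:dx, :dx], S[dx:, :dx], S[dx:, dx:]

# Assumed Eq. (C.55): least-squares regression matrix A = Syx Sxx^{-1}.
A = Syx @ np.linalg.inv(Sxx)

# Residual e = y - A x and its empirical covariance (Eq. C.56).
e = y - x @ A.T
See_empirical = e.T @ e / n

# Closed form, Eq. (C.57): Sigma_ee = Syy - Syx Sxx^{-1} Syx^T.
See_closed = Syy - Syx @ np.linalg.inv(Sxx) @ Syx.T

print(np.abs(See_empirical - See_closed).max())  # small, up to sampling noise
```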
Let us assume that the random variables $\boldsymbol{x}$ and $\boldsymbol{y}$ follow the Gaussian distribution. According to Sect. C.1, we have the relationships,
$$
H(\boldsymbol{x}) = \frac{1}{2}\log|\Sigma_{xx}| \quad \text{and} \quad H(\boldsymbol{y}) = \frac{1}{2}\log|\Sigma_{yy}|, \tag{C.58}
$$
where we ignore constants that are not related to the current arguments. We also have
$$
H(\boldsymbol{x},\boldsymbol{y})
= \frac{1}{2}\log\left| E\left[\begin{bmatrix}\boldsymbol{x}\\ \boldsymbol{y}\end{bmatrix}\begin{bmatrix}\boldsymbol{x}^T & \boldsymbol{y}^T\end{bmatrix}\right]\right|
= \frac{1}{2}\log\left|\begin{bmatrix} E\left(\boldsymbol{x}\boldsymbol{x}^T\right) & E\left(\boldsymbol{x}\boldsymbol{y}^T\right)\\ E\left(\boldsymbol{y}\boldsymbol{x}^T\right) & E\left(\boldsymbol{y}\boldsymbol{y}^T\right)\end{bmatrix}\right|
= \frac{1}{2}\log\left|\begin{bmatrix} \Sigma_{xx} & \Sigma_{yx}^T\\ \Sigma_{yx} & \Sigma_{yy}\end{bmatrix}\right|. \tag{C.59}
$$
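As a sanity check on Eqs. (C.58) and (C.59): the full differential entropy of a $d$-dimensional Gaussian is $\frac{1}{2}\log\bigl((2\pi e)^d|\Sigma|\bigr)$, so the $\frac{1}{2}\log|\Sigma|$ expressions above differ from it only by the ignored constant $\frac{d}{2}\log(2\pi e)$. A minimal sketch with SciPy (the setup is illustrative, not from the source):

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
d = 4
M = rng.standard_normal((d, d))
S = M @ M.T + d * np.eye(d)  # an arbitrary SPD covariance

# Full Gaussian differential entropy: 0.5 * log((2*pi*e)^d * |Sigma|).
H_full = multivariate_normal(mean=np.zeros(d), cov=S).entropy()

# Dropping the constant leaves the 0.5 * log|Sigma| term used in the text.
const = 0.5 * d * np.log(2 * np.pi * np.e)
print(H_full - const, 0.5 * np.log(np.linalg.det(S)))  # the two agree
```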
Thus, the conditional entropy is expressed as
$$
H(\boldsymbol{y}\,|\,\boldsymbol{x}) = H(\boldsymbol{x},\boldsymbol{y}) - H(\boldsymbol{x})
= \frac{1}{2}\log\left|\begin{bmatrix} \Sigma_{xx} & \Sigma_{yx}^T\\ \Sigma_{yx} & \Sigma_{yy}\end{bmatrix}\right| - \frac{1}{2}\log|\Sigma_{xx}|. \tag{C.60}
$$
Using the determinant identity in Eq. (C.94), we finally obtain the formula to compute the conditional entropy,
$$
\begin{aligned}
H(\boldsymbol{y}\,|\,\boldsymbol{x}) &= \frac{1}{2}\log\left|\begin{bmatrix} \Sigma_{xx} & \Sigma_{yx}^T\\ \Sigma_{yx} & \Sigma_{yy}\end{bmatrix}\right| - \frac{1}{2}\log|\Sigma_{xx}| \\
&= \frac{1}{2}\log\left|\Sigma_{yy} - \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{yx}^T\right|
= \frac{1}{2}\log|\Sigma_{ee}|.
\end{aligned} \tag{C.61}
$$
That is, the conditional entropy is expressed in terms of the covariance of the residual signal $\boldsymbol{e}$ obtained by regressing $\boldsymbol{y}$ on $\boldsymbol{x}$.
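Here the determinant identity of Eq. (C.94) is presumably the Schur-complement factorization $\left|\begin{smallmatrix}\Sigma_{xx} & \Sigma_{yx}^T \\ \Sigma_{yx} & \Sigma_{yy}\end{smallmatrix}\right| = |\Sigma_{xx}|\left|\Sigma_{yy} - \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{yx}^T\right|$, which is what turns Eq. (C.60) into Eq. (C.61). A minimal NumPy sketch of this final step (variable names illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
dx, dy = 3, 2
M = rng.standard_normal((dx + dy, dx + dy))
S = M @ M.T + (dx + dy) * np.eye(dx + dy)  # joint covariance of (x, y)

Sxx, Syx, Syy = S[:dx, :dx], S[dx:, :dx], S[dx:, dx:]
See = Syy - Syx @ np.linalg.inv(Sxx) @ Syx.T  # residual covariance, Eq. (C.57)

# Eq. (C.60): H(y|x) = H(x, y) - H(x), constants omitted.
H_cond = 0.5 * np.log(np.linalg.det(S)) - 0.5 * np.log(np.linalg.det(Sxx))

# Eq. (C.61): the same value from the residual covariance alone.
print(H_cond, 0.5 * np.log(np.linalg.det(See)))  # the two agree
```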