the quadratic form \(z^H M z\) is real-valued. Therefore, we have

\[
\begin{aligned}
z^H M z &= (x^T - i y^T)(M_c + i M_s)(x + i y) \\
&= x^T M_c x - x^T M_s y + y^T M_s x + y^T M_c y \\
&= [x^T, y^T]
\begin{bmatrix} M_c & -M_s \\ M_s & M_c \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
= \hat{z}^T \hat{M} \hat{z},
\end{aligned}
\tag{C.26}
\]

where

\[
\hat{M} = \begin{bmatrix} M_c & -M_s \\ M_s & M_c \end{bmatrix}.
\]
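Eq. (C.26) relies on \(M_c\) being symmetric and \(M_s\) skew-symmetric, which is what makes \(z^H M z\) real. As a sketch (a hypothetical NumPy check, not from the text; all names and the random data are illustrative), we can verify numerically that \(z^H M z\) is real-valued and equals \(\hat{z}^T \hat{M} \hat{z}\):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical M = M_c + i*M_s with M_c symmetric, M_s skew-symmetric.
A = rng.standard_normal((n, n))
M_c = (A + A.T) / 2            # symmetric real part
B = rng.standard_normal((n, n))
M_s = (B - B.T) / 2            # skew-symmetric imaginary part
M = M_c + 1j * M_s

x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = x + 1j * y
z_hat = np.concatenate([x, y])  # stacked real vector [x; y]

# Block matrix M_hat = [[M_c, -M_s], [M_s, M_c]] as in Eq. (C.26).
M_hat = np.block([[M_c, -M_s], [M_s, M_c]])

lhs = z.conj() @ M @ z          # z^H M z
rhs = z_hat @ M_hat @ z_hat     # z_hat^T M_hat z_hat
print(lhs.imag, lhs.real - rhs)
```

The imaginary cross terms cancel because \(M_c\) is symmetric, and the terms \(x^T M_s x\) and \(y^T M_s y\) vanish because \(M_s\) is skew-symmetric.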
We now compute \(M\) such that \(M = \Sigma_{zz}^{-1}\). Since \(\Sigma_{zz} = 2(\Sigma_{xx} + i \Sigma_{yx})\), we obtain

\[
M = \Sigma_{zz}^{-1}
= \frac{1}{2} \left( \Sigma_{xx} + \Sigma_{yx} \Sigma_{xx}^{-1} \Sigma_{yx} \right)^{-1}
\left( I - i \, \Sigma_{yx} \Sigma_{xx}^{-1} \right).
\]
By comparing the above \(M\) with Eq. (C.15), we can see that \(\hat{\Sigma}^{-1} = 2 \hat{M}\), and therefore, from Eq. (C.26), we can prove the relationship
\[
z^H \Sigma_{zz}^{-1} z = \frac{1}{2} \, \hat{z}^T \hat{\Sigma}^{-1} \hat{z}. \tag{C.27}
\]
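The relationship in Eq. (C.27) can be checked numerically. The sketch below is a hypothetical NumPy example (not from the text): a Hermitian positive-definite \(\Sigma_{zz}\) is generated at random, and the real composite covariance \(\hat{\Sigma}\) is built from it under the circularity conditions \(\Sigma_{yy} = \Sigma_{xx}\) and \(\Sigma_{xy} = -\Sigma_{yx}\), so that both quadratic forms can be compared:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Hypothetical circular complex Gaussian covariance (Hermitian PD).
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Sigma_zz = C @ C.conj().T + n * np.eye(n)

# Real composite covariance: Sigma_xx = Re(Sigma_zz)/2, Sigma_yx = Im(Sigma_zz)/2,
# with Sigma_yy = Sigma_xx and Sigma_xy = -Sigma_yx (circularity).
Sxx = Sigma_zz.real / 2
Syx = Sigma_zz.imag / 2
Sigma_hat = np.block([[Sxx, -Syx], [Syx, Sxx]])

x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = x + 1j * y
z_hat = np.concatenate([x, y])

lhs = z.conj() @ np.linalg.solve(Sigma_zz, z)           # z^H Sigma_zz^{-1} z
rhs = 0.5 * z_hat @ np.linalg.solve(Sigma_hat, z_hat)   # (1/2) z_hat^T Sigma_hat^{-1} z_hat
print(lhs.real - rhs, lhs.imag)
```

Both differences are zero to floating-point precision, which is exactly the equality (C.27).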
On the basis of Eqs. (C.17) and (C.27), it is now clear that the real-valued joint Gaussian distribution in Eq. (C.11) and the complex Gaussian distribution in Eq. (C.12) are equivalent. Using exactly the same derivation as for Eq. (C.9) and ignoring the constants, the entropy of the complex-valued Gaussian is obtained as
\[
H(z) = \log \left| \Sigma_{zz} \right|. \tag{C.28}
\]
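Assuming the real-valued entropy from Eq. (C.9) is \(\tfrac{1}{2}\log|\hat{\Sigma}|\) up to constants, the constant ignored in Eq. (C.28) can be exhibited: for an \(n\)-dimensional circular \(z\), \(\tfrac{1}{2}\log|\hat{\Sigma}| = \log|\Sigma_{zz}| - n \log 2\). A hypothetical NumPy sketch (randomly generated covariance, illustrative names):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# Hermitian PD complex covariance and its real composite counterpart.
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Sigma_zz = C @ C.conj().T + n * np.eye(n)
Sxx, Syx = Sigma_zz.real / 2, Sigma_zz.imag / 2
Sigma_hat = np.block([[Sxx, -Syx], [Syx, Sxx]])

H_complex = np.linalg.slogdet(Sigma_zz)[1]       # log|Sigma_zz|
H_real = 0.5 * np.linalg.slogdet(Sigma_hat)[1]   # (1/2) log|Sigma_hat|
print(H_real - (H_complex - n * np.log(2)))
```

The two entropies differ only by the constant \(n \log 2\), which is why they can be treated as equivalent once constants are dropped.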
C.3 Canonical Correlation and Mutual Information

C.3.1 Canonical Correlation
This appendix provides a concise explanation of canonical correlation. Let us define real-valued random vectors x and y such that

\[
x = [x_1, x_2, \ldots, x_p]^T
\quad \text{and} \quad
y = [y_1, y_2, \ldots, y_q]^T, \tag{C.29}
\]
and consider computing the correlation between x and y. One way is to compute correlation coefficients between all combinations of \((x_i, y_j)\). However, this gives a total of \(p \times q\) correlation coefficients, and the interpretation of these results may not be easy. The canonical correlation method first projects the column vectors x and y onto the