$$\frac{\partial L}{\partial \boldsymbol{\alpha}} = \boldsymbol{\Pi}_{xy}\,\boldsymbol{\beta} - \lambda_1\,\boldsymbol{\alpha} = \mathbf{0}, \tag{C.36}$$

$$\frac{\partial L}{\partial \boldsymbol{\beta}} = \boldsymbol{\Pi}_{xy}^{T}\,\boldsymbol{\alpha} - \lambda_2\,\boldsymbol{\beta} = \mathbf{0}, \tag{C.37}$$

where $\boldsymbol{\Pi}_{xy} = \boldsymbol{\Sigma}_{xx}^{-1/2}\,\boldsymbol{\Sigma}_{xy}\,\boldsymbol{\Sigma}_{yy}^{-1/2}$. Using the constraint $\boldsymbol{\alpha}^{T}\boldsymbol{\alpha} = 1$ and left-multiplying $\boldsymbol{\alpha}^{T}$ to Eq. (C.36) gives

$$\boldsymbol{\alpha}^{T}\boldsymbol{\Pi}_{xy}\,\boldsymbol{\beta} = \lambda_1.$$
Using the constraint $\boldsymbol{\beta}^{T}\boldsymbol{\beta} = 1$ and left-multiplying $\boldsymbol{\beta}^{T}$ to Eq. (C.37) gives

$$\boldsymbol{\beta}^{T}\boldsymbol{\Pi}_{xy}^{T}\,\boldsymbol{\alpha} = \lambda_2.$$

Since the left-hand sides of the two equations above are equal, we have $\lambda_1 = \lambda_2 = \rho_c$.
Also, using Eq. (C.37), we get

$$\boldsymbol{\beta} = \frac{1}{\rho_c}\,\boldsymbol{\Pi}_{xy}^{T}\,\boldsymbol{\alpha},$$

and substituting the equation above into Eq. (C.36), we get

$$\boldsymbol{\Pi}_{xy}\boldsymbol{\Pi}_{xy}^{T}\,\boldsymbol{\alpha} = \rho_c^{2}\,\boldsymbol{\alpha}. \tag{C.38}$$
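As a numerical check of Eq. (C.38), the sketch below (plain NumPy; the toy data and variable names are illustrative, not from the text) forms $\boldsymbol{\Pi}_{xy}$ from sample covariance blocks, takes the top eigenpair of $\boldsymbol{\Pi}_{xy}\boldsymbol{\Pi}_{xy}^{T}$, recovers $\boldsymbol{\beta}$ from Eq. (C.37), and confirms that $\sqrt{\mu_1}$ equals the correlation of the projected variables $\boldsymbol{a}^{T}\boldsymbol{x}$ and $\boldsymbol{b}^{T}\boldsymbol{y}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: x (p=3) and y (q=2) share a latent component z,
# so the canonical correlation is well away from zero.
n, p, q = 5000, 3, 2
z = rng.standard_normal(n)
x = rng.standard_normal((n, p)) + np.outer(z, np.ones(p))
y = rng.standard_normal((n, q)) + np.outer(z, np.ones(q))

# Sample covariance blocks Sigma_xx, Sigma_yy, Sigma_xy.
C = np.cov(np.hstack([x, y]).T)
Sxx, Syy, Sxy = C[:p, :p], C[p:, p:], C[:p, p:]

def inv_sqrt(S):
    """Inverse matrix square root of a symmetric positive-definite S."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

# Pi_xy = Sigma_xx^{-1/2} Sigma_xy Sigma_yy^{-1/2}
Pi = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)

# Eq. (C.38): eigenvalues of Pi Pi^T are squared canonical correlations.
mu, alphas = np.linalg.eigh(Pi @ Pi.T)   # eigh returns ascending order
rho_c = np.sqrt(mu[-1])                  # canonical correlation
alpha = alphas[:, -1]

# Eq. (C.37) gives beta; map alpha, beta back to a, b.
beta = Pi.T @ alpha / rho_c
a, b = inv_sqrt(Sxx) @ alpha, inv_sqrt(Syy) @ beta

# Check: the correlation of the projected variables equals rho_c.
u, v = x @ a, y @ b
r = np.corrcoef(u, v)[0, 1]
print(rho_c, r)  # the two values agree up to numerical error
```

Because $\boldsymbol{\beta}$ is taken from Eq. (C.37) rather than from a separate eigen-decomposition, the sign ambiguity of the eigenvector cancels and the reported correlation is positive.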
This equation indicates that the squared canonical correlation $\rho_c^{2}$ is obtained as an eigenvalue of the matrix $\boldsymbol{\Pi}_{xy}\boldsymbol{\Pi}_{xy}^{T}$, and the corresponding eigenvector is the solution for the vector $\boldsymbol{\alpha}$. Note that, since $\boldsymbol{\Pi}_{xy}\boldsymbol{\Pi}_{xy}^{T}$ is a real symmetric matrix, the eigenvectors $\boldsymbol{\alpha}$ and $\boldsymbol{\beta}$ are real-valued, and therefore $\boldsymbol{a}$ and $\boldsymbol{b}$ are real-valued because $\boldsymbol{\Sigma}_{xx}$ and $\boldsymbol{\Sigma}_{yy}$ are real-valued matrices.
Denoting the eigenvalues in Eq. (C.38) as $\mu_j$, where $j = 1, \ldots, d$ and $d = \min\{p, q\}$, the canonical correlation between the two sets of random variables $x_1, \ldots, x_p$ and $y_1, \ldots, y_q$ is obtained as the square root of the largest eigenvalue $\mu_1$, which is the best overall measure of the association between $\boldsymbol{x}$ and $\boldsymbol{y}$. However, the other eigenvalues may provide complementary information on the linear relationship between those two sets of random variables. The mutual information described in Sect. C.3.2 is a measure that can take all the eigenvalues into account.

Also, it is worth mentioning that the matrices

$$\boldsymbol{\Pi}_{xy}\boldsymbol{\Pi}_{xy}^{T} = \boldsymbol{\Sigma}_{xx}^{-1/2}\,\boldsymbol{\Sigma}_{xy}\,\boldsymbol{\Sigma}_{yy}^{-1}\,\boldsymbol{\Sigma}_{xy}^{T}\,\boldsymbol{\Sigma}_{xx}^{-1/2} \quad\text{and}\quad \boldsymbol{\Sigma}_{xx}^{-1}\,\boldsymbol{\Sigma}_{xy}\,\boldsymbol{\Sigma}_{yy}^{-1}\,\boldsymbol{\Sigma}_{xy}^{T} \tag{C.39}$$

share the same eigenvalues, because the second matrix is obtained from the first by the similarity transform $\boldsymbol{\Sigma}_{xx}^{-1/2}\,(\cdot)\,\boldsymbol{\Sigma}_{xx}^{1/2}$.
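The two matrices in Eq. (C.39) are related by a similarity transform, so their spectra coincide even though only the first is symmetric. The short sketch below (illustrative NumPy, with a random sample covariance not taken from the text) verifies this numerically:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random covariance blocks (p=3, q=2) built from sample data.
n, p, q = 200, 3, 2
xy = rng.standard_normal((n, p + q))
C = np.cov(xy.T)
Sxx, Syy, Sxy = C[:p, :p], C[p:, p:], C[:p, p:]

def inv_sqrt(S):
    """Inverse matrix square root of a symmetric positive-definite S."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

# The two matrices in Eq. (C.39):
A = inv_sqrt(Sxx) @ Sxy @ np.linalg.inv(Syy) @ Sxy.T @ inv_sqrt(Sxx)
B = np.linalg.inv(Sxx) @ Sxy @ np.linalg.inv(Syy) @ Sxy.T

# B = Sxx^{-1/2} A Sxx^{1/2} is a similarity transform of A,
# so the two spectra coincide (B's eigenvalues are real despite
# B being nonsymmetric).
eig_A = np.sort(np.linalg.eigvalsh(A))
eig_B = np.sort(np.linalg.eigvals(B).real)
print(eig_A, eig_B)  # identical up to numerical error
```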