an MCA algorithm can replace the GeTLS EXIN algorithm by using matrix $K$ as the autocorrelation matrix. The estimated minor component must be scaled by using eq. (6.22) and normalized by constraining its last component to be equal to 1.
Choosing $K$ as the autocorrelation matrix implies that the MCA EXIN neuron is fed by input vectors $m_i$, defined as
$$
m_i = \begin{bmatrix} \dfrac{a_i}{\sqrt{2\zeta}} \\[8pt] \dfrac{b_i}{\sqrt{2(1-\zeta)}} \end{bmatrix}
\qquad (6.23)
$$
$a_i$ being the column vector representing the $i$th row of matrix $A$. The MCA EXIN neuron whose input is preprocessed by means of eq. (6.23) is called the GeMCA EXIN neuron.
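As a quick numerical illustration of this preprocessing, the sketch below builds the vectors $m_i$ of eq. (6.23) for a small random problem and checks the block structure of the resulting autocorrelation matrix. The data $A$, $b$ and the value of $\zeta$ are hypothetical, and the plain sum $\sum_i m_i m_i^T$ is used as the autocorrelation matrix (whether eq. (6.17) includes an averaging factor is outside this excerpt).

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 4          # hypothetical problem size
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
zeta = 0.5            # zeta = 1/2 corresponds to the TLS case

# Preprocess each row [a_i; b_i] according to eq. (6.23):
# m_i = [a_i / sqrt(2*zeta); b_i / sqrt(2*(1 - zeta))]
M = np.hstack([A / np.sqrt(2 * zeta),
               b[:, None] / np.sqrt(2 * (1 - zeta))])

# Autocorrelation matrix of the preprocessed inputs: sum_i m_i m_i^T
K = M.T @ M

# Block structure of K implied by the scaling above
s = 2 * np.sqrt(zeta * (1 - zeta))
K_expected = np.block([
    [A.T @ A / (2 * zeta), A.T @ b[:, None] / s],
    [b[None, :] @ A / s,   np.array([[b @ b / (2 * (1 - zeta))]])],
])
assert np.allclose(K, K_expected)
```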
6.2 ANALYSIS OF MATRIX K
Matrix K can be decomposed [9, Prop. 2.8.3, p. 43] as
$$
2K =
\begin{bmatrix} I_n & 0 \\[4pt] \sqrt{\dfrac{\zeta}{1-\zeta}}\,\big(A^{+}b\big)^{T} & 1 \end{bmatrix}
\begin{bmatrix} \dfrac{A^{T}A}{\zeta} & 0 \\[4pt] 0^{T} & \dfrac{\big\|b_{\perp}\big\|_{2}^{2}}{1-\zeta} \end{bmatrix}
\begin{bmatrix} I_n & \sqrt{\dfrac{\zeta}{1-\zeta}}\,A^{+}b \\[4pt] 0^{T} & 1 \end{bmatrix}
$$
$$
=
\begin{bmatrix} I_n & 0 \\[4pt] \sqrt{\dfrac{\zeta}{1-\zeta}}\,x_{OLS}^{T} & 1 \end{bmatrix}
\begin{bmatrix} \dfrac{A^{T}A}{\zeta} & 0 \\[4pt] 0^{T} & \dfrac{\big\|b_{\perp}\big\|_{2}^{2}}{1-\zeta} \end{bmatrix}
\begin{bmatrix} I_n & \sqrt{\dfrac{\zeta}{1-\zeta}}\,x_{OLS} \\[4pt] 0^{T} & 1 \end{bmatrix}
= S^{T} K_{1} S
\qquad (6.24)
$$
where
$$
b_{\perp} = P_{A}^{\perp}\, b = \left(I - A\left(A^{T}A\right)^{-1}A^{T}\right) b
\qquad (6.25)
$$
represents the component of $b$ orthogonal to the column space of $A$. Matrix $P_{A}^{\perp}$ is the corresponding orthogonal projection matrix. Then, as seen before, $\|b_{\perp}\|_{2}^{2}$ is the sum of squares of the OLS residuals. It can be deduced that matrix $K$ is congruent to matrix $K_{1}$ and then, by Sylvester's law of inertia, it inherits the same inertia [i.e., $K$ is positive semidefinite, which is also evident from eq. (6.17)].
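The congruence $2K = S^{T} K_{1} S$ can be checked numerically. The sketch below is a minimal verification, assuming $2K$ has the block form implied by the preprocessing of eq. (6.23); the data $A$, $b$ and $\zeta$ are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 20, 4
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
zeta = 0.3

# 2K assembled from its blocks (block form implied by eq. (6.23))
s = np.sqrt(zeta * (1 - zeta))
K2 = np.block([
    [A.T @ A / zeta,     A.T @ b[:, None] / s],
    [b[None, :] @ A / s, np.array([[b @ b / (1 - zeta)]])],
])

# Decomposition (6.24): 2K = S^T K1 S, with S upper block-triangular
x_ols = np.linalg.lstsq(A, b, rcond=None)[0]   # A^+ b
b_perp = b - A @ x_ols                          # P_A^perp b, eq. (6.25)
c = np.sqrt(zeta / (1 - zeta))
S = np.block([
    [np.eye(n),         c * x_ols[:, None]],
    [np.zeros((1, n)),  np.ones((1, 1))],
])
K1 = np.block([
    [A.T @ A / zeta,    np.zeros((n, 1))],
    [np.zeros((1, n)),  np.array([[b_perp @ b_perp / (1 - zeta)]])],
])
assert np.allclose(K2, S.T @ K1 @ S)

# Congruence preserves inertia (Sylvester): K1 >= 0 implies K >= 0
assert np.min(np.linalg.eigvalsh(K2)) >= -1e-9
```

The check relies on $A^{T}A\,x_{OLS} = A^{T}b$ (the normal equations), which is exactly what makes the off-diagonal blocks of $S^{T}K_{1}S$ reproduce those of $2K$.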
This analysis also yields the value of the determinant of matrix K :
$$
\det K = \det\left(\frac{A^{T}A}{2\zeta}\right)\frac{\big\|b_{\perp}\big\|_{2}^{2}}{2(1-\zeta)}
= \frac{\det\left(A^{T}A\right)\,\big\|b_{\perp}\big\|_{2}^{2}}{2^{\,n+1}\,\zeta^{n}\,(1-\zeta)}
\qquad (6.26)
$$
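The determinant formula can also be verified on a small random instance. The sketch below assumes the same hypothetical block form of $K$ as above; data and $\zeta$ are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 15, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
zeta = 0.4

# K assembled from the block form implied by eq. (6.23)
s = 2 * np.sqrt(zeta * (1 - zeta))
K = np.block([
    [A.T @ A / (2 * zeta), A.T @ b[:, None] / s],
    [b[None, :] @ A / s,   np.array([[b @ b / (2 * (1 - zeta))]])],
])

# Right-hand side of eq. (6.26)
x_ols = np.linalg.lstsq(A, b, rcond=None)[0]
b_perp = b - A @ x_ols
rhs = np.linalg.det(A.T @ A) * (b_perp @ b_perp) \
      / (2 ** (n + 1) * zeta ** n * (1 - zeta))
assert np.allclose(np.linalg.det(K), rhs)
```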
By looking at matrix $K$ [see eq. (6.17)] and identifying the principal submatrix $A^{T}A/(2\zeta)$, by means of the interlacing theorem it follows for $i = 1, \ldots, n$ that
$$
\alpha_{i+1} \;\leq\; \frac{\sigma_{i}^{2}}{2\zeta} \;\leq\; \alpha_{i}
\qquad (6.27)
$$
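The interlacing property can be checked directly, taking the $\alpha_i$ as the eigenvalues of $K$ in descending order and the $\sigma_i$ as the singular values of $A$. The sketch below again assumes the hypothetical block form of $K$ used above.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 12, 4
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
zeta = 0.25

s = 2 * np.sqrt(zeta * (1 - zeta))
K = np.block([
    [A.T @ A / (2 * zeta), A.T @ b[:, None] / s],
    [b[None, :] @ A / s,   np.array([[b @ b / (2 * (1 - zeta))]])],
])

# Eigenvalues of K in descending order: alpha_1 >= ... >= alpha_{n+1}
alpha = np.sort(np.linalg.eigvalsh(K))[::-1]
# Eigenvalues of the principal submatrix A^T A / (2*zeta): sigma_i^2 / (2*zeta)
sub = np.sort(np.linalg.svd(A, compute_uv=False))[::-1] ** 2 / (2 * zeta)

# Cauchy interlacing, eq. (6.27): alpha_{i+1} <= sigma_i^2/(2*zeta) <= alpha_i
tol = 1e-9
for i in range(n):
    assert alpha[i + 1] <= sub[i] + tol and sub[i] <= alpha[i] + tol
```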