Figure 3.5 First example for MSA EXIN: eigenvalue estimation.
The initial conditions are $w_1(0) = e_1$ and $w_2(0) = e_2$, where $e_i$ is the coordinate vector with all zeros except the $i$th element, which is 1. MSA EXIN yields, after 1000 iterations, the following estimations:
$$\lambda_1(1000) = \varphi_1(1000) = 1.0510, \qquad w_1(1000) = [0.9152,\ 0.0278,\ 0.0054,\ 0.4215]^T$$
$$\lambda_2(1000) = \varphi_2(1000) = 1.1077, \qquad w_2(1000) = [0.0185,\ 0.7165,\ 0.7129,\ 0.0407]^T$$
which are in very good agreement with the true values. Define $P_{ij} = z_i^T w_j / (\| z_i \|_2 \| w_j \|_2)$. Figure 3.5 is a temporal plot of the eigenvalue estimation.
Note the loss of step, which begins after about 3000 iterations. After 6000
iterations the two computed eigenvalues are equal because the two weight
vectors, $w_1$ and $w_2$, are parallel. This can also be seen clearly in Figure 3.6, which is a plot of $P_{12}$. Figures 3.7 and 3.8 show the accurate recovery of
the minor subspace. Figure 3.9 shows the eigenvalue estimation when the experimental setup of Section 2.6.2.4 is used (i.e., the true solutions are taken
as initial conditions; second example). The plot shows both that the loss of step
happens after the minor component direction is reached and that there is no
sudden divergence.
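The quantities plotted in these figures are simple functions of the weight vectors. The following is a minimal NumPy sketch, not the book's implementation: the autocorrelation matrix `R`, the perturbed weight vectors, and the helper names `rayleigh_quotient` and `direction_cosine` are illustrative assumptions; only the two formulas themselves (the Rayleigh quotient as eigenvalue estimate and the normalized inner product $P_{ij}$) come from the text.

```python
import numpy as np

def rayleigh_quotient(R, w):
    """Eigenvalue estimate associated with a weight vector w: w^T R w / (w^T w)."""
    return float(w @ R @ w) / float(w @ w)

def direction_cosine(a, b):
    """P_ij-style normalized inner product a^T b / (||a||_2 ||b||_2);
    its magnitude approaches 1 when the two vectors become parallel."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Placeholder 4x4 autocorrelation matrix (NOT the data of the book's experiment).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 4))
R = A.T @ A / 50.0

# Reference eigenpairs: eigh returns eigenvalues in ascending order,
# so z[:, 0] and z[:, 1] span the minor subspace.
eigvals, z = np.linalg.eigh(R)

# Weight vectors as they might look near convergence (small perturbations
# of the true minor eigenvectors, purely for illustration).
w1 = z[:, 0] + 0.05 * rng.standard_normal(4)
w2 = z[:, 1] + 0.05 * rng.standard_normal(4)

print("lambda_1 estimate:", rayleigh_quotient(R, w1), "true:", eigvals[0])
print("lambda_2 estimate:", rayleigh_quotient(R, w2), "true:", eigvals[1])
print("P_12 (z_1 vs w_2):", direction_cosine(z[:, 0], w2))
```

Monitoring these two scalars over the iterations reproduces the kind of curves shown in Figures 3.5 and 3.6: the eigenvalue estimates settle near the true minor eigenvalues, and $|P_{12}|$ rising toward 1 signals the loss of step.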
MSA LUO has the same learning rate as MSA EXIN and initial conditions $w_1(0) = e_1$ and $w_2(0) = e_2$. MSA LUO yields, after 1000 iterations, the following estimations:
$$\lambda_1(1000) = \varphi_1(1000) = 1.0508, \qquad w_1(1000) = [0.9154,\ 0.0219,\ 0.0160,\ 0.4362]^T$$
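As a quick sanity check (not part of the original experiment), the two $w_1(1000)$ vectors reported above for MSA EXIN and MSA LUO can be compared with the same normalized inner product; a value close to 1 means that both algorithms converged to essentially the same minor direction.

```python
import numpy as np

# w1(1000) as reported in the text for MSA EXIN and MSA LUO.
w1_exin = np.array([0.9152, 0.0278, 0.0054, 0.4215])
w1_luo = np.array([0.9154, 0.0219, 0.0160, 0.4362])

# Normalized inner product between the two estimated minor directions.
p = float(w1_exin @ w1_luo) / (np.linalg.norm(w1_exin) * np.linalg.norm(w1_luo))
print(f"P(w1_EXIN, w1_LUO) = {p:.4f}")  # expected to be close to 1
```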