Figure 2.6  Asymptotic behavior of the OJA+ ODE for different values (circles) of the initial conditions (curves labeled λ_min > 1 and λ_min < 1, plotted against time t starting at t_0).
whose solution is given by

    p = p_0 / [ p_0 + (1 − p_0) e^{2(λ_n − 1)(t − t_0)} ]        (2.121)
The change in the time constant of the exponential with respect to OJA implies
a fundamental difference. Indeed, if λ_n > 1, OJA+ behaves as OJA (the OJA
expressions are valid here simply by replacing λ_n with λ_n − 1). If λ_n = 1, p = p_0.
If λ_n < 1, p → 1, as expected from Theorem 48. Figure 2.6 summarizes the
results for different values of p_0.
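The three regimes can be checked numerically against the closed-form solution (2.121). The sketch below is only an illustration of the formula as reconstructed here, with λ_n, p_0, and t_0 as in the text:

```python
import math

def p(t, lam_n, p0, t0=0.0):
    # Solution (2.121): p(t) = p0 / (p0 + (1 - p0) * exp(2*(lam_n - 1)*(t - t0)))
    return p0 / (p0 + (1.0 - p0) * math.exp(2.0 * (lam_n - 1.0) * (t - t0)))

p0 = 0.3
# lam_n < 1: p converges to 1
print(abs(p(50.0, 0.5, p0) - 1.0) < 1e-6)   # True
# lam_n = 1: the exponent vanishes and p stays at p0
print(abs(p(50.0, 1.0, p0) - p0) < 1e-12)   # True
# lam_n > 1: p decays to 0, the OJA-like behavior
print(p(50.0, 1.5, p0) < 1e-6)              # True
```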
In summary: OJA+ is not divergent and does not suffer from the sudden
divergence, but, unfortunately, it requires the assumption that the smallest eigenvalue of the autocorrelation matrix of the input data is less than unity. If this cannot
be assumed in advance (e.g., for noisy data), OJA+ may suffer from the sudden
divergence.
2.6.2.4 Simulation Results for the MCA Divergence  The previous analysis is illustrated by using, as a benchmark, the example in [124, pp. 294-295] and [181]. A zero-mean Gaussian random vector x(t) is generated with the covariance matrix
        ⎡ 0.4035  0.2125  0.0954 ⎤
    R = ⎢ 0.2125  0.3703  0.2216 ⎥        (2.122)
        ⎣ 0.0954  0.2216  0.4159 ⎦
and taken as an input vector. The learning rate is given by α(t) = const = 0.01.
The algorithms are initialized at the true solution; that is, w(0) = [0.4389  −0.7793  0.4473]^T
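This benchmark can be reproduced numerically. The sketch below is an assumption-laden illustration, not the book's code: it takes the OJA+ update in the form w ← w − α[y x − (y² + 1 − ‖w‖²) w] with y = wᵀx (a common form of the rule in the MCA literature, not quoted from this section), draws Gaussian samples with covariance R via a Cholesky factor, and verifies the stability condition λ_min(R) < 1 discussed above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariance matrix (2.122) of the benchmark input
R = np.array([[0.4035, 0.2125, 0.0954],
              [0.2125, 0.3703, 0.2216],
              [0.0954, 0.2216, 0.4159]])

# Stability condition for OJA+: smallest eigenvalue of R below unity
eigvals, eigvecs = np.linalg.eigh(R)      # eigenvalues in ascending order
v_min = eigvecs[:, 0]                     # minor component of R
print(eigvals[0] < 1.0)                   # True

# Zero-mean Gaussian input with covariance R, via the Cholesky factor
L = np.linalg.cholesky(R)
alpha = 0.01                              # constant learning rate, as in the text

w = np.array([0.4389, -0.7793, 0.4473])   # initialized at the true solution
for _ in range(20000):
    x = L @ rng.standard_normal(3)
    y = w @ x
    # Assumed OJA+ step: OJA plus the norm-stabilizing term (1 - ||w||^2) w
    w = w - alpha * (y * x - (y**2 + 1.0 - w @ w) * w)

# The weight norm stays bounded (no sudden divergence) and w stays
# aligned with the minor component; print both quantities to inspect
print(np.linalg.norm(w))
print(abs(w @ v_min) / np.linalg.norm(w))
```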