(b) Compute:

    E_GeTLS(t + 1) = ( (Ax(t))^T (Ax(t)) ) / ( 2 [ (1 − ζ(t)) + ζ(t) x^T(t) x(t) ] )
                   = (γ(t)/2) (Ax(t))^T (Ax(t))

where γ(t) = [ (1 − ζ(t)) + ζ(t) x^T(t) x(t) ]^(−1).
(c) If |E_GeTLS(t + 1) − E_GeTLS(t)| ≤ ε for a certain number of consecutive iterations (e.g., 100), then STOP.
(d) If t > t_max, then STOP.
(e) Update η(t) and ζ(t).
Note: ζ(t) must be equal to 1 well before the algorithm is stopped.
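Steps (b)-(e) above can be sketched as a simple outer loop. The sketch below uses a plain gradient step on E_GeTLS as a stand-in for the EXIN learning law (which the text defines elsewhere), a linear ramp for ζ(t) that reaches 1 well before the stop as the note requires, and a fixed learning rate η; all of these scheduling choices are assumptions for illustration:

```python
import numpy as np

def getls_cost(A, x, zeta):
    """Step (b): E_GeTLS = (Ax)^T (Ax) / (2 [(1 - zeta) + zeta x^T x])."""
    r = A @ x
    return float(r @ r) / (2.0 * ((1.0 - zeta) + zeta * float(x @ x)))

def getls_schedule(A, x0, t_max=2000, eps=1e-9, patience=50, eta=0.05):
    """Sketch of the outer loop, steps (b)-(e). The weight update is a
    plain gradient step on E_GeTLS (a stand-in for the EXIN learning
    law, not the law itself); eta and the zeta ramp are assumptions."""
    x = np.asarray(x0, dtype=float).copy()
    E_prev, flat = None, 0
    for t in range(t_max):                  # (d): t > t_max -> STOP
        zeta = min(1.0, t / 50.0)           # (e): zeta reaches 1 early
        denom = (1.0 - zeta) + zeta * float(x @ x)
        r = A @ x
        grad = (A.T @ r) / denom - (float(r @ r) * zeta / denom**2) * x
        x -= eta * grad                     # stand-in update step
        E = getls_cost(A, x, zeta)          # (b): cost at the new weights
        if E_prev is not None and abs(E - E_prev) <= eps:
            flat += 1                       # (c): cost flat long enough -> STOP
            if flat >= patience:
                break
        else:
            flat = 0
        E_prev = E
    return x, E

# Toy homogeneous problem: the weights align with the minor direction of A,
# and the final cost approaches (smallest singular value)^2 / 2.
A = np.array([[2.0, 0.0], [0.0, 0.5]])
x, E = getls_schedule(A, np.array([1.0, 1.0]))
```

With ζ = 1 at convergence, E_GeTLS reduces to the Rayleigh quotient (Ax)^T(Ax)/(2 x^T x), so the loop settles on the minor component of A.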
5.7 FURTHER CONSIDERATIONS
5.7.1 Homogeneous Linear Systems
Given an approximated homogeneous linear system in the form

    Uz ≈ 0        (5.141)

where U ∈ ℝ^(m×n), the solution z (approximate linear relation between the columns of U) can be obtained by solving the optimization problem called the orthogonal L2 approximation problem [98, p. 45; 173; 188].
Definition 129 (Orthogonal L2 Approximation Problem) Given the data matrix U ∈ ℝ^(m×n), the orthogonal L2 approximation problem seeks

    min_{z ∈ ℝ^n} ‖Uz‖_2   subject to   z^T z = 1        (5.142)
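Numerically, the minimizer of (5.142) is the right singular vector of U associated with its smallest singular value. A minimal NumPy sketch on synthetic data (the matrix size, the planted column relation, and the noise level are assumptions for illustration):

```python
import numpy as np

# Synthetic U whose third column is approximately col0 + 2*col1 plus small
# noise, so an approximate linear relation Uz ≈ 0 exists among the columns.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 2))
U = np.column_stack([B, B @ np.array([1.0, 2.0])
                        + 1e-3 * rng.standard_normal(50)])

# Minimizer of (5.142): the right singular vector of U for the smallest
# singular value (equivalently, the eigenvector of U^T U for the smallest
# eigenvalue).  Rows of Vt are orthonormal, so z^T z = 1 automatically.
_, s, Vt = np.linalg.svd(U, full_matrices=False)
z = Vt[-1]
residual = np.linalg.norm(U @ z)   # equals s[-1], the smallest singular value
```

Rescaling z so that its last entry is 1 recovers the planted relation col0 + 2*col1 − col2 ≈ 0 up to the noise level.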
The constraint on z is needed to avoid the trivial solution z = 0. The solution to this problem is given by the eigenvector of U^T U associated with the smallest eigenvalue (i.e., the right singular vector of U associated with the smallest singular value). Hence, in the framework of the theory presented in this chapter, the problem can be solved by using the following three neurons (with the usual assumptions on the additive noise in the coefficients of U):
1. TLS EXIN. Taking as A (data matrix) n − 1 columns of U and as b (observation vector) the remaining column [recall the equivalence between the TLS problem and the homogeneous system (1.14) with the corresponding solution as the intersection of the "minimum" right singular vector of