But
$$v^{T} A^{T} \left( I - b\,(b^{T} b)^{-1} b^{T} \right) A\, v
= \left\| \left( I - b\,(b^{T} b)^{-1} b^{T} \right) A\, v \right\|_{2}^{2}
= \left\| P_{b}\, A\, v \right\|_{2}^{2} \qquad (5.31)$$
So $v_{\min}$ is the right singular vector corresponding to the smallest singular value of the matrix $P_{b} A$, where $P_{b} = I - b\,(b^{T} b)^{-1} b^{T}$ is a projection matrix that projects the column space of $A$ into the orthogonal complement of $b$ (see Theorem 37 and [51]). Reversing the transformation yields
$$x_{\min} = \frac{v_{\min}}{r^{T} v_{\min}}
= \left[ (b^{T} b)^{-1} b^{T} A\, v_{\min} \right]^{-1} v_{\min} \qquad (5.32)$$
This is the DLS solution (1.55).
This theorem confirms the validity of eq. (5.21) as the error function for a
DLS neuron.
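The closed-form solution above can be checked numerically. The following sketch (on hypothetical random data, not an example from the text) builds the projector $P_b$, extracts the right singular vector of $P_b A$ for the smallest singular value, reverses the transformation as in eq. (5.32), and verifies that the result is a minimizer of the DLS cost of eq. (5.35):

```python
import numpy as np

# Hypothetical small DLS problem A x ~ b (random data for illustration only)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 4))
b = rng.standard_normal(20)

# P_b = I - b (b^T b)^{-1} b^T projects onto the orthogonal complement of b
P_b = np.eye(20) - np.outer(b, b) / (b @ b)

# Right singular vector of P_b A for the smallest singular value (eq. 5.31)
_, s, Vt = np.linalg.svd(P_b @ A)
v_min = Vt[-1]

# Reverse the transformation (eq. 5.32): x = v / ((b^T b)^{-1} b^T A v)
x_dls = (b @ b) * v_min / (b @ A @ v_min)

def e_dls(x):
    """DLS cost of eq. (5.35): (Ax - b)^T (Ax - b) / (2 x^T x)."""
    r = A @ x - b
    return 0.5 * (r @ r) / (x @ x)

# At the minimizer, the cost equals half the squared smallest singular value
base = e_dls(x_dls)
assert np.isclose(2.0 * base, s[-1] ** 2)

# Random perturbations of the minimizer never decrease the cost
for _ in range(100):
    assert e_dls(x_dls + 0.01 * rng.standard_normal(4)) >= base - 1e-9
```

The check `2 * base == s[-1] ** 2` reflects the fact that the minimum of the DLS Rayleigh quotient is the squared smallest singular value of $P_b A$.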
Remark 98 (DLS Null Initial Conditions) As shown in eq. (5.21), the DLS error cost is not defined for null initial conditions. To allow this choice of initial conditions, which gives the neuron better properties and convergence, DLS scheduling is introduced (discussed in Section 5.5).
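The problem with null initial conditions is visible directly in the cost: the denominator $x^{T} x$ of eq. (5.35) vanishes at $x = 0$, so the cost is undefined there and grows without bound as the weights approach zero. A minimal sketch on hypothetical data:

```python
import numpy as np

# Illustration (hypothetical data): the DLS cost (5.35) divides by x^T x,
# so it is undefined at x = 0 and diverges as x -> 0 whenever b != 0.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

def e_dls(x):
    r = A @ x - b
    return 0.5 * (r @ r) / (x @ x)

# Evaluate the cost along a sequence of weights shrinking toward zero
costs = [e_dls(np.array([t, t])) for t in (1e-1, 1e-3, 1e-5)]
assert costs[0] < costs[1] < costs[2]  # cost blows up as x -> 0
```

This is why a DLS neuron cannot simply be initialized at the origin, motivating the scheduling scheme of Section 5.5.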
5.2.5 Error Functions: A Summary
Solving the OLS problem requires minimization of the cost function:
$$E_{\mathrm{OLS}}(x) = \tfrac{1}{2}\, (Ax - b)^{T} (Ax - b) \qquad (5.33)$$
The TLS solution minimizes the sum of the squares of the orthogonal distances
(weighted squared residuals):
$$E_{\mathrm{TLS}}(x) = \frac{(Ax - b)^{T} (Ax - b)}{1 + x^{T} x} \qquad (5.34)$$
DLS requires minimization of the cost function:
$$E_{\mathrm{DLS}}(x) = \frac{1}{2}\, \frac{(Ax - b)^{T} (Ax - b)}{x^{T} x} \qquad (5.35)$$
These three error functions derive from the GeTLS error function (5.6) for the values $\zeta = 0$, $0.5$, and $1$, respectively.
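The unification can be checked numerically. The sketch below assumes the GeTLS cost has the form $E(x) = (Ax-b)^{T}(Ax-b)\,/\,(2[(1-\zeta)+\zeta\, x^{T}x])$, a form inferred here from the three stated specializations rather than quoted from eq. (5.6); the data are hypothetical:

```python
import numpy as np

# Assumed GeTLS-style cost, inferred from the specializations at
# zeta = 0 (OLS), 0.5 (TLS), and 1 (DLS); not quoted from eq. (5.6).
def e_getls(A, b, x, zeta):
    r = A @ x - b
    return (r @ r) / (2.0 * ((1.0 - zeta) + zeta * (x @ x)))

def e_ols(A, b, x):                       # eq. (5.33)
    r = A @ x - b
    return 0.5 * (r @ r)

def e_tls(A, b, x):                       # eq. (5.34)
    r = A @ x - b
    return (r @ r) / (1.0 + x @ x)

def e_dls(A, b, x):                       # eq. (5.35)
    r = A @ x - b
    return 0.5 * (r @ r) / (x @ x)

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 3))
b = rng.standard_normal(10)
x = rng.standard_normal(3)

# The three error functions are the zeta = 0, 0.5, 1 slices of the GeTLS cost
assert np.isclose(e_getls(A, b, x, 0.0), e_ols(A, b, x))
assert np.isclose(e_getls(A, b, x, 0.5), e_tls(A, b, x))
assert np.isclose(e_getls(A, b, x, 1.0), e_dls(A, b, x))
```

Note that for $\zeta = 0.5$ the factor 2 in the denominator cancels against the two halves, giving exactly the $1 + x^{T}x$ denominator of eq. (5.34).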
5.2.6 GeTLS EXIN MADALINE
The GeTLS EXIN adaptive linear neuron (ADALINE) can also be applied to multidimensional problems $AX \approx B$, with $B \in \mathbb{R}^{m \times d}$. In this case, $d$ neurons