implies that the MCA solution (TLS) is not optimal (see [98]); a robust version of MCA EXIN (NMCA EXIN) that overcomes this problem is presented in the next section.
3.2 ROBUST MCA EXIN NONLINEAR NEURON (NMCA EXIN)
As shown in Section 1.10, the TLS criterion (which gives a solution parallel to the minor eigenvector) is not optimal if the data errors are impulsive noise or colored noise with unknown covariance, or if strong outliers exist. Instead of using the TLS criterion, Oja and Wang [143–145] suggest an alternative criterion that minimizes the sum of certain loss functions of the orthogonal distance:
J_f(w) = \sum_{i=1}^{N} f\!\left(\frac{w^T x(i)}{\|w\|_2}\right) = \sum_{i=1}^{N} f(d_i), \qquad w \neq 0 \tag{3.2}
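A minimal numerical sketch of criterion (3.2), assuming the logistic loss introduced later in this section; the data matrix and weight vectors below are illustrative toys, not taken from the text:

```python
import numpy as np

def logistic_loss(y, beta=1.0):
    # f(y) = (1/beta) ln cosh(beta*y): grows roughly linearly for large |y|,
    # unlike the square function used by plain TLS
    return np.log(np.cosh(beta * y)) / beta

def robust_criterion(w, X, loss=logistic_loss):
    # J_f(w) = sum_i f(d_i), where d_i = w^T x(i) / ||w||_2 is the signed
    # orthogonal distance of sample x(i) from the hyperplane w^T x = 0
    d = X @ w / np.linalg.norm(w)
    return loss(d).sum()

# toy data: points scattered near the line x2 = x1,
# so w proportional to (1, -1) should give a low criterion value
X = np.array([[1.0, 1.1], [2.0, 1.9], [3.0, 3.05], [4.0, 3.9]])
w_good = np.array([1.0, -1.0])
w_bad = np.array([1.0, 1.0])
j_good = robust_criterion(w_good, X)
j_bad = robust_criterion(w_bad, X)
```

Minimizing this sum over unit-norm w yields the robust minor component defined below.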
This is the TLS version of robust LS regression. In robust LS, the vertical distances are used instead of the perpendicular signed distances d_i. Solutions for the weights are among the M-estimates, the most popular of the robust regression varieties [91].
Definition 81 (Robust MCA) The weight vector that minimizes the criterion J_f(w) is called a robust minor component.
The loss function f must be nonnegative, convex with its minimum at zero, increasing less rapidly than the square function, and even [91]. Convexity prevents multiple minimum points, and the slow increase dampens the effect of strong outliers on the solution. The derivative of the loss function, g(y) = df(y)/dy, is called the influence function [80]. Depending on the choice of the loss function, several different learning laws are obtained. The most important loss functions are [21]:
Logistic function:

f(y) = \frac{1}{\beta} \ln\cosh(\beta y) \tag{3.3}

with corresponding influence function

g(y) = \tanh(\beta y) \tag{3.4}
Absolute value function:

f(y) = |y| \tag{3.5}

with corresponding influence function

g(y) = \mathrm{sign}(y) \tag{3.6}
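The outlier-dampening property can be checked numerically. A small sketch (with illustrative residual values) compares the influence functions (3.4) and (3.6) against the unbounded influence g(y) = y of the squared loss:

```python
import numpy as np

beta = 1.0
y = np.array([0.1, 1.0, 10.0, 100.0])  # residuals, including strong outliers

g_square = y                    # squared loss: influence grows without bound
g_logistic = np.tanh(beta * y)  # logistic loss (3.4): influence bounded by 1
g_abs = np.sign(y)              # absolute value loss (3.6): constant magnitude

# a 100x larger residual has 100x the influence under the squared loss,
# but essentially the same influence under the robust losses
r_square = g_square[-1] / g_square[1]      # 100.0
r_logistic = g_logistic[-1] / g_logistic[1]
r_abs = g_abs[-1] / g_abs[1]               # 1.0
```

A bounded (or constant-magnitude) influence function is exactly what prevents a few strong outliers from dominating the gradient of J_f(w).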