In order to understand the building of a machine using the adatron algorithm, the discriminant function, relying on the N data samples $\mathbf{x}_i$ and the corresponding weight multipliers $\alpha_i$, should be written as

$$
f(\mathbf{x}) = \mathbf{x}^{T}\mathbf{w} + b = \sum_{i=1}^{N} \alpha_i\,\mathbf{x}_i^{T}\mathbf{x} + b\,, \qquad (10.23)
$$
and the machine output function as

$$
y(\mathbf{x}) = \operatorname{sgn}\left[ f(\mathbf{x}) \right]. \qquad (10.24)
$$
[Figure: the inputs $x_{i1}, x_{i2}, \ldots, x_{iN}$, each weighted by a multiplier $\alpha$, feed a summing node whose output $f(\mathbf{x})$ is passed through $\operatorname{sgn}(f(\mathbf{x}))$.]

Figure 10.4. Adatron-algorithm-based perceptron
Equations (10.23) and (10.24) define the structure of a data-dependent machine, shown in Figure 10.4, which corresponds to a perceptron with b = +1 as its bias input.
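As an illustration, the following sketch evaluates the discriminant and output functions of (10.23) and (10.24) directly from the stored training samples; the array names and the use of NumPy are our own assumptions, not part of the book's material.

import numpy as np

def discriminant(x, X, alpha, b):
    # Eq. (10.23): f(x) = sum_i alpha_i * x_i^T x + b,
    # with the N training samples x_i stored as the rows of X
    return alpha @ (X @ x) + b

def machine_output(x, X, alpha, b):
    # Eq. (10.24): y(x) = sgn(f(x))
    return np.sign(discriminant(x, X, alpha, b))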
The idea of the adatron was born during the search for a perceptron with optimal stability. Among the iterative computational proposals for the design of such a perceptron, the adatron algorithm has proven to be the best one, since it theoretically promises, provided that a solution to the problem exists, to deliver an optimal solution with an exponential speed of convergence. The adatron algorithm is a kernel-based on-line algorithm for learning a perceptron, under the premise that it operates in a feature space in which a maximal-margin hyperplane is assumed to exist.
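The on-line learning loop itself can be sketched as follows; this is a minimal illustration of one common form of the adatron update, in which a multiplier is raised whenever the current margin of its sample falls below one and is clipped at zero. The kernel function, the learning rate eta, the number of epochs and the omission of an explicit bias term are illustrative assumptions, not the book's prescription.

import numpy as np

def kernel_adatron(X, y, kernel, eta=0.1, epochs=100):
    # X: (N, d) training samples, y: (N,) labels in {-1, +1}
    N = len(y)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])  # Gram matrix
    alpha = np.zeros(N)
    for _ in range(epochs):
        for i in range(N):
            # margin of sample i under the current multipliers
            gamma_i = y[i] * np.sum(alpha * y * K[:, i])
            # additive update, kept non-negative
            alpha[i] = max(0.0, alpha[i] + eta * (1.0 - gamma_i))
    return alpha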
10.2.2 Machine Implementation
After presenting the support vector machine concept and the aspects of its implementation, we would now like to summarise some essential issues and give a typical example of a support vector machine based on the RBF function as its kernel function (Figure 10.5). In doing this, we would first like to recall that the decision methodology of a support vector machine is based on the implementation of the following two successive steps:
• mapping the training points by a nonlinear function φ(·) into a sufficiently high-dimensional feature space in which the training points are linearly separable
• constructing in this feature space a separating hyperplane with maximal margin between the two classes
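For the RBF-kernel machine mentioned above, a minimal sketch of the kernel and of the resulting decision function might look as follows; the width parameter sigma and all function names are illustrative assumptions rather than the book's notation.

import numpy as np

def rbf_kernel(u, v, sigma=1.0):
    # Gaussian RBF kernel: K(u, v) = exp(-||u - v||^2 / (2 sigma^2))
    diff = np.asarray(u, dtype=float) - np.asarray(v, dtype=float)
    return np.exp(-diff @ diff / (2.0 * sigma ** 2))

def rbf_decision(x, support_vectors, alpha, labels, b, sigma=1.0):
    # decision of the trained machine: sgn( sum_i alpha_i y_i K(x_i, x) + b )
    s = sum(a * yi * rbf_kernel(sv, x, sigma)
            for a, yi, sv in zip(alpha, labels, support_vectors))
    return np.sign(s + b)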