Fig. 6.5. Examples (grey circles) in two dimensions, represented in the extended three-dimensional space. The separating hyperplane and the corresponding separating straight line in the original input space are shown.
Remark. It may be useful to come back to the original N-dimensional input space. The points $x_H \in \mathbb{R}^N$ that satisfy

$$\sum_{j=1}^{N} w_j x_j = -w_0$$

lie on the hyperplane that is normal to the vector $w = [w_1, w_2, \ldots, w_N]^T$, whose distance to the origin is the absolute value of

$$d_0 = \frac{w_0}{\sqrt{\sum_{j=1}^{N} w_j^2}}.$$
The relation between the extended weight vector and $w$ is illustrated in Fig. 6.5. The separating hyperplane in the N-dimensional space is the intersection of the hyperplane in the extended space with the subspace $x_0 = 1$. Clearly, the distances of the examples to the hyperplanes differ, depending on which space is considered.
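To make this correspondence concrete, here is a minimal numerical sketch (the weight values are arbitrary, chosen purely for illustration): a point on the input-space hyperplane also lies on the extended-space hyperplane once the component $x_0 = 1$ is appended, and the distance $d_0$ follows directly.

```python
import numpy as np

# Hypothetical numbers for illustration: N = 2 inputs, extended weights [w0, w1, w2].
w_ext = np.array([-1.0, 2.0, 1.0])     # w0 = -1, w = [2, 1]
w0, w = w_ext[0], w_ext[1:]

# A point satisfying sum_j w_j x_j = -w0 lies on the input-space hyperplane ...
x = np.array([0.5, 0.0])               # 2*0.5 + 1*0.0 = 1 = -w0
assert np.isclose(w @ x, -w0)

# ... and, with x0 = 1 appended, it lies on the extended-space hyperplane w_ext . x_ext = 0.
x_ext = np.concatenate(([1.0], x))
assert np.isclose(w_ext @ x_ext, 0.0)

# Distance of the input-space hyperplane to the origin: d0 = |w0| / ||w||.
d0 = abs(w0) / np.linalg.norm(w)
print(d0)                              # 1/sqrt(5), approximately 0.447
```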
To summarize, each weight vector $w$ determines a hyperplane that separates the input space into two regions. Inputs $x$ with a positive projection onto $w$ produce the output +1, whereas the others produce the output $-1$. A perceptron thus performs linear separations (or discriminations), since the equation of the separating (discriminant) surface is a linear function of the inputs. Implementing more complex separations requires either neural networks with hidden units or perceptrons with higher-order potentials, such as spherical perceptrons or Support Vector Machines; these are described in Sect. 6.5, "Beyond Linear Separation".
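As an illustration of this decision rule, a minimal sketch in extended space follows (the weights are hypothetical, and the assignment of points lying exactly on the hyperplane to $-1$ is an arbitrary choice made here):

```python
import numpy as np

def perceptron_output(w_ext: np.ndarray, x_ext: np.ndarray) -> int:
    """Linear separation: +1 if the extended input has a positive projection
    onto the extended weight vector, -1 otherwise."""
    return 1 if w_ext @ x_ext > 0 else -1

w_ext = np.array([-1.0, 2.0, 1.0])                           # hypothetical weights
print(perceptron_output(w_ext, np.array([1.0, 1.0, 1.0])))   # +1: positive projection
print(perceptron_output(w_ext, np.array([1.0, 0.0, 0.0])))   # -1: negative projection
```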
6.3.2 Aligned Field
Consider an input $x_k$ belonging to $L_M$, of class $y_k$. The aligned field $z_k$ with respect to the perceptron of weights $w$ is defined by

$$z_k = y_k \, w \cdot x_k.$$
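In code, the aligned field is simply the product of the class label and the potential; a positive value indicates that the example lies on the correct side of the separating hyperplane. A minimal sketch with hypothetical weights and input:

```python
import numpy as np

def aligned_field(w_ext: np.ndarray, x_ext: np.ndarray, y: int) -> float:
    # z_k = y_k * (w . x_k): positive iff the example is correctly classified.
    return y * float(w_ext @ x_ext)

w_ext = np.array([-1.0, 2.0, 1.0])       # hypothetical weights
x_ext = np.array([1.0, 1.0, 1.0])        # extended input, x0 = 1
print(aligned_field(w_ext, x_ext, +1))   #  2.0 > 0: correctly classified
print(aligned_field(w_ext, x_ext, -1))   # -2.0 < 0: misclassified
```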