Digital Signal Processing Reference
Fig. 12.3
NAND and NOR operations realized with McC&P neurons
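The NAND and NOR realizations of Fig. 12.3 can be sketched in code as a single neuron with a hard limiter acting on bipolar inputs p ∈ {−1, 1}. The weights and biases below are illustrative choices that produce the correct truth tables, not values read from the figure:

```python
# Sketch: a McC&P-style neuron (weighted sum + hard limiter) realizing
# NAND and NOR on bipolar inputs p1, p2 in {-1, 1}.
# The weights w and biases b are illustrative assumptions, not from Fig. 12.3.

def hardlim(n):
    """Hard-limit activation: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def neuron(p, w, b):
    """Single neuron: weighted sum of inputs plus bias, passed to hard limiter."""
    return hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b)

def nand(p1, p2):
    # Output is 0 only when p1 = p2 = 1.
    return neuron((p1, p2), w=(-1, -1), b=1)

def nor(p1, p2):
    # Output is 1 only when p1 = p2 = -1.
    return neuron((p1, p2), w=(-1, -1), b=-1)

print(nand(1, 1), nand(1, -1))   # prints: 0 1
print(nor(-1, -1), nor(1, 1))    # prints: 1 0
```

The same neuron structure implements either gate; only the bias changes, which shifts the threshold at which the weighted sum crosses zero.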
Fig. 12.4
Single-neuron perceptron architecture
Neuron applications advanced significantly in the 1950s and 1960s, when many researchers worked on the so-called perceptron theory and related applications. The perceptron (Fig. 12.4) is basically a single-neuron structure with adjustable synaptic weights and a hard limiter. Perceptrons are especially suited to simple pattern-classification problems. The hard-limit transfer function gives a perceptron the ability to classify input vectors by dividing the input space into two regions separated by a hyperplane. In the example illustrated in Fig. 12.5 for a two-dimensional problem, the areas corresponding to two classes of events/elements are separated by a straight line. An additional "error area" is also marked in the figure; it denotes the fact that for noisy signals classification may be difficult when the points to be classified lie too close to the decision boundary.
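The classification rule of the Fig. 12.4 perceptron can be sketched as y = hardlim(w·p + b): the sign of w·p + b tells on which side of the hyperplane w·p + b = 0 an input vector falls. The weight vector, bias, and sample points below are illustrative assumptions, not values from the text:

```python
# Sketch of the single-neuron perceptron of Fig. 12.4: y = hardlim(w . p + b).
# The weights and bias are illustrative: they define the decision line
# p1 + p2 - 1 = 0 in the two-dimensional input plane (cf. Fig. 12.5).

def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def perceptron(p, w, b):
    """Classify input vector p by the side of the hyperplane w . p + b = 0."""
    return hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b)

w, b = (1.0, 1.0), -1.0                  # hyperplane: p1 + p2 = 1

print(perceptron((2.0, 2.0), w, b))      # above the line -> prints 1
print(perceptron((-1.0, -1.0), w, b))    # below the line -> prints 0
```

Points near the line p1 + p2 = 1 land in the "error area" described above: a small amount of noise on p can flip the sign of w·p + b and hence the assigned class.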
 