Digital Signal Processing Reference
Fig. 12.11 A "winner takes all" network: inputs p_1, p_2, p_3, …, p_R feed through the weight matrix W to outputs a_1, a_2, …, a_m, one of which is the winner
only; thus, especially for non-linear structures, their design and further application
should be undertaken with care [25].
Neural networks for pattern classification are usually single-output structures.
They may, however, also have multiple outputs, each of them assigned to a
specific purpose. Such networks can be used in multidimensional control schemes,
where the outputs of particular neurons of the output layer are sent to appropriate
points of the control structure.
A special case of such a multi-output ANN, a "winner takes all" network, is
depicted in Fig. 12.11. Here, the input x is given to all the network units (neurons)
at the same time. The competitive transfer function accepts a net input vector for a
layer and returns neuron outputs of 0 for all neurons except the winner, the
neuron associated with the most positive element of the net input. The winner is the
most activated neuron, and only its weights are updated during training. The
Kohonen rule allows the weights of a neuron to learn an input vector, and because
of this it is useful in recognition applications [22].
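The competitive layer and the Kohonen update can be sketched as follows. This is a minimal illustration, not the book's implementation: the two-cluster data set, the learning rate, and the initialization are all assumed for the example. Since the most positive net input corresponds to the weight vector closest to the input, the winner is found here by minimum Euclidean distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data (assumed for illustration): two 2-D clusters.
data = np.vstack([
    rng.normal([0.0, 0.0], 0.1, size=(50, 2)),
    rng.normal([1.0, 1.0], 0.1, size=(50, 2)),
])

m = 2                                     # number of competitive neurons
W = rng.normal(0.5, 0.05, size=(m, 2))    # weight matrix, one row per neuron
lr = 0.1                                  # learning rate (assumed value)

for epoch in range(20):
    for p in rng.permutation(data):
        # The neuron whose weight vector is closest to the input has the
        # most positive net input -- it is the winner.
        winner = np.argmin(np.linalg.norm(W - p, axis=1))
        # Kohonen rule: only the winner's weights move toward the input.
        W[winner] += lr * (p - W[winner])

# Each weight vector should end up near one cluster centre.
print(np.round(W, 2))
```

Because only the winning neuron learns, each weight vector gradually becomes a prototype of one category of inputs, which is exactly the categorization behaviour described above.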
Such a competitive network learns to categorize the input vectors presented to
it. If a neural network only needs to learn to categorize its input vectors, then a
competitive network will do. Competitive networks also learn the distribution of
inputs by dedicating more neurons to classifying parts of the input space with
higher densities of input (see Fig. 12.12). Such networks are also known as self-
organizing maps (SOMs), which can learn to detect regularities and correlations in
their input and adapt their future responses to that input accordingly. A SOM
allows neurons that are neighbors of the winning neuron to produce outputs as well.
Thus the transition of output vectors is much smoother than that obtained with
competitive layers, where only one neuron has an output at a time. SOM networks
are useful in many applications, including clustering, visualization and quantization.
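The neighborhood mechanism that distinguishes a SOM from a plain competitive layer can be sketched as below. The one-dimensional map, the uniform input distribution, and the Gaussian neighborhood with decaying width are assumptions made for this example, not details from the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical input distribution (assumed): scalars uniform on [0, 1].
data = rng.uniform(0.0, 1.0, size=500)

n = 10                                # neurons arranged on a 1-D map
W = rng.uniform(0.4, 0.6, size=n)     # weights start bunched together
positions = np.arange(n)              # neuron coordinates on the map

lr, sigma = 0.3, 2.0                  # learning rate and neighborhood width
for epoch in range(30):
    for x in rng.permutation(data):
        winner = np.argmin(np.abs(W - x))
        # Unlike pure competition, neighbors of the winner also learn,
        # with strength decaying with distance on the map.
        h = np.exp(-((positions - winner) ** 2) / (2 * sigma ** 2))
        W += lr * h * (x - W)
    lr *= 0.9                         # anneal both parameters
    sigma *= 0.9

# The map unfolds: the weights spread out over the input range.
print(np.round(W, 2))
```

Because neighboring neurons are dragged along with each winner, nearby map positions come to respond to nearby inputs, which is what makes the SOM's output transitions smoother than those of a competitive layer.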
Data classification and clustering can also be performed with so-called
radial basis function (RBF) networks. RBF nets are able to approximate an
unknown function with a linear combination of non-linear functions (basis
functions) that have radial symmetry with respect to a centre. The RBF networks of at