Self-organizing NNs fall into two classes:
1. NNs that consist of an input layer and an output layer, where the neurons of the input layer are connected to the neurons of the output layer with feedforward connections, and the neurons of the output layer are interconnected with lateral connections
2. NNs of multiple layers, in which self-organization proceeds from one layer to another
There are two self-organized learning methods (a minimal sketch of each is given after this list):
1. Hebbian learning, which yields NNs that extract the principal components [1, 2]
2. Competitive learning, which yields K-means clustering [3]
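The following Python sketch illustrates both rules on toy data. It is an illustration only, not code from the chapter; the learning rate eta, the cluster count K, the synthetic data, and the prototype initialization are all arbitrary choices. The first loop applies Oja's normalized Hebbian rule, whose weight vector converges to the first principal component of zero-mean data; the second applies a winner-takes-all competitive update, whose prototypes approximate K-means centroids.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy zero-mean 2-D data with a dominant direction along the first axis.
X = rng.normal(size=(1000, 2)) * np.array([3.0, 0.5])

# --- Hebbian learning (Oja's rule): w converges to the first principal component.
w = rng.normal(size=2)
eta = 0.01                      # learning rate (arbitrary choice)
for x in X:
    y = w @ x                   # neuron output
    w += eta * y * (x - y * w)  # Oja's normalized Hebbian update
print("Oja direction:", w / np.linalg.norm(w))   # approx. (+/-1, 0)

# --- Competitive learning: winner-takes-all updates approximate K-means.
K = 3                           # number of output neurons (arbitrary choice)
prototypes = X[rng.choice(len(X), K, replace=False)].copy()
for x in X:
    j = np.argmin(np.linalg.norm(prototypes - x, axis=1))  # winning neuron
    prototypes[j] += eta * (x - prototypes[j])  # move only the winner toward x
print("prototypes:\n", prototypes)
```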
Self-organized learning is essentially a repetitive updating of NN synaptic weights in response to input patterns, according to a set of prescribed rules, until a final configuration is obtained [2]. A number of observations have motivated the research toward self-organized learning. It is worth noting that in 1952 Turing stated that "global ordering can arise from local interactions," and von der Malsburg observed that self-organization is achieved through self-amplification, competition, and cooperation of the synaptic weights of the NN (see [2] and references therein). In this chapter, we focus on competitive learning and on self-organizing maps (SOMs) in particular. The latter can be viewed as a computational procedure for finding a discrete approximation of principal curves [4]. Principal curves can be conceived of as a nonlinear principal component analysis method [1].
Self-organizing maps are based on competitive learning: the output neurons of the network compete among themselves to be activated, so that only one output neuron, or one neuron per group, is active at each time instant. The neuron that wins the competition is called a winner-takes-all neuron. In an SOM, the neurons are placed at the nodes of a lattice. Although higher-dimensional lattices could also be employed, a one- or two-dimensional map is frequently used, because such maps facilitate data visualization. During the training that implements competitive learning, the neurons are tuned selectively to various input patterns or classes of input patterns. In addition, the coordinates of the neurons become ordered, so that a meaningful coordinate system for the different intrinsic statistical properties of the input data, the so-called features, is created over the lattice. The seminal work of Kohonen dominates this research field [5-7].
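As a concrete picture of this training process, here is a minimal SOM training loop, again an illustrative sketch rather than the chapter's algorithm: it assumes a 10x10 lattice, a linearly decaying learning rate, and a Gaussian neighborhood whose radius shrinks over time (the schedules and sizes are arbitrary choices). At each step the best-matching (winner-takes-all) neuron is found, and it and its lattice neighbors are pulled toward the input.

```python
import numpy as np

rng = np.random.default_rng(1)

# A 10x10 lattice of neurons, each holding a 3-D weight vector.
rows, cols, dim = 10, 10, 3
W = rng.random((rows, cols, dim))
# Lattice coordinates of every neuron, shape (rows, cols, 2).
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)

X = rng.random((2000, dim))     # training patterns
n_steps = len(X)
for t, x in enumerate(X):
    # Winner-takes-all: the neuron whose weight vector is closest to the input.
    dist = np.linalg.norm(W - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dist), dist.shape)
    # Learning rate and neighborhood radius decay linearly over time.
    frac = 1.0 - t / n_steps
    eta = 0.5 * frac
    sigma = 0.5 * max(rows, cols) * frac + 1e-3
    # Gaussian neighborhood on the lattice, centered at the winner.
    lat_d2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
    h = np.exp(-lat_d2 / (2.0 * sigma ** 2))[..., None]
    # Pull the winner and its lattice neighbors toward the input pattern.
    W += eta * h * (x - W)
```

After training, lattice-adjacent neurons end up with similar weight vectors; this topological ordering is among the SOM properties outlined in Section 11.5.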
The outline of the chapter is as follows. The basic feature-mapping techniques are reviewed in Section 11.2. Kohonen's SOM is described in Section 11.3. The convergence analysis of Kohonen's SOM is treated in Section 11.4. SOM properties are outlined in Section 11.5. Variants of SOMs based on robust statistics are described in Section 11.6, which also discusses their applications to color image quantization and to document organization and retrieval. A class of split-merge SOMs that incorporates outlier rejection and cluster validity tests is analyzed in Section 11.7, along with applications of split-merge SOMs.