FIGURE 2.2: A Kohonen SOM node formation for a two-dimensional representation.
Communication between nodes in existing neural networks, such as feed-forward, Hopfield, and RBF networks, is highly iterative in nature, owing to the weight adjustment/feedback methods commonly used to reach an optimum result when recognizing a single pattern or pattern vector. Within multi-layer networks, the communication frequency of each node depends on the number of nodes per layer. For each pattern, the number of messages/signals, C, communicated by each node in a multi-layer network with n nodes per layer can be determined using the following equation:
C = nw    (2.2)
where w is the number of iterations required for the weight adjustment.
An increase in the size of the network or in the number of weight-adjustment iterations leads to a higher number of projected signals. This approach is therefore not an efficient, scalable scheme for pattern recognition. Figure 2.3 illustrates this phenomenon.
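As a rough illustration of Equation (2.2), the sketch below tabulates C = nw for a few layer sizes and iteration counts; the specific values of n and w are hypothetical, chosen only to show how the per-node message count grows with both factors:

```python
# Illustrative sketch of Equation (2.2): C = n * w, where n is the number
# of nodes per layer and w is the number of weight-adjustment iterations.
# The n and w values below are hypothetical examples, not from the text.

def messages_per_node(n: int, w: int) -> int:
    """Messages/signals C communicated by each node for one pattern."""
    return n * w

for n in (10, 100, 1000):      # nodes per layer
    for w in (5, 50):          # weight-adjustment iterations
        print(f"n={n:>4}, w={w:>2} -> C={messages_per_node(n, w):>6}")
```

Because C grows with the product of network width and iteration count, doubling either factor doubles the per-node traffic, which is the scalability concern raised above.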
Some associative memory (AM) schemes for pattern recognition, including morphological and Hamming associative memories, offer a one-shot learning procedure. This type of procedure removes the need for an iterative process to derive optimum recognition results. Furthermore, this type of neural network performs lattice-based operations, in which communication between nodes is kept to a minimum, and operations are performed in a singular
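A minimal sketch of the one-shot, lattice-based behavior described above, using a Ritter-style morphological (autoassociative, max-plus) memory; the stored pattern values are hypothetical and NumPy is assumed. The memory is built in a single pass with a lattice minimum, and recall is a single max-plus product, with no iterative weight adjustment:

```python
import numpy as np

# Hypothetical stored patterns: two 3-dimensional pattern vectors.
X = np.array([[1.0, 3.0, 5.0],
              [2.0, 0.0, 4.0]])

# One-shot learning: W_ij = min over patterns k of (x^k_i - x^k_j).
# A single lattice (min) operation builds the whole memory.
W = np.min(X[:, :, None] - X[:, None, :], axis=0)

def recall(W: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Max-plus recall: y_i = max_j (W_ij + x_j). One pass, no iteration."""
    return np.max(W + x[None, :], axis=1)
```

For uncorrupted stored patterns this autoassociative recall is exact (e.g. `recall(W, X[0])` returns `X[0]`), which is what makes the scheme one-shot rather than iterative.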