FIGURE 2.3: Estimated number of signals/messages, C, generated by each neuron within a single layer of a common neural network scheme, for several numbers of iterations, w.
manner, i.e., with no collaboration between nodes. A similar effect of the iterative procedure is also observed in the Kohonen SOM network.
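To make the trend suggested by Figure 2.3 concrete, the following is a minimal sketch assuming a simple linear cost model: in an iterative scheme, each neuron exchanges some fixed number of messages per iteration, so the total message count C grows with the iteration count w, whereas a single-cycle scheme incurs a one-off cost. Both the model and the per-iteration constant k are illustrative assumptions, not values taken from the figure.

```python
# Hedged cost model for Figure 2.3's quantity C (messages per neuron).
# Assumption: an iterative scheme sends k messages per neuron per
# iteration, so C scales linearly with the iteration count w; a
# single-cycle (one-shot) scheme pays the per-pass cost only once.

def messages_iterative(w, k=2):
    """Assumed model: k messages per neuron per iteration, w iterations."""
    return k * w

def messages_one_shot(k=2):
    """Single-cycle learning: one pass, independent of w."""
    return k

# The gap widens as w grows, which is the scalability concern in the text.
for w in (1, 10, 100):
    print(f"w={w}: iterative={messages_iterative(w)}, one-shot={messages_one_shot()}")
```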
The inability of most existing neural network schemes to scale up stems
from their complex nature and iterative learning procedures. Furthermore,
the training-validation-test mechanism produces significant delays in execu-
tion and creates a strong dependency between the training and test data.
Therefore, there is a need for an algorithm with limited complexity and minimal dependency between training and test data.
Although the scalability of a number of the existing neural network schemes
is limited, some intelligent recognition schemes are able to scale up to large-
scale data, e.g., the one-shot learning Graph Neuron (GN) algorithm. The
GN is a graph-based associative memory algorithm [2, 31, 32] that is highly
scalable and implements single-cycle learning for pattern recognition. Further-
more, the GN adopts an in-network processing approach, i.e., computational
processes occur within the body of the network itself. The GN has been
proposed for several pattern recognition implementations [33, 34].
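The single-cycle property described above can be illustrated with a toy sketch: patterns are memorized in one pass, with no retraining of previously stored entries, and recall finds the stored pattern with the most matching positions. This is loosely in the spirit of a graph-based associative memory, but it is an illustration only, not the published GN algorithm; the class and method names are hypothetical.

```python
# Toy sketch of single-cycle (one-shot) associative pattern storage.
# Each pattern is stored in a single pass; no iterative weight updates
# occur, and earlier entries are never retrained. Illustrative only --
# this is not the Graph Neuron algorithm itself.

class OneShotAssociativeMemory:
    def __init__(self):
        self.patterns = []  # stored patterns, each a tuple of values

    def store(self, pattern):
        """Single-cycle learning: one pass over the input, no retraining."""
        key = tuple(pattern)
        if key not in self.patterns:
            self.patterns.append(key)
        return self.patterns.index(key)

    def recall(self, probe):
        """Return the stored pattern with the most position-wise matches."""
        probe = tuple(probe)
        best, best_score = None, -1
        for p in self.patterns:
            score = sum(a == b for a, b in zip(p, probe))
            if score > best_score:
                best, best_score = p, score
        return best, best_score

mem = OneShotAssociativeMemory()
mem.store("ABBA")
mem.store("BAAB")
pattern, score = mem.recall("ABBB")  # nearest stored pattern by overlap
```

Note how storage cost is a single pass per pattern, in contrast to the training-validation-test cycle criticized earlier; this is the sense in which one-shot schemes avoid iterative-learning delays.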