a set of distributed computational networks working collaboratively can scale
the pattern recognition scheme in response to an increasing number of features.
In addition, the performance of this multi-feature scheme can be improved
by a single-cycle learning distributed pattern recognition algorithm, such as
the DHGN. In contrast to other contemporary machine learning approaches,
our approach allows induction of new patterns in a fixed number of steps.
While doing so, it exhibits a high level of scalability, i.e., the performance
and accuracy do not degrade as the number of stored patterns increases.
The pattern recognition capability remains comparable with contemporary
approaches, such as the support vector machine (SVM), self-organizing map
(SOM), and artificial neural network (ANN). Furthermore, all computations
are completed within the pre-defined number of steps. The one-shot learning
in this method is achieved by sidestepping the commonly used error/energy
minimization and random walk approaches. The network functions as a ma-
trix that holds all possible solutions for the problem domain. The DHGN
approach finds and refines the initial solution by passing the results through
a pyramidal hierarchy of similar arrays. In doing so, it eliminates/resolves
pattern defects; distortions up to 20% are tolerated [64]. Previously encoun-
tered patterns are revealed and new patterns are memorized without the loss
of stored information. In fact, the pattern recognition accuracy continues to
improve as the network processes more sensory inputs [3]. To achieve this goal,
the DHGN distributed pattern recognition algorithm is extended for multi-
feature recognition and the analysis of complex data.
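The single-cycle storage and distortion-tolerant recall described above can be illustrated with a much-simplified sketch. This is not the DHGN algorithm itself: the subpattern split, the per-node dictionaries, and the majority-vote rule are assumptions made purely for illustration of one-shot, non-iterative memorization.

```python
# Simplified sketch (not the published DHGN design): one-pass pattern
# memorization with distortion-tolerant recall by majority vote.

class OneShotMemory:
    def __init__(self, n_nodes):
        # Each "node" maps the subpatterns it has seen to a label.
        self.nodes = [dict() for _ in range(n_nodes)]

    def _split(self, pattern):
        size = len(pattern) // len(self.nodes)
        return [pattern[i * size:(i + 1) * size]
                for i in range(len(self.nodes))]

    def store(self, pattern, label):
        # Single cycle: one pass over the nodes, no iterative
        # error/energy minimization and no random walk.
        for node, sub in zip(self.nodes, self._split(pattern)):
            node.setdefault(sub, label)

    def recall(self, pattern):
        # Majority vote across nodes; nodes with unseen subpatterns
        # abstain, so a bounded fraction of distortion is tolerated.
        votes = {}
        for node, sub in zip(self.nodes, self._split(pattern)):
            if sub in node:
                votes[node[sub]] = votes.get(node[sub], 0) + 1
        return max(votes, key=votes.get) if votes else None

mem = OneShotMemory(n_nodes=5)
mem.store("1111100000", "A")
mem.store("0000011111", "B")
print(mem.recall("1011100010"))  # pattern "A" with 2 of 10 bits flipped
```

Storing a new pattern takes a fixed number of steps regardless of how many patterns are already held, and previously stored associations are never overwritten, mirroring the fixed-step, no-information-loss properties claimed for the DHGN.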
7.1 Data Features for Pattern Recognition
Consider the data representation shown in Figure 7.1. Using the mean pixel
value as a feature for a set of images, we are considering the following one-
dimensional problem: determine to which class a particular image belongs.
However, as another feature is added, e.g., the standard deviation of the pixel
value, additional computation is required to determine the correlations be-
tween features that produce distinctive classes of images. As more features are
added, the computational costs of determining the correlations become progressively higher. According to Theodoridis and Koutroumbas [85], although two features may each carry good classification information when treated separately, a high mutual correlation between them implies that little is gained by combining them in the feature vector. The increased complexity does not benefit the recognition process.
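A hypothetical example of this correlation argument: two features computed over synthetic 8x8 "images". The mean pixel value and the total intensity are perfectly correlated, so adding the second feature to the feature vector contributes nothing to classification. The class means and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dark = rng.normal(loc=0.3, scale=0.1, size=(50, 8, 8))    # class 1
bright = rng.normal(loc=0.7, scale=0.1, size=(50, 8, 8))  # class 2
images = np.concatenate([dark, bright])

mean_px = images.mean(axis=(1, 2))   # feature 1: mean pixel value
total_px = images.sum(axis=(1, 2))   # feature 2: total intensity (64 * mean)

# Pearson correlation between the two features across the data set.
r = np.corrcoef(mean_px, total_px)[0, 1]
print(f"mutual correlation of the two features: {r:.3f}")  # prints 1.000
```

A genuinely complementary second feature (e.g., the pixel standard deviation when the classes differ in texture rather than brightness) would show low correlation with the mean and could justify the added computational cost.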
In classification algorithms of existing pattern recognition schemes, the
number of features directly translates to the number of classifier parame-
ters. Therefore, increasing the number of features leads to complexity. The