Apart from neural networks, machine learning is also widely used in pattern recognition. Machine learning, as defined by Nilsson [12], refers to changes in systems that perform tasks associated with artificial intelligence (AI). Machine learning is also considered a deterministic approach to pattern recognition and is commonly used in conjunction with other neural network schemes. Examples of machine learning approaches include genetic programming, support vector machines (SVMs), and Bayesian networks.
Implementing a machine learning approach for pattern recognition requires a priori knowledge of the types of functions or kernel machines to be used for a specific recognition domain. For instance, a linear SVM can classify binary problems but is not suitable for multi-class domains. In addition, to work on a specific set of data, a machine learning approach commonly requires extensive, iterative learning procedures to obtain the best parameter estimates. These two issues affect the feasibility of using machine learning to design a scalable and generic pattern recognition scheme.
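To make the binary-only nature of a linear SVM and the iterative search for its parameters concrete, the following sketch assumes scikit-learn and a synthetic two-class dataset; the library, the dataset, and the parameter grid are illustrative choices, not part of the discussion above.

# Illustrative sketch: a linear SVM on a synthetic binary problem, plus an
# iterative search for its penalty parameter. Dataset and API are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import LinearSVC

# Two-class synthetic data standing in for a specific recognition domain.
X, y = make_classification(n_samples=200, n_features=10, n_classes=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A linear SVM separates exactly two classes; multi-class domains need a
# decomposition strategy (e.g., one-vs-rest) layered on top of binary classifiers.
clf = LinearSVC(C=1.0, max_iter=5000)
clf.fit(X_train, y_train)
print("binary accuracy:", clf.score(X_test, y_test))

# Parameter estimation is itself an iterative search, e.g., over the penalty C.
search = GridSearchCV(LinearSVC(max_iter=5000), {"C": [0.01, 0.1, 1.0, 10.0]}, cv=3)
search.fit(X_train, y_train)
print("best C found by the grid search:", search.best_params_["C"])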
Machine learning approaches based on ANNs offer low levels of both scalability and adaptability, mainly because of the following characteristics of neural network approaches:
1. Some neural network and machine learning approaches can be executed in a parallel environment, e.g., Hopfield networks and feed-forward neural networks. This parallelization capability enables recognition to be conducted on large-scale data. However, the complexity of these algorithms hinders their ability to perform pattern recognition in a purely parallel manner.
2. Enhancements in unsupervised machine learning schemes, such as the K-means clustering algorithm, allow heterogeneous patterns and data to be used in the recognition process (a minimal sketch of the iterative clustering procedure follows this list). Nevertheless, these algorithms require strenuous training and complex recognition procedures.
3. Limited storage capacity. For example, to obtain optimum recognition, the estimated number of random patterns that can be stored by a Hopfield network is approximately 0.138N, where N is the number of units in the network [13] (this estimate is worked through after this list).
4. Some neural networks can learn from the data used in the recognition process. However, memorization through the learning process requires a large number of samples from similar classes.
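As a minimal sketch of the iterative clustering referred to in item 2, the NumPy-only K-means loop below alternates between assigning points to the nearest centroid and recomputing the centroids; the synthetic data and the choice of k are assumptions made purely for illustration.

# Minimal K-means sketch (NumPy only); data, k, and iteration limit are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2)) + rng.choice([-3.0, 0.0, 3.0], size=(300, 1))
k = 3
centroids = X[rng.choice(len(X), size=k, replace=False)]

for _ in range(20):  # iterative refinement: assign points, then recompute centroids
    labels = np.argmin(((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1), axis=1)
    new_centroids = np.array([
        X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
        for j in range(k)
    ])
    if np.allclose(new_centroids, centroids):
        break
    centroids = new_centroids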
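The storage limit quoted in item 3 can also be evaluated directly; the short sketch below simply applies the 0.138N estimate from [13] to a few network sizes, which are themselves arbitrary examples.

# Rough capacity estimate for a Hopfield network: roughly 0.138 * N random
# patterns, where N is the number of units [13]. The sizes below are arbitrary examples.
def hopfield_capacity(n_units: int) -> int:
    return int(0.138 * n_units)

for n in (100, 1000, 10000):
    print(f"N = {n:>6} units -> about {hopfield_capacity(n)} storable random patterns")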
In addition, ANNs and other machine learning approaches suffer from a number of issues, including the following:
1. For many neural network schemes, such as the feed-forward network, the iterative weight-adjustment procedure during training imposes significant processing delays (a brief sketch of such an update loop follows).
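A compact illustration of the iterative weight adjustment mentioned above, under the assumption of a single-layer model trained by gradient descent on a toy logistic loss (the data, learning rate, and epoch count are illustrative, not drawn from the text): every pass over the training set adjusts the weights, and many such passes are usually needed, which is where the processing delay comes from.

# Illustrative only: iterative weight updates for a tiny single-layer model.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(64, 4))            # assumed toy inputs
y = (X.sum(axis=1) > 0).astype(float)   # assumed toy targets
w = np.zeros(4)
lr = 0.1

for epoch in range(500):                          # many passes -> training delay
    p = 1.0 / (1.0 + np.exp(-(X @ w)))            # sigmoid output of the layer
    w -= lr * (X.T @ (p - y)) / len(y)            # gradient step on the logistic loss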