Chapter 2
Neurocomputing with High Dimensional
Parameters
Abstract Neurocomputing has established a reputation for robustness toward ill-defined and noisy problems in science and engineering, owing to the strong learning, generalization, and association abilities of artificial neural networks. In the recent past, different kinds of neural networks have been proposed and successfully applied to applications involving single-dimension parameters. Important variants include the radial basis function network, the multilayer perceptron, support vector machines, functional link networks, and higher-order neural networks. These variants with single-dimension parameters have been employed for machine learning problems in both single and high dimensions. A single conventional neuron accepts only a real value as its input, so a network must conventionally be configured with as many input neurons as there are dimensions (parameters) in the high dimensional data. Such a configuration is sometimes unnatural and may not achieve satisfactory performance on high dimensional problems. Extensive research in the recent past has revealed that neural networks with high dimensional parameters offer several advantages and better learning capability for high dimensional problems than conventional ones. Moreover, they have a surprising ability to learn and generalize phase information among the different components simultaneously with magnitude, which is not possible with a conventional neural network. There are two approaches to naturally extending the dimensionality of data elements as single entities in high dimensional neural networks. In the first, the number field is extended from real numbers (one dimension) to complex numbers (two dimensions), quaternions (four dimensions), and octonions (eight dimensions). The second is to extend the dimensionality of a data element using a high dimensional vector with scalar components, i.e., three-dimensional and N-dimensional real-valued vectors. Applications of these numbers and vectors to neural networks are extensively investigated in this chapter.
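The claim that a high dimensional parameter handles phase and magnitude simultaneously can be illustrated with the simplest case, a complex-valued weight. A minimal sketch (the function name and values are illustrative, not from the text): multiplying a complex input by a single complex weight scales its magnitude and rotates its phase in one operation, something a single real-valued weight cannot do.

```python
import cmath

def complex_neuron(weight: complex, x: complex) -> complex:
    # One complex parameter acts on the input as a combined
    # scaling (of magnitude) and rotation (of phase).
    return weight * x

# Weight with magnitude 2 and phase pi/4; input with magnitude 1 and phase pi/6.
w = cmath.rect(2.0, cmath.pi / 4)
x = cmath.rect(1.0, cmath.pi / 6)

y = complex_neuron(w, x)
print(abs(y))          # magnitudes multiply: 2.0
print(cmath.phase(y))  # phases add: pi/4 + pi/6 = 5*pi/12
```

A real-valued network would need at least two coupled real parameters, trained jointly, to reproduce this single rotation-and-scaling, which is one intuition behind the advantages claimed above.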
2.1 Neuro-Computing with Single Dimensional Parameters
In recent years, neurocomputing has emerged as a powerful technique for tasks such as function approximation, classification, clustering, and prediction across a wide spectrum of applications. Multilayer neural network and back-propagation
 