complex-valued neuron, the decision boundary of an N-dimensional vector neuron
consists of N hyperplanes, which intersect each other orthogonally and divide
a decision region into N equal sections. Minsky and Papert (1969) considered the
parity problem the most difficult, because the required output is 1 if the input pattern
contains an odd number of 1s and 0 otherwise. A solution to the N-bit parity problem
obtained using a single N-dimensional vector neuron demonstrates its high gener-
alization ability. It is worth emphasizing here that a substantial reduction
in the number of learning parameters and the number of layers can be achieved
with the N-dimensional vector-valued neuron when solving problems with high-
dimensional parameters.
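The smallest instance of the parity problem, XOR (2-bit parity), already illustrates the point: it is not linearly separable in the real domain, yet a single complex-valued neuron with a multi-sector activation solves it. The following is a minimal sketch; the weights (1 and i) and the four-sector alternating activation are one hand-chosen illustrative construction, not necessarily the exact one used in the text.

```python
import cmath

def xor_neuron(x1, x2):
    """A single complex-valued neuron solving XOR (2-bit parity).

    Inputs are bipolar (+1 encodes bit 0, -1 encodes bit 1).
    Illustrative weights w1 = 1 and w2 = i place the weighted sum
    in one of four 90-degree sectors of the complex plane; the
    activation alternates the output with the sector index.
    """
    z = (1 + 0j) * x1 + 1j * x2              # weighted sum in the complex plane
    angle = cmath.phase(z) % (2 * cmath.pi)  # argument in [0, 2*pi)
    sector = int(angle // (cmath.pi / 2))    # which of the 4 sectors z falls in
    return sector % 2                        # alternating (parity) activation

# Truth table check over all four input patterns
for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    x = tuple(1 if b == 0 else -1 for b in bits)
    print(bits, "->", xor_neuron(*x))  # prints 0, 1, 1, 0
```

A real-valued neuron with one hyperplane cannot realize this mapping; here the two orthogonal sector boundaries of the single complex neuron do the work that would otherwise require a hidden layer.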
2.7 Concluding Remarks
The theories in neurocomputing have been developed to build mathematical mod-
els that mimic the computing power of the human brain. Their powerful processing
capability has been demonstrated in various real-domain applications. The parameters
of traditional neural networks are real numbers, and such networks are usually used
for single-dimensional data. Still, many applications deal with high-dimensional sig-
nals. The easiest solution would be to use a conventional real-domain neural
network, in which the high-dimensional signals are replaced by independent real-valued
signals. Such a real-valued neural network may be highly complex and unrealistic;
moreover, it is unable to perform mappings in higher dimensions, because the
corresponding learning algorithms cannot preserve the angle of each point in magnitude
and sense. An alternative is to introduce a neural network with high-dimensional para-
meters, whose components are real numbers and which carries phase information
embedded in it. This approach yields a more efficient solution in terms of both
computational complexity and performance. It also spares the user a huge network
topology and large storage requirements, while enhancing the learning speed.
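The angle-preservation argument above can be made concrete with a small numerical sketch: multiplying points by a single complex weight (a rotation plus scaling) preserves the angle between any two directions, whereas scaling the two components independently, as a real-valued network treating them as separate signals would, distorts it. The weight `w` and the gains `a`, `b` below are arbitrary illustrative values.

```python
import cmath
import math

# Three sample points of a 2-D signal, encoded as complex numbers
z0, z1, z2 = 0 + 0j, 1 + 0j, 1 + 1j

def angle_at_z0(a, b, c):
    """Angle between the vectors (b - a) and (c - a)."""
    return abs(cmath.phase((c - a) / (b - a)))

original = angle_at_z0(z0, z1, z2)      # pi/4: a 45-degree angle at z0

# A complex weight rotates and scales every point: the angle survives.
w = 0.7 * cmath.exp(1j * 0.9)           # arbitrary illustrative weight
complex_out = angle_at_z0(w * z0, w * z1, w * z2)

# Treating the two components as independent real signals with
# different gains (as a real-valued network would) distorts the angle.
def real_scale(z, a=2.0, b=0.5):        # illustrative independent gains
    return complex(a * z.real, b * z.imag)

real_out = angle_at_z0(real_scale(z0), real_scale(z1), real_scale(z2))

print(round(original, 6), round(complex_out, 6), round(real_out, 6))
```

The complex-weighted mapping reproduces the original 45-degree angle exactly, while the component-wise real scaling does not, which is precisely the sense in which phase information is lost when high-dimensional signals are split into independent real-valued ones.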
Another competitive advantage of neurocomputing with high-dimensional para-
meters is the ease with which it may be applied to poorly understood problems in
higher dimensions. The neuron is its basic working unit; it has no predefined
meaning, and it evolves during learning in a manner that characterizes the target
function. A high-dimensional neural network has a natural tendency to acquire high-
dimensional information during training, capturing magnitude and phase in a single
entity. Such networks are especially useful in areas where phase information in signals
must be captured and retained throughout the problem. This work is an
attempt to investigate the functional capabilities of neurons with high-dimensional
parameters. The strength and effectiveness of high-dimensional neural networks
are extensively justified in successive chapters through simulations on dif-
ferent types of problems, viz. classification, function approximation, and conformal
mapping.
 