two- as well as single-dimension problems [25-27]. The two-dimensional number representation comprises two real numbers and carries phase information embedded in it. Therefore, complex numbers are significant where it is necessary to capture the phase information in the data. Complex numbers form a superset of the real numbers, an algebraic structure that describes real-world phenomena such as signal amplitude and phase. They are useful in analyzing various mathematical and geometrical relationships in the plane. For nearly a decade, the extension of real-valued neurons for operation on complex signals [26, 28-31] has received much attention in the field of neural networks. Complex-valued neural networks provide faster convergence with better results, a reduction in learning parameters (network topology), and the ability to learn two-dimensional motion in the plane [32, 33]. The main motivation in designing the proposed neuron models is to utilize the promising capabilities of nonlinear aggregation functions as well as of complex numbers. In this chapter, the author considers higher-order neural architectures whose appeal is strengthened by the introduction of a complex-domain implementation and corresponding learning rules.
4.3.1 Artificial Neuron Models
In the literature, we generally find neuron models that comprise a summation, radial basis, or product aggregation function as the basic unit of a multilayer neural network. All these models and their networks have shown merits as well as demerits. Using a summation function, the MLP constructs a global approximation to the input-output mapping, while an RBF network, using an exponentially decaying localized nonlinearity, constructs a local approximation to the input-output mapping. This section presents two novel compensatory-type aggregation functions [34] for the first two artificial neurons and a root-power mean-based aggregation function [25] for the third neuron. The first two neurons produce the net potential as a linear or a nonlinear composition, respectively, of basic summation and radial basis operations over a set of input signals. These neurons have a compensatory basis function whose compensatory parameters are tuned during learning to model the underlying parametric relationship between the summation and radial basis operations for the application concerned.
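To make the compensatory idea concrete, the following is a minimal sketch of the two compositions described above. The function names, the Gaussian form of the radial basis part, and the use of a single compensatory parameter lam are illustrative assumptions, not the author's exact formulation; with complex inputs and weights, NumPy evaluates the fractional powers on the principal branch.

```python
import numpy as np

def summation_part(x, w, b):
    """Basic summation aggregation: weighted sum of the inputs plus a bias."""
    return np.dot(w, x) + b

def radial_basis_part(x, c, sigma):
    """Basic radial basis aggregation: Gaussian of the distance to a centre c."""
    return np.exp(-np.sum(np.abs(x - c) ** 2) / (2 * sigma ** 2))

def linear_compensatory(x, w, b, c, sigma, lam):
    """First neuron (assumed form): linear blend of the two operations;
    the compensatory parameter lam is tuned during learning."""
    return lam * summation_part(x, w, b) + (1 - lam) * radial_basis_part(x, c, sigma)

def nonlinear_compensatory(x, w, b, c, sigma, lam):
    """Second neuron (assumed form): multiplicative (nonlinear) composition
    with the same learnable compensation parameter."""
    return summation_part(x, w, b) ** lam * radial_basis_part(x, c, sigma) ** (1 - lam)
```

In both sketches, lam is learned alongside the weights w, bias b, centre c, and width sigma, so the neuron itself discovers how much summation-like (global) versus radial-basis-like (local) behaviour the task requires.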
In the third neuron, the aggregation function is so general that it embraces all the averaging (aggregation) functions between the minimum and the maximum. The aggregation process in this neuron is based on the idea underlying the weighted root-power mean [35, 36] of all inputs in the space. The variation of the generalization parameter realizes its higher-order structure, as the sketch below illustrates.
These three neuron models are inspired by the class of higher-order neurons, but they have a simpler yet compact structure without any requirement of sparseness, which other higher-order neurons needed for practical learning. In the sequel, well-defined learning rules for a multilayer network of these neurons are developed. These neuron models can be used in their original form in a network, like conventional neurons, or in combination with conventional neurons. Any of these neurons receives a vector of inputs and has a complex-valued aggregator that transforms the weighted inputs into a complex-valued
net potential. The neuron also has an activation function in the complex domain that gives the neuron's output.
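As a sketch of this generic structure, assuming a split-type tanh activation (a common choice in the complex-valued neural network literature, though not necessarily the one adopted in this chapter), any of the aggregations sketched above can be plugged in as the complex-valued aggregator:

```python
import numpy as np

def split_tanh(z):
    """Assumed split-type activation: tanh applied separately to the
    real and imaginary parts of the complex net potential."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def neuron_output(x, w, b, aggregate):
    """Generic neuron: a complex-valued aggregator turns the weighted
    inputs into a complex net potential, and the activation maps that
    potential to the neuron's output."""
    v = aggregate(np.asarray(x, dtype=complex), np.asarray(w, dtype=complex), b)
    return split_tanh(v)

# Plain weighted summation used as the aggregator for illustration.
summation = lambda x, w, b: np.dot(w, x) + b
y = neuron_output([0.3 + 0.4j, -0.2 + 0.1j], [1.0 - 0.5j, 0.7 + 0.2j], 0.1 + 0.0j, summation)
print(y)  # a single complex-valued output
```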
 