The interpretation of vectors in different dimensions has been given through tuples of
scalar components. Vectors in a three-dimensional space (or R³) can be represented
as coordinate vectors in a Cartesian coordinate system and can be identified with
an ordered list of three real numbers (the tuple [x, y, z]). These numbers are typically
called the scalar projections (or scalar components) of the vector on the axes of
the coordinate system. More broadly, a vector in n-dimensional space (Rⁿ), or
spatial vector, is a geometric quantity having magnitude (or length) and direction,
expressed numerically as an n-tuple that splits the quantity into its orthogonal-axis
components. Vector-valued neurons were introduced as a natural extension of the
conventional real-valued neuron, an extension that influences the behavioral
characteristics of the neuron in high-dimensional space. We live in a three-dimensional
world, and certainly all of our movements are in 3D. There are many natural aspects of
learning 3D motion in space, particularly through neurocomputing. The appeal of applying
3D vector-valued neural networks to 3D geometry is that it makes the neurocomputing
study simple and elegant.
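As a minimal illustration of these ideas, the following Python sketch (NumPy is assumed available; the values are chosen only for illustration) splits a vector in R³ into its scalar components and recovers its magnitude and direction:

```python
import numpy as np

# A vector in R^3 identified with an ordered triple [x, y, z]
v = np.array([3.0, 4.0, 12.0])

# Scalar components: projections of the vector onto the coordinate axes
x, y, z = v

# Magnitude (length) of the vector
magnitude = np.linalg.norm(v)      # sqrt(3^2 + 4^2 + 12^2) = 13.0

# Direction: the unit vector along v
direction = v / magnitude

print(magnitude, direction)
```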
2.3 Neurocomputing with Two-Dimensional Parameters
Neurocomputing has found applications in almost every industry and in every branch
of science and technology. Our understanding of the ANN has improved over the years
in the light that research in this direction has thrown, with the lasting contributions
of McCulloch and Pitts (1943), Donald Hebb (1949), Minsky (1954), Rosenblatt
(1958), Minsky and Papert (1969), Werbos (1974), Fukushima and Miyake (1980),
John Hopfield (1982), Nitta (1997), Adeli (2002), Aizenberg (2007), and Tripathi (2010),
to name a few. Among the most recent developments in the area are the complex
variable based neural networks (CVNN), which represent a second generation of
architectures and have scored over the standard real variable based networks (ANN) in
certain aspects. All the parameters, including synaptic weights, biases, inputs,
outputs, and the signals flowing through the network, are complex numbers; the
aggregation and activation functions also operate in the complex domain. Since it
operates in the complex-variable setting, the conventional Back-Propagation Algorithm
(BP) that trains the ANN is not suitable for training the CVNN. Operations on
functions in the complex domain are not as straightforward as in the real domain;
therefore, variations extending BP to complex variables were reported by Leung and
Haykin (2010) [1], Piazza (1992) [2], Nitta (2000) [3], Aizenberg (2007) [4], and
Adeli (2002) [5], called the Back-Propagation Algorithm in Complex Domain (CBP).
It is imperative that we study ANNs with two-dimensional parameters with a view to
investigating how the new tools of approximation perform in comparison with the
existing ones.
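To make the structure of such a network concrete, here is a sketch of a single complex-valued neuron in Python. The split-type activation (tanh applied separately to the real and imaginary parts) is one common choice in the CVNN literature, not the only one; the input and weight values are hypothetical:

```python
import numpy as np

def complex_neuron(inputs, weights, bias):
    """Forward pass of a single complex-valued neuron.

    All parameters (inputs, weights, bias) are complex numbers;
    aggregation is the usual weighted sum, carried out in the
    complex domain.
    """
    net = np.dot(weights, inputs) + bias   # complex aggregation
    # Split-type activation: tanh applied to the real and imaginary
    # parts separately (one common choice among CVNN activations).
    return np.tanh(net.real) + 1j * np.tanh(net.imag)

# Hypothetical example with two complex inputs
x = np.array([1.0 + 2.0j, 0.5 - 1.0j])
w = np.array([0.3 - 0.4j, -0.2 + 0.1j])
b = 0.1 + 0.1j
print(complex_neuron(x, w, b))
```

The split activation is used here because a fully complex activation must be chosen with care: by Liouville's theorem, a function that is both bounded and analytic on the whole complex plane is constant, which is one reason extending BP to the complex domain is not straightforward.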
In order to preserve the relationship between phase and magnitude in signals,
one certainly requires a mathematical representation, and this representation is only
possible in the domain of complex numbers. Hence, the model representation of
systems involving these signals should deal with complex values rather than real ones.
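To see why a complex representation preserves the magnitude-phase relationship, consider the following sketch (plain Python with NumPy; the sample value is chosen only for illustration):

```python
import numpy as np

# A complex signal sample carries magnitude and phase jointly
z = 3.0 + 4.0j
magnitude = abs(z)       # 5.0
phase = np.angle(z)      # atan2(4, 3) ~= 0.927 rad

# Multiplying by a unit complex number rotates the phase while
# preserving the magnitude -- a relationship that two independent
# real parameters would not maintain automatically.
rotated = z * np.exp(1j * np.pi / 4)
print(abs(rotated), np.angle(rotated))   # magnitude is still 5.0
```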
 