Complex numbers have been utilized to represent two-dimensional data elements as single entities in the CVNN, and the application of complex numbers to neural networks has been extensively investigated [10, 11, 25, 26]. The CVNN closely resembles the ANN: the neuron, the way it operates, and the architecture are all similar. The difference lies in the fact that the weights, inputs, outputs, and biases are complex numbers in the CVNN, whereas in the ANN they are real numbers. Likewise, the activation functions in the CVNN are complex-valued, in contrast to the standard ANN, where they are real-valued. Consequently, the theory of complex variables and complex-valued functions must be applied when studying the behavior of the CVNN. The activation function, too, is an extension of its ANN counterpart, but it had to be adapted to the complex-variable setting. In the process, a new constraint surfaced in the form of Liouville's theorem: a function that is both bounded and analytic on the entire complex plane must be constant, so an activation function cannot be bounded and analytic at once, and a different search for suitable functions had to be carried out. Importantly, CVNNs have considerable power in analyzing functions on the plane. They have also produced improved results even on real-valued problems, because they are more efficient, fault-tolerant, less sensitive to noise, and better at mimicking human-like characteristics (learning and generalization) in one- and two-dimensional situations.
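As a concrete illustration, the following is a minimal sketch of a single complex-valued neuron, not taken from the referenced works: the weight, input, and bias are complex, and the activation is a split-type function applied separately to the real and imaginary parts. Such a function is bounded but non-analytic, which is one common way around the Liouville constraint; the names split_tanh and complex_neuron and the numerical values are illustrative assumptions.

import cmath
import math

def split_tanh(z):
    # Split-type activation: tanh applied to the real and imaginary
    # parts separately. Bounded but non-analytic, so Liouville's
    # theorem (a bounded entire function is constant) does not apply.
    return complex(math.tanh(z.real), math.tanh(z.imag))

def complex_neuron(x, w, b):
    # Weight w, input x, and bias b are all complex numbers,
    # exactly as in the CVNN described above.
    return split_tanh(w * x + b)

y = complex_neuron(x=0.5 + 0.5j, w=cmath.rect(1.0, math.pi / 4), b=0.1 + 0.1j)
print(y)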
The development of the CVNN came about as an extension of the ANN, not as a model of the neuronal arrangement in the brain; it no longer draws on actual neurons and their organization. The complex-valued signals flowing through a complex-domain network are the units of learning, which enables the network to learn the two-dimensional motion of signals. In contrast, a neural network in the real domain handles only the one-dimensional motion of signals. Thus a neural network extended to the complex domain preserves phase during learning, while an equivalent real-domain network cannot. It is worth mentioning here that a high-dimensional neural network learns not only the magnitude along different dimensions but also the phase along different directions. Correspondingly, a conformal mapping on the plane preserves the angles between oriented curves, and the phase of each point on a curve is maintained during the transformation.
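The phase-preserving behavior can be seen directly in how a complex weight acts on a signal: multiplication by a unit-magnitude complex number is a pure rotation in the plane, so phases add while magnitudes scale by the weight's modulus. A small sketch (the variable names and values are illustrative):

import cmath
import math

x = cmath.rect(0.5, math.radians(30))  # signal: magnitude 0.5, phase 30 degrees
w = cmath.rect(1.0, math.radians(45))  # unit-magnitude weight: a pure 45-degree rotation
y = w * x

print(abs(y))                          # 0.5  -- magnitude scaled by |w| = 1
print(math.degrees(cmath.phase(y)))    # 75.0 -- phases add: 30 + 45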
The quaternion is a four-dimensional hypercomplex number system introduced by Hamilton [27]. This number system has been used extensively in several fields, such as modern mathematics, physics, the control of satellites, and computer graphics [7].
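For reference, a quaternion q is written as

\[ q = a + b\,i + c\,j + d\,k, \qquad a, b, c, d \in \mathbb{R}, \]

with Hamilton's defining relations \( i^2 = j^2 = k^2 = ijk = -1 \).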
Applying quaternions to the field of neural networks has recently been explored in an effort to naturally represent high-dimensional information by a quaternionic neuron, rather than by complex-valued or real-valued neurons, and there has been a growing number of studies on the use of quaternions in neural networks. In principle, the neuron in the quaternionic domain, being a four-dimensional hypercomplex quantity, can be decomposed into and represented by two complex numbers over two linearly independent bases, as sketched below.
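This decomposition is the Cayley-Dickson form q = z1 + z2 j, with z1 = a + bi and z2 = c + di. Quaternion multiplication can then be carried out entirely with complex arithmetic; the following is a minimal sketch (the function name quat_mul is an illustrative assumption):

def quat_mul(p, q):
    # p and q are quaternions in Cayley-Dickson form (z1, z2),
    # i.e. p = z1 + z2*j with z1 and z2 ordinary complex numbers.
    z1, z2 = p
    w1, w2 = q
    return (z1 * w1 - z2 * w2.conjugate(),
            z1 * w2 + z2 * w1.conjugate())

# Hamilton's relation i * j = k, with i = (1j, 0), j = (0, 1), k = (0, 1j):
print(quat_mul((1j, 0), (0, 1)))  # -> (0j, 1j), which encodes k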
Number is one of the most elementary notions, not only in conventional computing but also in high-dimensional computing. We are very much fascinated by the possibility of extending the notion of number to higher dimensions while retaining straightforward algebraic properties, and these multi-component numbers have been successfully utilized in high-dimensional neural networks. Unfortunately, the basic algebraic