input-output signals are replaced by a pair of independent signals in the real domain. This raises an issue: how should one handle the phase information that is inherently embedded in any complex number or signal? The transformation C → R used in the derivation of the learning algorithms affects the phase-approximation capabilities of these algorithms. For better phase approximation, one needs an algorithm that simultaneously minimizes both the magnitude and phase errors [ 10 ]. Moreover, such learning algorithms require a huge computational effort during training. It is therefore reasonable to develop CVNNs with fast learning algorithms to overcome the above-mentioned issues. Learning in the complex domain may require the input, output, weights, and activation functions all to be in the complex domain. This second approach is more involved in the sense that one must define a complex activation function and a complex error function (EF), and on the basis of these definitions introduce a new learning algorithm in the complex domain. The benefit of this approach is that it yields a more efficient structure than a real-domain neuron, in terms of both computational complexity and performance. Moreover, this approach is realistic because it takes care of the phase information along with the independent components during the learning process.
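The second approach described above can be sketched as follows. This is a minimal illustration, not the exact formulation from the cited works: the neuron and error-function names (`complex_neuron`, `complex_error`) and all numeric values are hypothetical, and `np.tanh` stands in for a generic fully complex activation.

```python
import numpy as np

def complex_neuron(x, w, b):
    """Forward pass of a single fully complex-valued neuron: inputs,
    weights, and bias are all complex, and the activation is applied
    directly to the complex net input (np.tanh accepts complex z)."""
    net = np.dot(w, x) + b
    return np.tanh(net)

def complex_error(d, y):
    """Complex error function E = 1/2 * sum |d - y|^2.

    Since |d - y|^2 = (Re d - Re y)^2 + (Im d - Im y)^2, minimizing E
    penalizes magnitude and phase mismatch simultaneously, which is the
    property the text asks of a complex-domain learning algorithm."""
    e = d - y
    return 0.5 * np.sum(e * np.conj(e)).real

# Illustrative values only:
x = np.array([0.3 + 0.4j, -0.2 + 0.1j])   # complex inputs
w = np.array([0.5 - 0.2j, 0.1 + 0.3j])    # complex weights
b = 0.05 + 0.05j                          # complex bias
y = complex_neuron(x, w, b)
print(complex_error(1 + 0j, y))
```

A gradient-descent rule in this setting updates the complex weights using the conjugate gradient of E, so both real and imaginary parts of each weight (and hence the phase) are adapted together.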
The desired properties of a CVNN were stressed by Kim and Adali [ 11 ], who developed the fully complex-valued multi-layer perceptron (CMLP) network and derived its gradient-descent-based learning algorithm. Subsequently, a fully complex-valued radial basis function (CRBF) network and its gradient-descent-based learning algorithm were developed in [ 6 , 12 ]. Further studies have shown that the orthogonal decision boundaries of a CVNN with a split-type activation function give it superior decision-making ability over its real-valued counterparts [ 2 ]. The 2-D structure of its error propagation reduces the problem of saturation in learning and offers faster convergence. This has also generated increased interest among researchers in developing complex-valued classifiers to solve real-valued classification problems.
The multi-valued neural network (MVNN) [ 3 ], the single-layer network with phase-encoded inputs [ 4 , 13 ], the complex-valued extreme learning machine (CELM) [ 14 ], and the phase-encoded complex-valued extreme learning machine (PE-CELM) [ 5 ] are some of the CVNNs available in the literature. A CVNN is a generalization of the ANN to the complex domain, where all parameters and functions are complex-valued.
Recent investigations (Tohru Nitta 1997; B. K. Tripathi 2011) report that a CVNN can be smaller than an ANN for the same problem, since each complex variable carries two real variables and each complex weight therefore holds twice the information a real weight would, yet it yields better and more accurate results than an equivalent real-valued neural network (RVNN). The application field of the complex-domain neuron is very wide; it is hard to imagine areas dealing with 2-D parameters outside the realm of complex numbers.
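The split-type activation and its orthogonal decision boundaries mentioned above can be illustrated as follows. This is a sketch under common conventions, not the exact definition from [ 2 ]: `split_tanh` and `boundary_normals` are illustrative names, and the weight value is arbitrary.

```python
import numpy as np

def split_tanh(z):
    """Split-type activation: the real-valued nonlinearity is applied
    separately to the real and imaginary parts of the net input,
    f(z) = tanh(Re z) + j*tanh(Im z)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def boundary_normals(w):
    """For a single complex input x = x1 + j*x2 and net = w * x, the
    real-part boundary Re(net) = 0 has normal (Re w, -Im w) and the
    imaginary-part boundary Im(net) = 0 has normal (Im w, Re w) in the
    (x1, x2) plane. Their dot product is identically zero, so the two
    decision boundaries are always orthogonal."""
    a, b = w.real, w.imag
    return np.array([a, -b]), np.array([b, a])

z = np.array([1.0 + 2.0j, -0.5 - 0.5j])   # illustrative net inputs
print(split_tanh(z))

n_re, n_im = boundary_normals(0.7 - 0.3j)  # arbitrary complex weight
print(np.dot(n_re, n_im))                  # prints 0.0: orthogonal boundaries
```

Note also that the single complex weight `0.7 - 0.3j` carries two real degrees of freedom, which is why a CVNN can match an RVNN with roughly half as many weights.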
3.1.2 Outperformance Over the Real Domain Neuron
In [ 7 ], the author described a set of complex-valued learning patterns using two rules. This pattern set has a clear correspondence with the popular XOR problem in the real domain,