do the derivatives possess any singular points. Recent studies have shown that the
orthogonal decision boundaries of a CVNN with a split-type activation function
give it superior decision-making ability compared with its real-valued counterpart
[2, 9]. The learning rules of the complex backpropagation (CBP) algorithm for a CVNN
with this function (Eq. 3.3) involve a linear combination of the derivatives of the real
and imaginary components of the output function, which effectively reduces standstill in
learning. Such a CVNN has outperformed its real-valued counterpart not only on
complex-valued problems but also on real-valued problems (Chap. 7).
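As an illustration only (the book's Eq. 3.3 is not reproduced here, and the particular real function used may differ), a split-type activation applies a real-valued function such as tanh independently to the real and imaginary parts of the net input, so each component has a smooth, bounded derivative with no singular points:

```python
import numpy as np

def split_tanh(z):
    """Split-type complex activation: apply a real tanh separately
    to the real and imaginary parts of z (one common choice)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def split_tanh_grad(z):
    """Component-wise derivatives used in split-type C-BP learning
    rules: d tanh(x)/dx = 1 - tanh(x)^2 for each part."""
    return (1.0 - np.tanh(z.real) ** 2), (1.0 - np.tanh(z.imag) ** 2)

z = np.array([0.5 + 0.5j, -1.0 + 2.0j])
print(split_tanh(z))
print(split_tanh_grad(z))
```

Because the real and imaginary derivatives are computed separately and then combined linearly in the weight update, the gradient never vanishes unless both components saturate simultaneously, which is the mechanism behind the reduced standstill mentioned above.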
3.3 Error Functions in Complex Domain
The role of the error function (EF) in data analysis is examined in this section from
a CVNN viewpoint. In almost all investigations and applications involving ANNs, the BP
applied is the one developed over a quadratic EF. This EF may not perform satisfactorily
for all real-life function approximation and classification data. The literature points
out many other functions that can take the place of the traditional quadratic EF in data
analysis, but these EF-based applications have been studied from a statistics viewpoint
rather than a neural network viewpoint. Incidentally, Rey (1983) pointed out that in
statistical analysis, replacing the EF can yield better results. It must be stressed here
that the EF over which the complex-valued backpropagation algorithm (CBP) was recently
built is also the quadratic EF. A few researchers (Werbos and Titus (1978), Gill and
Wright (1981), Fernandez (1991), Ooyen and Nienhaus (1992)) have used different EFs with
their BP, but complex error functions (EFs) have not been investigated in a systematic
fashion.
Replacing the EF assumes importance because practical data are prone to measurement
errors and outliers. If the quadratic EF were retained for analyzing such data, the
curve of best fit would not be appropriate, because the cost accrued by the chosen EF
would be inflated by the power term for far-off points (outliers). If instead the
quadratic EF were replaced with, for example, an absolute-value function, the
curve-fitting scheme for noise- and error-prone data would be more evenly balanced,
because the cost accrued by these points would be of the same order as that of the
actual data points. This even weighting yields a better curve fit than quadratic-error-based
approximation. This section surveys some EFs and studies BP and CBP from an EF viewpoint.
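The effect of the power term can be made concrete with a small numerical sketch (the data here are hypothetical, chosen only to illustrate the argument): under a quadratic EF an outlier's cost grows with the square of its residual, while under an absolute-value EF it grows only linearly.

```python
import numpy as np

# Hypothetical 1-D data: four well-behaved points and one gross outlier.
y_true = np.array([1.0, 1.1, 0.9, 1.0, 10.0])  # last point is the outlier
y_pred = np.full_like(y_true, 1.0)              # a fitted constant model

residuals = y_true - y_pred

quad_cost = 0.5 * residuals ** 2  # per-point cost under the quadratic EF
abs_cost = np.abs(residuals)      # per-point cost under the absolute EF

# Outlier cost relative to a typical point's cost under each EF:
print(quad_cost[-1] / quad_cost[1])  # ~8100: squared residual ratio
print(abs_cost[-1] / abs_cost[1])    # ~90: same order as the residual ratio
```

With a residual ratio of 90 between the outlier and a typical point, the quadratic EF weights the outlier 90² ≈ 8100 times more heavily, so the fit is dragged toward the outlier; the absolute EF keeps the weighting proportional to the residual itself, which is the even weighting referred to above.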
3.3.1 Why Vary Error Functions
Most statistical methods used for practical data analysis employ a quadratic EF.
Further, almost all learning algorithms for ANNs reported in the literature use the
mean squared error between the actual and predicted outputs as the EF. Rey
(1983) pointed out that by varying the EF in an optimization scheme, the result can