One way to accelerate the convergence of back-propagation is to adjust the learning rate parameter, as was done in our case study. However, there are other means of improving MLP performance, either within the back-propagation algorithm itself or by using a different algorithm altogether. An example of the former is an adaptive learning rate, that is, one that keeps the learning step size as large as possible while keeping learning stable. In this case, the learning rate is made responsive to the complexity of the local error surface, as sketched below.
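For instance, a minimal sketch of one well-known adaptive scheme, the 'bold driver' heuristic, is given below. The increase and decrease factors and the toy quadratic error surface are illustrative assumptions, not values from our case study.

```python
import numpy as np

def bold_driver_gd(loss_fn, grad_fn, w, lr=0.1, up=1.05, down=0.5, steps=100):
    """Gradient descent with a 'bold driver' adaptive learning rate:
    grow the step while the error falls, shrink it when it rises."""
    prev_loss = loss_fn(w)
    for _ in range(steps):
        w_trial = w - lr * grad_fn(w)
        loss = loss_fn(w_trial)
        if loss < prev_loss:       # learning is stable: accept, speed up
            w, prev_loss = w_trial, loss
            lr *= up
        else:                      # step overshot: reject, slow down
            lr *= down
    return w, lr

# Hypothetical quadratic error surface f(w) = 0.5 * w^T A w (illustration only)
A = np.array([[3.0, 0.2], [0.2, 1.0]])
loss_fn = lambda w: 0.5 * w @ A @ w
grad_fn = lambda w: A @ w

w_final, lr_final = bold_driver_gd(loss_fn, grad_fn, np.array([4.0, -2.0]))
print(w_final, lr_final)
```

The idea is simply to grow the step while the error keeps falling and to cut it back sharply as soon as a step overshoots.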
An example of the latter is the conjugate gradient algorithm (Lin & Lee, 1996). The basic back-propagation algorithm adjusts the weights in the steepest descent direction (the negative of the gradient), the direction in which the performance function decreases most rapidly. It turns out that, although the function decreases most rapidly along the negative of the gradient, this does not necessarily produce the fastest convergence. In conjugate gradient algorithms, the search is instead performed along mutually conjugate directions, which generally yields faster convergence than steepest descent.
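The contrast can be illustrated on a simple quadratic performance function. The sketch below compares steepest descent (with exact line search) against a linear conjugate gradient iteration using the Fletcher-Reeves update; this is one common variant, and the text does not specify which formulation Lin and Lee (1996) describe.

```python
import numpy as np

def steepest_descent(A, b, x, iters):
    """Minimise 0.5*x^T A x - b^T x by stepping along the negative
    gradient (the residual) with an exact line search."""
    for _ in range(iters):
        r = b - A @ x                   # negative gradient
        alpha = (r @ r) / (r @ A @ r)   # optimal step along r
        x = x + alpha * r
    return x

def conjugate_gradient(A, b, x, iters):
    """Same problem, but successive search directions are kept
    A-conjugate (Fletcher-Reeves update), so progress made along
    one direction is never undone by later steps."""
    r = b - A @ x
    d = r.copy()
    for _ in range(iters):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
    return x

# Ill-conditioned quadratic: the true minimiser is x = [0.1, 1.0]
A = np.array([[10.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
x0 = np.zeros(2)
print(steepest_descent(A, b, x0, 2))    # still far from the minimum
print(conjugate_gradient(A, b, x0, 2))  # exact after n = 2 steps
```

On this ill-conditioned two-dimensional problem the conjugate directions reach the minimum in two iterations, whereas steepest descent zig-zags across the narrow valley.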
In GNMM, rule extraction is based on an approximation of the hidden neurons' tanh activation function (equations (6) and (9)). This approximation is derived numerically by means of Sequential Quadratic Programming. Although the coefficients used in equation (9) are all kept to 14 significant digits (the maximum in MATLAB v7.2), as with any approximation there are always associated errors.
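Since equations (6) and (9) are not reproduced here, the sketch below illustrates only the general idea: fitting polynomial coefficients to the tanh function with an SQP-type solver (SciPy's SLSQP routine) and then inspecting the residual error. The odd-cubic form and the fitting interval are assumptions chosen for illustration, not the form used in equation (9).

```python
import numpy as np
from scipy.optimize import minimize

# Fit an odd cubic p(x) = a*x + b*x^3 to tanh on [-2, 2] with an
# SQP-type solver, then measure the unavoidable residual error.
xs = np.linspace(-2.0, 2.0, 401)
target = np.tanh(xs)

def sum_squared_error(coeffs):
    a, b = coeffs
    return np.sum((a * xs + b * xs**3 - target) ** 2)

res = minimize(sum_squared_error, x0=[1.0, -0.1], method='SLSQP')
a, b = res.x
max_err = np.max(np.abs(a * xs + b * xs**3 - target))
print('fitted coefficients:', a, b)
print('maximum absolute error:', max_err)   # small, but never zero
```

However many digits the coefficients are carried to, the maximum error of such a fit is bounded away from zero, which is exactly why rules extracted this way inherit an approximation error.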
Thus, methods that extract regression rules from ANNs with high accuracy are desirable. Neural networks are low-level computational structures that perform well on raw data, whereas fuzzy logic supports reasoning at a higher level, using linguistic information acquired from domain experts; rule extraction from a hybrid neuro-fuzzy system that combines the two would therefore be easier and more accurate. In particular, Kasabov (2001) proposed the Evolving Fuzzy Neural Network (EFuNN), a hybrid fuzzy-ANN intelligent-systems paradigm. EFuNN allows fuzzy rules to be constructed from the network weights, and hence supports knowledge extraction. Furthermore, it implements a strategy of dynamically growing and pruning the connectionist (i.e. ANN) architecture. A system that integrates GNMM and EFuNN would therefore offer a promising approach to the prediction of longitudinal dispersion and to rule extraction.
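As a purely illustrative sketch, and not the EFuNN procedure of Kasabov (2001) itself, the toy function below conveys the flavour of weight-based fuzzy rule extraction: strong positive and negative input weights of a hidden neuron are read as linguistic 'High' and 'Low' antecedent terms. The variable names, weights, and threshold are all hypothetical.

```python
import numpy as np

def weights_to_rule(weights, names, threshold=0.5):
    """Toy rule extraction: translate one hidden neuron's input weights
    into a linguistic IF-THEN antecedent. A strongly positive weight is
    read as 'High', a strongly negative one as 'Low'; weak weights are
    dropped. Illustration only, not the EFuNN algorithm."""
    terms = [f"{n} is High" if w > threshold else f"{n} is Low"
             for w, n in zip(weights, names) if abs(w) > threshold]
    return "IF " + " AND ".join(terms) if terms else "IF (no strong inputs)"

# Hypothetical input weights of one hidden neuron after training
names = ["flow depth", "flow velocity", "channel width"]
weights = np.array([1.8, -0.9, 0.1])
print(weights_to_rule(weights, names), "THEN dispersion is High")
```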
Moreover, GNMM, as a data-driven method, relies heavily on the quality of the data. Future work may therefore also include applying GNMM to more comprehensive data sets in order to improve its performance.
REFERENCES
Bishop, C. M. (1995). Neural networks for pattern recognition. Oxford: Oxford University Press.
Bryson, A. E., & Ho, Y. C. (1975). Applied optimal control. John Wiley & Sons.
Cannas, B., Fanni, A., See, L., & Sias, G. (2006). Data preprocessing for river flow forecasting using neural networks: Wavelet transforms and data partitioning. Physics and Chemistry of the Earth, 31(18), 1164-1171.
Cybenko, G. (1989). Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and Systems, 2, 303-314.
Deng, Z. Q., Singh, V. P., & Bengtsson, L. (2001). Longitudinal dispersion coefficient in straight rivers. Journal of Hydraulic Engineering, 127(11), 919-927.
Fischer, H. B., List, E. J., Koh, R. C. Y., Imberger, J., & Brooks, N. H. (1979). Mixing in inland and coastal waters. New York: Academic Press.
Gardner, J. W., Boilot, P., & Hines, E. L. (2005). Enhancing electronic nose performance by sensor selection using a new integer-based genetic algorithm approach. ISOEN 2003 - Selected Papers