cal properties to ensure correct operation. The other one, borrowed from the theory of Hamiltonian and gradient systems, is that a “natural” Lyapunov function is associated with defined structures of systems (Halanay & Răsvan, 1993; Răsvan, 1998). The other methods, i.e., the frequency-domain inequality of Popov type and the comparison principles, are useful substitutes in those cases where the Lyapunov function(al) is not available. Even in such cases one should expect further development of the Lyapunov method, since the Popov-like inequalities become matrix inequalities for systems with several nonlinear functions - precisely the case of AI devices, in particular of neural networks.
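For the Hopfield network - the prototypical case in which the system's structure supplies a “natural” Lyapunov function - the idea can be illustrated with a minimal sketch. The weights, sizes, and number of sweeps below are arbitrary illustration choices, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric weights with zero diagonal: the structural assumptions under
# which the Hopfield energy is a Lyapunov function for asynchronous updates.
n = 16
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.standard_normal(n)

def energy(s):
    # E(s) = -1/2 s^T W s - b^T s
    return -0.5 * s @ W @ s - b @ s

s = rng.choice([-1.0, 1.0], size=n)     # random bipolar initial state
energies = [energy(s)]
for _ in range(5):                      # a few asynchronous sweeps
    for i in rng.permutation(n):
        s[i] = 1.0 if W[i] @ s + b[i] >= 0 else -1.0
        energies.append(energy(s))

# The energy never increases along the trajectory (up to rounding).
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

The monotone decrease of `E` along every asynchronous trajectory is exactly the gradient-like behavior discussed in the text: the network's structure (symmetry, zero self-connections) guarantees a Lyapunov function without any separate search for one.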
From this point of view, we expect that the involvement of new mathematical developments will bring new qualitative results for the dynamical systems of AI, e.g., in the case of systems with time delays.
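The time-delay case can be illustrated with a small numerical sketch. The parameter values below are arbitrary choices for illustration; the condition a > |b| is a classical delay-independent sufficient condition (of Lyapunov-Krasovskii type) for global asymptotic stability of the scalar equation x'(t) = -a x(t) + b tanh(x(t - τ)):

```python
import numpy as np

# Forward-Euler simulation of the scalar delay equation
#   x'(t) = -a*x(t) + b*tanh(x(t - tau))
# Since |tanh(u)| <= |u|, the condition a > |b| guarantees that the zero
# solution is globally asymptotically stable for every delay tau.
a, b, tau = 2.0, 1.0, 1.5        # a > |b|: stable regardless of tau
dt = 0.01
d = int(tau / dt)                # delay expressed in time steps
steps = 4000                     # simulate 40 time units

x = np.empty(steps + d + 1)
x[: d + 1] = 3.0                 # constant initial history on [-tau, 0]
for k in range(d, d + steps):
    x[k + 1] = x[k] + dt * (-a * x[k] + b * np.tanh(x[k - d]))

# x starts at 3.0 and decays toward the equilibrium at 0.
```

Changing the parameters so that a < |b| can destroy this delay-independent guarantee, which is why sharper, delay-dependent criteria - and the Lyapunov functionals that yield them - remain an active research topic.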
FUTURE RESEARCH DIRECTIONS

The interaction of the two paradigms - the emergent capabilities of neural networks and the gradient-like behavior of dynamical systems as the best achievable dynamics for normal operation of such networks - has far-reaching implications. One may identify two corresponding research directions: field extension, i.e., consideration of new network structures for AI but also for other applications, and instrument development, which includes Lyapunov theory, generalized (in the La Salle sense) Lyapunov functions, and new Lyapunov functionals for time-delay systems.

Since the conditions given by Lyapunov methods are only sufficient, obtaining sharper criteria is a quasi-permanent task. This goal is achievable by improving the sharpness of the estimates, i.e., by making use of sharper inequalities. Researchers are thus in a position to apply the entire set of procedures and results of the Lyapunov methods, and also to complete and extend it. Here it is necessary to discuss further the method of V. L. Kharitonov for constructing Lyapunov functionals for time-delay systems. One recognizes here the old idea of starting from a given derivative for the Lyapunov functional in order to reach, via a suitable and not overly abstract construction, a good Lyapunov functional satisfying estimates that ensure the required qualitative behavior. Kharitonov-like constructions for various types of delay equations and various behaviors define an interesting research direction; such research could be rewarding in both fields, dynamical systems and AI devices.

REFERENCES

Arakawa, M., Hasegawa, K., & Funatsu, K. (2003). Application of the novel molecular alignment method using the Hopfield neural network to 3D-QSAR. Journal of Chemical Information and Computer Sciences, 43(5), 1396-1402.

Atencia, M., Joya, G., & Sandoval, F. (2004). Parametric identification of robotic systems with stable time-varying Hopfield networks. Neural Computing and Applications, 13(4), 270-280.

Bose, N. K., & Liang, P. (1996). Neural Network Fundamentals with Graphs, Algorithms and Applications. U.S.A.: McGraw-Hill.

Chua, L., & Yang, L. (1988). Cellular neural networks: Theory and applications. IEEE Transactions on Circuits and Systems, CAS-35, 1257-1290.

Chung, C. H., & Lee, K. S. (1991). Hopfield network application to optimal edge selection. IEEE International Joint Conference on Neural Networks, vol. 2, 1542-1547.

Cohen, M. A., & Grossberg, S. (1983). Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Transactions on Systems, Man, and Cybernetics, 13, 815-826.