ples. Section 2 presents the vocabulary needed for the description of dynamical
systems of any type, linear or nonlinear, large or small, continuous or discrete.
Section 3 presents the fundamental ideas relevant for understanding the behavior
of small systems, i.e., systems characterized by a small number of dynamical
variables. It begins with a discussion of the concept of nonlinearity itself, then
proceeds to build on it using two canonical examples: the damped, driven oscil-
lator and the logistic map. In Section 4, new issues that arise in large systems are
introduced, again in the context of two characteristic examples: the cardiac sys-
tem and the Boolean model of genetic regulatory networks. It is hoped that these
discussions will provide a context that will help readers understand the import of
the other chapters in this book.
2. DYNAMICAL SYSTEMS IN GENERAL
The term dynamical system refers to any physical or abstract entity whose
configuration at any given time can be specified by some set of numbers, called
system variables, and whose configuration at a later time is uniquely deter-
mined by its present and past configurations through a set of rules for the trans-
formation of the system variables. Two general types of transformation rules are
often encountered. In continuous-time systems the rules are expressed as equa-
tions that specify the time derivatives of the system variables in terms of their
current (and possibly past) values. In such cases, the system variables are real
numbers that vary continuously in time. The Newtonian equations of motion
describing the trajectories of planets in the solar system represent a continuous-
time dynamical system. In discrete-time systems the rules are expressed as
equations giving new values of the system variables as functions of the current
(and possibly past) values. Though classical physics tells us that all systems are
continuous-time systems at their most fundamental level, it is often convenient
to use descriptions that specify the system configuration only at a discrete set
of times and treat the effects of the continuous evolution as discrete jumps
from one configuration to another.
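The two kinds of transformation rule can be made concrete in a few lines of code. The sketch below is a minimal illustration rather than anything prescribed in this chapter: Python is a choice of convenience, the exponential-decay rule and all parameter values (the rate k, the map parameter r, the Euler step size dt) are illustrative assumptions, and the logistic map anticipates the discussion in Section 3.

```python
# Continuous-time rule: what is specified is the DERIVATIVE of the state.
# Illustrative example: exponential decay, dx/dt = -k*x.
def decay_rate(x, k=0.5):
    return -k * x

# Discrete-time rule: what is specified is the NEW value of the state.
# Illustrative example: the logistic map, x_new = r*x*(1 - x).
def logistic_step(x, r=3.7):
    return r * x * (1.0 - x)

# Continuous time: the state varies continuously; here its evolution is
# approximated by many small Euler steps of size dt.
x, dt = 1.0, 0.01
for _ in range(1000):            # from t = 0 to t = 10
    x += dt * decay_rate(x)
print("continuous-time state at t = 10:", x)

# Discrete time: the rule itself IS the jump from one configuration to
# the next; no state is defined between successive iterations.
y = 0.2
for _ in range(10):
    y = logistic_step(y)
print("discrete-time state after 10 steps:", y)
```

Note the asymmetry: the continuous-time rule must be integrated to produce a trajectory, while the discrete-time rule produces the trajectory directly by iteration.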
A set of equations describing a continuous-time dynamical system takes the
form
$$\dot{\mathbf{x}}(t) = \mathbf{f}(\mathbf{x}(t);\,\mathbf{p},t). \qquad [1]$$
Here the components of the vector x are the system variables and the vector f
represents a function of all of the system variables at fixed values of the parame-
ters p. The overdot on the left indicates a first time derivative.¹ Note that f can
depend explicitly on time, as would be the case, for example, in a system driven
by a time-varying external force. In some systems, time delays associated with
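As a concrete reading of Eq. [1], the sketch below encodes the damped, driven oscillator, one of the canonical examples taken up in Section 3, as a function f of the state x at fixed parameters p, with explicit time dependence entering through the drive. Python with SciPy is an assumption of convenience here, and all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Eq. [1]: dx/dt = f(x(t); p, t) for a damped, driven oscillator.
# System variables x = (position, velocity);
# parameters p = (gamma, omega0, A, omega_d).
def f(t, x, gamma, omega0, A, omega_d):
    pos, vel = x
    # The A*cos(omega_d*t) drive is the explicit time dependence of f.
    acc = -gamma * vel - omega0**2 * pos + A * np.cos(omega_d * t)
    return [vel, acc]

p = (0.2, 1.0, 0.5, 0.8)   # illustrative parameter values
x0 = [1.0, 0.0]            # initial configuration

sol = solve_ivp(f, t_span=(0.0, 50.0), y0=x0, args=p)
print("state at t = 50:", sol.y[:, -1])
```

The driving term is what makes f depend explicitly on t; setting A = 0 recovers an autonomous system whose rule depends on the configuration alone.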