Nonlinear and Nonparametric Models. Nonlinear models are obviously appealing, and when a particular parametric form of model is available, reasonably straightforward modifications of the linear machinery can be used to fit, evaluate, and forecast the model (55, chap. 9). However, it is often impractical to settle on a good parametric form beforehand. In these cases, one must turn to nonparametric models, as discussed in §2.2; neural networks are a particular favorite here (35). The so-called kernel smoothing methods are also particularly well-developed for time series, and often perform almost as well as parametric models (66). Finally, information theory provides universal prediction methods, which promise to asymptotically approach the best possible prediction, starting from exactly no background knowledge. This power is paid for by demanding a long initial training phase used to infer the structure of the process, during which predictions are much worse than many other methods could deliver (67).
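To make the kernel-smoothing idea concrete, the sketch below implements a simple Nadaraya-Watson one-step-ahead forecaster: it weights each past continuation of the series by how closely the k observations preceding it resemble the most recent k observations. The lag count, bandwidth, and toy series are illustrative choices, not prescriptions taken from the references cited above.

```python
import numpy as np

def kernel_forecast(y, k=3, bandwidth=0.5):
    """One-step-ahead Nadaraya-Watson forecast from the last k observations.

    Each past "history" of k consecutive values is compared with the most
    recent history; the values that followed similar histories are averaged
    with Gaussian-kernel weights.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Histories: row i = (y[i], ..., y[i+k-1]); the value that followed it is y[i+k].
    X = np.array([y[i:i + k] for i in range(n - k)])
    targets = y[k:]
    query = y[-k:]                           # the most recent k observations
    d2 = np.sum((X - query) ** 2, axis=1)    # squared distance of each history to the query
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return np.sum(w * targets) / np.sum(w)

# Illustrative use on a noisy oscillation.
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.standard_normal(200)
print(kernel_forecast(y, k=3, bandwidth=0.5))
```

In practice the lag count and bandwidth would be chosen by cross-validation rather than fixed by hand, but the structure of the estimator is the same.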
3.4. The Nonlinear Dynamics Approach
The younger approach to the analysis of time series comes from nonlinear dynamics, and is intimately bound up with the state-space approach described in §3.1 above. The idea is that the dynamics on the state space can be determined directly from observations, at least if certain conditions are met.
The central result here is the Takens Embedding Theorem (68); a simplified, slightly inaccurate version is as follows. Suppose the d-dimensional state vector x_t evolves according to an unknown but continuous and (crucially) deterministic dynamic. Suppose, too, that the one-dimensional observable y is a smooth function of x, and "coupled" to all the components of x. Now at any time we can look not just at the present measurement y(t), but also at observations made at times removed from us by multiples of some lag τ: y(t − τ), y(t − 2τ), etc. If we use k lags, we have a k-dimensional vector. One might expect that, as the number of lags is increased, the motion in the lagged space will become more and more predictable, and perhaps in the limit k → ∞ would become deterministic. In fact, the dynamics of the lagged vectors become deterministic at a finite dimension; not only that, but the deterministic dynamics are completely equivalent to those of the original state space! (More exactly, they are related by a smooth, invertible change of coordinates, or diffeomorphism.) The magic embedding dimension k is at most 2d + 1, and often less.
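In code, the delay reconstruction amounts to stacking lagged copies of the scalar series. The sketch below assumes a uniformly sampled series; the embedding dimension k and the lag τ are placeholder values, since the theorem bounds k but says nothing about how to choose τ in practice (heuristics such as the first minimum of the mutual information are commonly used).

```python
import numpy as np

def delay_embed(y, k, tau):
    """Return the matrix whose rows are the k-dimensional delay vectors
    (y[t], y[t - tau], ..., y[t - (k-1)*tau]) for all valid times t."""
    y = np.asarray(y, dtype=float)
    n = len(y) - (k - 1) * tau            # number of complete delay vectors
    if n <= 0:
        raise ValueError("series too short for this choice of k and tau")
    return np.column_stack([y[(k - 1 - j) * tau : (k - 1 - j) * tau + n]
                            for j in range(k)])

# Illustrative reconstruction from a quasi-periodic scalar signal.
t = np.linspace(0, 50, 5000)
y = np.sin(t) * np.cos(0.7 * t)
Z = delay_embed(y, k=3, tau=25)            # k and tau are illustrative choices
print(Z.shape)                             # (5000 - 2*25, 3)
```

Each row of Z is one point of the reconstructed trajectory, and it is on this lagged space that the geometric quantities discussed next are estimated.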
Given an appropriate reconstruction via embedding, one can investigate many aspects of the dynamics. Because the reconstructed space is related to the original state space by a smooth change of coordinates, any geometric property that survives such treatment is the same for both spaces. These include the dimension of the attractor (commonly estimated via the "correlation dimension"), the Lyapunov exponents (which measure the degree of sensitivity to initial conditions), and certain qualitative properties of the autocorrelation function and power spectrum. Also preserved