9.6. Evaluating Models of Complex Systems
Honerkamp (58) is great, but curiously almost unknown. Gershenfeld (245)
is an extraordinarily readable encyclopedia of applied mathematics, especially
of methods which can be used on real data. Gardiner (270) is also useful.
Monte Carlo. The old book by Hammersley and Handscomb (140) is con-
cise, clear, and has no particular prerequisites beyond a working knowledge of
calculus and probability. (271) and (272) are both good introductions for readers
with some grasp of statistical mechanics. There are also very nice discussions in
(58,31,142). Beckerman (143) makes Monte Carlo methods the starting point for
a fascinating and highly unconventional exploration of statistical mechanics,
Markov random fields, synchronization, and cooperative computation in neural
and perceptual systems.
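Since this section is a reading list rather than a tutorial, the following is only a minimal sketch of the basic Monte Carlo idea behind all of the texts cited above: estimate a quantity by averaging over random samples. The example (estimating pi by sampling the unit square) is purely illustrative and not drawn from any of the cited books; the function name and seed are my own choices.

```python
import random

def mc_pi(n_samples, seed=0):
    """Estimate pi by drawing points uniformly from the unit square
    and counting the fraction that fall inside the quarter circle.
    The standard error shrinks like 1/sqrt(n_samples)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

print(mc_pi(100_000))
```

The books cited above go far beyond this, of course: variance reduction, importance sampling, and Markov chain Monte Carlo for sampling from distributions known only up to a constant.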
Experimental design. Bypass the cookbook texts on standard designs, and
consult Atkinson and Donev (155) directly.
Ecological inference. (273) is at once a good introduction and the source
of important and practical new methods.
9.7. Information Theory
Information theory appeared in essentially its modern form with Shannon's
classic paper (274), though there had been predecessors in both communications
(275) and statistics, notably Fisher (see Kullback (159) for an exposition of
these notions), and similar ideas were developed by Wiener and von Neumann,
more or less independently of Shannon (56). Cover and Thomas (158) is, de-
servedly, the standard modern textbook and reference; it is highly suitable as an
introduction, and handles almost every question most users will, in practice,
want to ask. (276) is a more mathematically rigorous treatment, now free online.
On neural information theory, (172) is seminal, well-written, still very valuable,
and largely self-contained. On the relationship between physics and information,
the best reference is still the volume edited by Zurek (12), and the thought-
provoking paper by Margolus.
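As a small concrete illustration of the central quantity in the works cited above, here is a sketch of Shannon entropy computed from an empirical distribution. This is the textbook definition (as in Cover and Thomas), not code from any of the cited sources.

```python
import math
from collections import Counter

def shannon_entropy(xs):
    """Shannon entropy, in bits, of the empirical distribution of
    the symbols in xs: H = -sum_i p_i * log2(p_i)."""
    counts = Counter(xs)
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aabb"))  # 1.0 bit: two equiprobable symbols
```

A constant sequence has entropy zero; a uniform distribution over k symbols attains the maximum, log2(k) bits.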
9.8. Complexity Measures
The best available survey of complexity measures is that by Badii and Politi
(10); the volume edited by Peliti and Vulpiani (277), while dated, is still valu-
able. Edmonds (278) is an online bibliography, fairly comprehensive through
1997. (195) has an extensive literature review.
On Kolmogorov complexity, see Li and Vitányi (177). While the idea of
measuring complexity by the length of descriptions is usually credited to the trio
of Kolmogorov, Solomonoff, and Chaitin, it is implicit in von Neumann's 1949
lectures on the "Theory and Organization of Complicated Automata" (252, Part
I, especially pp. 42-56).
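Kolmogorov complexity itself is uncomputable, but a common classroom illustration of the description-length idea, not taken from the works cited above, is to use the output size of a general-purpose compressor as a crude, computable upper bound: highly patterned strings admit short descriptions, while random-looking ones do not. The helper name below is my own.

```python
import random
import zlib

def description_length(s: str) -> int:
    """Byte length of a compressed encoding of s: a crude, computable
    upper bound in the spirit of description-length complexity."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

regular = "ab" * 500                      # highly patterned, length 1000
rng = random.Random(0)                    # fixed seed for reproducibility
irregular = "".join(rng.choice("ab") for _ in range(1000))

# The patterned string compresses to far fewer bytes than the
# pseudo-random string of the same length.
print(description_length(regular), description_length(irregular))
```

This is only an upper bound relative to one particular compressor; the invariance theorems and the precise definitions are in Li and Vitányi (177).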