1 Introduction
"Classical science is a conversation between theory and experiment. A scientist can start at either end - with theory or experiment - but progress usually demands the union of both a theory to make sense of the experiments and data to verify the theory. Technological novelties such as computer models are neither here nor there. A really good dynamic computer model - of the global atmosphere, for example - is like a theory that throws off data, or data with a built-in theory. It's easy to see why such technological worlds are regarded with such wariness by science - they seem corrupted coming and going. But in fact, these models yield a third kind of truth, an experiential synthesis - a parallel existence, so to speak" [57].
The quantitative evaluation of any experiment benefits from a model that defines a consistent set of parameters, allowing comparisons to be made among results from different experiments. In practice, the experiments or systems in question quickly attain a high degree of complexity that is reflected in the models used to describe them. Intuitive understanding or even analytic solutions of such models are limited to a few cases that are usually of little practical relevance. Computer technology in combination with numerical mathematics provides the means of solving complex models under practically relevant boundary conditions. Since the necessary tools have become widely available and affordable, engineers and decision makers in the industrial and public sectors have come to rely on quantitative computer analyses in order to develop products or assess risks more quickly and cost-effectively by partially eliminating expensive experiments [86]. Cost-effectiveness and possible liabilities require thoroughly verified and validated models of sufficient quantitative accuracy, and their application domain usually overlaps to a high degree with their validation domain; the only exception is high-consequence systems for which full-scale physical testing is never an option [86]. Models in contemporary bioengineering are at a different developmental stage, with different modelling challenges taking precedence. The particular difficulties in interpreting experimental data related to cells and biological tissues are rooted in the large sample variability and in the nonlinear, multiphasic, heterogeneous, anisotropic, viscoelastic and often active nature of these tissues, as well as the usually large deformations they undergo. However, computational models are being used to analyse the mechanical behaviour of biological tissues, leading to the development of a wide range of constitutive models and furthering our understanding of the structure-function relationships of the main tissues prevalent in the musculoskeletal and cardiovascular systems.
Besides this tight synergistic coupling of experiment and model analysis, computer simulations are becoming more widespread. In contrast to purely quantitative analyses of a known, i.e. experimentally well-defined, system, simulations aim to go a step further. Simulations create a virtual reality in which the modeller can alter certain aspects of the system and its environment to test different hypotheses by observing the effect they have on the system. Therefore, simulations often focus on the time course, i.e. the transient development, of a system rather than analysing one specific state, and involve theories from a