this because there are fewer preconceptions associated with a web than with the other
more familiar terms. A complex web in some cases would be the same as a “complex
adaptive system” and in other cases it would be indistinguishable from a “complex
graphical network” and we intend to use it for either, both or neither depending on the
context.
The term web spans a continuum from complex at one end to simple
at the other. It is possible to associate simplicity with absolute knowability, so we can
know everything about a simple web, even one that is dynamic. On the other hand,
we associate complexity with disorder, which is to say with limited knowability. So it is
probably not possible to know everything about a complex web. This rather comfortable
separation of the world of webs into the complex and the simple, or the knowable and
the unknowable, breaks down under scrutiny, but is useful for some purposes, which we
intend to explore.
As emphasized by Schrödinger, all the laws in the physical and life sciences are
statistical in nature. This is especially true when dealing with webs. Most physical laws
presuppose that the number of web elements is very large. Consider the example
of a gas in a container under pressure at a given temperature. If the container confines
N molecules of gas then at any moment in time the relations among the thermodynamic
variables such as pressure and volume, given by Boyle's law, could be tested and would
be found to be inaccurate by departures of order √N, so that the average of any intensive
macroscopic variable has fluctuations of order 1/√N, on the basis of the ideas of Gauss.
According to Schrödinger [28] the √N rule concerns the degree of inaccuracy to
be expected in any physical law. The simple fact is that the descriptive paradigm of
natural order, the empirical laws of physics, is inaccurate within a probable error of order
1/√N. The application of these ideas to physical phenomena has led to the view that
the macroscopic laws we observe in natural phenomena, such as Boyle's law, Ohm's
law, Fick's law, and so on, are all consequences of the interaction of a large number of
particles, and therefore can be described by means of statistical physics.
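The 1/√N scaling of fluctuations can be checked numerically. The sketch below is not the gas example itself but a minimal stand-in, assuming each of N "particles" contributes an independent Gaussian value: averaging N such values over many trials shows the spread of the average shrinking as 1/√N, so quadrupling N roughly halves the fluctuation.

```python
import random
import statistics

def relative_fluctuation(n_particles, n_trials=1000):
    """Standard deviation of the sample mean of n_particles i.i.d. values.

    Each trial averages n_particles random contributions (stand-ins for
    molecular quantities); by the Gauss/CLT argument the spread of that
    average shrinks as 1/sqrt(N).
    """
    means = [
        statistics.fmean(random.gauss(0.0, 1.0) for _ in range(n_particles))
        for _ in range(n_trials)
    ]
    return statistics.stdev(means)

random.seed(1)
for n in (100, 400, 1600):
    # Quadrupling N should roughly halve the fluctuation: 1/sqrt(N) scaling.
    print(n, relative_fluctuation(n))
```

The same scaling would appear for any finite-variance distribution of the individual contributions; the Gaussian choice here is only for convenience.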
Much has changed in the sixty years since Schrödinger first passed along his insights
regarding the application of physical law to biology [ 28 ]. In particular, the way we
understand the statistics of complex phenomena in the context of the natural and social
sciences as well as in life science is quite different from what it was then. Herein we
examine evidence for the deviation from his √N rule, which suggests that the statistical
basis for Schrödinger's understanding of biology, or complexity in general, has changed.
The normal statistics resulting from the central limit theorem (CLT) have been replaced
by apparently more bizarre statistics indicating that the CLT is no longer applicable
when the phenomena become complex. What has not changed in this time is the intu-
ition that webs in the natural, social and life sciences are complex, or, stated differently,
complex webs can be used to categorize phenomena from all these disciplines.
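The breakdown of the CLT can also be illustrated numerically. The sketch below uses the standard Cauchy distribution as a stand-in for the heavy-tailed statistics discussed here (an assumption for illustration, not an example from the text): the mean of N Cauchy draws is itself Cauchy, so its spread, measured by the interquartile range since the variance is undefined, does not shrink with N at all, in contrast to the 1/√N behavior of finite-variance data.

```python
import math
import random
import statistics

def cauchy():
    """Standard Cauchy draw via the inverse-CDF method."""
    return math.tan(math.pi * (random.random() - 0.5))

def iqr_of_mean(n, trials=3000):
    """Interquartile range of the sample mean of n Cauchy draws.

    For finite-variance data this spread would shrink as 1/sqrt(n); for
    Cauchy data the mean of n draws is again standard Cauchy, so the
    spread stays put no matter how large n becomes.
    """
    means = [statistics.fmean(cauchy() for _ in range(n)) for _ in range(trials)]
    q1, _, q3 = statistics.quantiles(means, n=4)
    return q3 - q1

random.seed(2)
print(iqr_of_mean(10))    # ~2, the IQR of a standard Cauchy
print(iqr_of_mean(1000))  # still ~2: averaging does not tame the fluctuations
```

This is the simplest case in which the CLT premise (finite variance) fails; the heavy-tailed laws encountered in complex webs behave in the same qualitative way, with fluctuations that decay more slowly than 1/√N or not at all.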
The words complex and complicated are often used interchangeably. It is
only recently that science has recognized that a process need not be complicated to
generate complexity; that is, complex phenomena need not have complex or compli-
cated dynamical descriptions. In such cases the complicated nature of the phenomena
resides in the way the various scales contributing to the process are interconnected. This