to describe by this term two states of affairs. First, we may wish to account
for apparent relationships between elements of a set which would impose
some constraints as to the possible arrangements of the elements of this
system. As the organization of the system grows, more and more of these
relations should become apparent. Second, it seems to me that order has a
relative connotation, rather than an absolute one, namely, with respect to
the maximum disorder the elements of the set may be able to display. This
suggests that it would be convenient if the measure of order were to assume
values between zero and unity, accounting in the first case for maximum
disorder and, in the second case, for maximum order. This eliminates the
choice of “neg-entropy” as a measure of order, because neg-entropy always
assumes finite values for systems in complete disorder. However,
what Shannon 3 has defined as “redundancy” seems to be tailor-made for
describing order as I like to think of it. Using Shannon's definition for
redundancy we have:
\[
R = 1 - \frac{H}{H_m} \tag{2}
\]
whereby H/H_m is the ratio of the entropy H of an information source to the
maximum value, H_m, it could have while still restricted to the same symbols.
Shannon calls this ratio the “relative entropy.” Clearly, this expression
fulfills the requirements for a measure of order as I have listed them
before. If the system is in its maximum disorder, H = H_m, R becomes zero;
while, if the elements of the system are arranged such that, given one
element, the positions of all other elements are determined, the entropy—
or the degree of uncertainty—vanishes, and R becomes unity, indicating
perfect order.
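As a concrete illustration (the numeric example is mine, not the author's), Shannon's redundancy can be computed directly for a finite source of n symbols, taking H_m = log2(n), the entropy of the equiprobable case:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p_i log2 p_i (terms with p_i = 0 contribute nothing)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Shannon's redundancy R = 1 - H/H_m (eq. 2), with H_m = log2(n)
    the maximum entropy attainable over the same n symbols."""
    return 1 - entropy(probs) / math.log2(len(probs))

# Four equiprobable symbols: maximum disorder, R = 0.
print(redundancy([0.25, 0.25, 0.25, 0.25]))  # → 0.0
# One symbol certain: uncertainty vanishes, perfect order, R = 1.
print(redundancy([1.0, 0.0, 0.0, 0.0]))      # → 1.0
```

Any intermediate distribution lands strictly between these two extremes, as the measure requires.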
What we expect from a self-organizing system is, of course, that, given
some initial value of order in the system, this order is going to increase as
time goes on. With our expression (2) we can at once state the criterion for
a system to be self-organizing, namely, that the rate of change of R should
be positive:
\[
\frac{dR}{dt} > 0 \tag{3}
\]
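This criterion can be checked numerically on a toy process (the sharpening dynamics below are my own illustrative assumption, not the author's model): a distribution that progressively concentrates its probability mass on one symbol exhibits monotonically increasing R, i.e. dR/dt > 0 at every step:

```python
import math

def redundancy(probs):
    """R = 1 - H/H_m for a distribution over len(probs) symbols (eq. 2)."""
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return 1 - h / math.log2(len(probs))

def sharpen(probs, rate=0.1):
    """Hypothetical dynamics: shift a fraction of all probability mass
    toward the currently most probable symbol (total mass stays 1)."""
    k = probs.index(max(probs))
    out = [(1 - rate) * p for p in probs]
    out[k] += rate
    return out

# Start slightly away from maximum disorder and iterate.
p = [0.28, 0.24, 0.24, 0.24]
values = []
for _ in range(20):
    values.append(redundancy(p))
    p = sharpen(p)

# The self-organization criterion (3): R increases at every step.
assert all(b > a for a, b in zip(values, values[1:]))
```

The assertion holds because the distribution moves along a straight line toward a vertex of the simplex, so its entropy falls and its redundancy rises monotonically.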
Differentiating eq. (2) with respect to time and using the inequality (3)
we have:
\[
\frac{dR}{dt} = -\,\frac{H_m \dfrac{dH}{dt} - H \dfrac{dH_m}{dt}}{H_m^{2}} \tag{4}
\]
Since H_m^2 > 0 under all conditions (unless we start out with systems which
can only be thought of as being always in perfect order: H_m = 0), we find