ists,” etc.: “the whole is greater than the sum of its parts.” As one of my col-
leagues once remarked: “can't the numskulls even add?”
But if a measure function M is introduced, then the holistic sentiment
can be made precise: “The measure of the sum of the parts is not the sum
of the measures of the parts”: M(Σ Tᵢ) ≠ Σ M(Tᵢ), (i = 1, 2, 3, 4 ... n). If the
measure function is super-additive, then indeed the holistic motto is justified. Let us take two parts (a, b) and for our measure function squaring, ( )². Then in fact (a + b)² is greater than a² + b², for a² + 2ab + b² exceeds a² + b² by exactly the systemic reciprocity part ab + ba, which, by symmetry (commutativity, ab = ba), equals 2ab.
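The super-additivity of the squaring measure can be sketched in a few lines of Python (the names M, a, and b are illustrative, not from the text):

```python
def M(x):
    # Squaring as a super-additive measure function.
    return x ** 2

a, b = 3.0, 4.0

whole = M(a + b)        # measure of the composed whole: (a + b)^2
parts = M(a) + M(b)     # sum of the parts' measures: a^2 + b^2

# The whole exceeds the sum of its parts by exactly the
# systemic reciprocity term ab + ba = 2ab.
surplus = whole - parts
print(whole, parts, surplus)   # prints 49.0 25.0 24.0
```

For a = 3 and b = 4 the surplus is 24, which is precisely 2ab.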
A first step in the generalization of the measure function permits us to
establish the rules of the game of an algebra of composition, in which
one, as previously, regards the distributive law only as a special case vis-à-vis operators. If K is some composition (addition, multiplication, logical implication, etc.), then, just as previously: Op[K(f,g)] ≠ K[Op(f), Op(g)].
That is to say, the result of an operation Op on a system constructed via
the K -composition is not equivalent to a system constructed via the K -
composition of the results of the operator Op .
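This non-interchangeability can be checked with a minimal Python sketch (the choices of squaring for Op and addition for K are illustrative):

```python
def Op(f):
    # An operator that does not distribute over addition: squaring.
    return f ** 2

def K(f, g):
    # A composition rule: ordinary addition.
    return f + g

f, g = 2, 5
compose_then_operate = Op(K(f, g))       # (2 + 5)^2 = 49
operate_then_compose = K(Op(f), Op(g))   # 4 + 25 = 29

# Op[K(f, g)] != K[Op(f), Op(g)]
print(compose_then_operate, operate_then_compose)   # prints 49 29
```

Operating on the composed system and composing the operated-on parts give different results, exactly as the proposition states.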
This proposition plays an important role for the autopoieticists, who
indeed always insist that the properties of the autopoietic system cannot be
expressed by the properties of its components.
Now just two cases are worth mentioning (a restriction and an extension), which together allow the interchange of operations and compositions:
(i) Homogeneous composition: let K be the composition rule; then Op[K(f,g)] = K[Op(f), Op(g)];
(ii) Superposition: let K and C be composition rules; then Op[K(f,g)] = C[Op(f), Op(g)].
This formulation moved the inventors of information theory to follow the
example of Boltzmann and choose the logarithmic function for the entropy
H (here Op) of a signal source. Since, when two sources with signal repertoires n₁, n₂ are composed, the new repertoire is n₁ × n₂, the new entropy is simply the sum of the former two: H(n₁ × n₂) = H(n₁) + H(n₂), for log(a · b) = log(a) + log(b).
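This is superposition with H as the operator, multiplication of repertoires as K, and addition of entropies as C. A minimal Python check (assuming equiprobable signals, so that H reduces to the logarithm of the repertoire size):

```python
import math

def H(n):
    # Entropy (in bits) of a source with n equiprobable signals.
    return math.log2(n)

n1, n2 = 8, 4
combined = H(n1 * n2)       # H(32) = 5.0 bits
summed = H(n1) + H(n2)      # 3.0 + 2.0 = 5.0 bits

# Op applied to the K-composition equals the C-composition of the results.
assert math.isclose(combined, summed)
```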
If you consider the “composition” in Figure 8 more closely, you will see
that it is in principle impossible to arrange the x and the u loops on the
paper in such a way that they don't intersect one another. One must raise
either the x or the u off the paper into the “third dimension” in order to
add the two recursions to the system in such a way that they are indepen-
dent of one another. This can be made even clearer if one dispenses
with drawing the external lines, by rolling the DS-system into a cylinder around the u-axis, so that the x-output and x-input edges are merged.
One can also get rid of the outer u-loop by bending the cylinder into a ring and melding the upper and lower circular ends: this makes u-out