with ecologists and economists elsewhere included in the set. But just as real
brains make few appearances in Design for a Brain, the appearances of real
physiology and so on are notable by their infrequency in An Introduction to
Cybernetics. The truly revealing definition of cybernetics that Ashby gives is on
page 2: cybernetics offers “the framework on which all individual machines
may be ordered, related and understood.” 51
An Introduction to Cybernetics is distinguished from Design for a Brain by one
major stylistic innovation, the introduction of a matrix notation for the trans-
formation of machine states in discrete time steps (in contrast to the continu-
ous time of the equations for a state-determined system). Ontologically, this
highlights for the reader that Ashby's concern is with change in time, and,
indeed, the title of the first substantive chapter, chapter 2, is “Change” (with
subheadings “Transformation” and “Repeated Change”). The new notation is
primarily put to work in an analysis of the regulatory capacity of machines.
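To give a concrete sense of what such a transformation is: in this notation a
transformation assigns to each state of a machine a single successor state, and
"repeated change" is simply the iteration of that rule, one application per
discrete time step. The following Python sketch is my own illustration of the
idea, with invented state names, not a notation taken from Ashby's book:

    # A transformation in this sense: each state has a single successor.
    transform = {"a": "b", "b": "c", "c": "c"}

    def run(transform, state, steps):
        """Apply the transformation repeatedly, once per discrete time step."""
        trajectory = [state]
        for _ in range(steps):
            state = transform[state]
            trajectory.append(state)
        return trajectory

    print(run(transform, "a", 4))  # ['a', 'b', 'c', 'c', 'c']
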
“Regulation” is one of the new terms that appeared in Ashby's list of the basic
ideas of cybernetics above, though its meaning is obvious enough. All of the
machines we have discussed thus far—thermostats, servomechanisms, the
homeostat, DAMS—are regulators of various degrees of sophistication, acting
to keep some variables within limits (the temperature in a room, the essential
variables of the body). What Ashby adds to the general discussion of regula-
tion in An Introduction to Cybernetics, and his claim to undying eponymous
fame, is the law of requisite variety, which forms the centerpiece of the topic
and is known to his admirers as Ashby's law. This connects to the other novel
terms in An Introduction to Cybernetics's list of basic ideas of cybernetics—in-
formation, coding, and noise—and thence to Claude Shannon's foundational
work in information theory (Shannon and Weaver 1963 [1949]). One could, in
fact, take this interest in “information” as definitive of Ashby's mature work.
I have no wish to enter into information theory here; it is a field in its own
right. But I will briefly explain the law of requisite variety. 52
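Stated compactly (this is my gloss, not a quotation), the law says that a
regulator can reduce the variety reaching the outcomes it protects only by as
much variety as it itself commands: only variety can destroy variety. If
variety is measured by the logarithm of the number of distinguishable
possibilities, the law becomes a simple inequality; the numbers in this Python
sketch are invented for illustration:

    import math

    def variety_bits(n):
        """Variety of a set, measured as log2 of the number of distinct possibilities."""
        return math.log2(n)

    disturbances = 8  # distinct disturbances the environment can produce
    responses = 4     # distinct responses the regulator can make

    # Law of requisite variety, in logarithmic form:
    #   V(outcomes) >= V(disturbances) - V(regulator)
    residual = variety_bits(disturbances) - variety_bits(responses)
    print(residual)  # at least 1.0 bit of variety still reaches the outcomes

On this sketch, a regulator with too little variety cannot hold its essential
variables to a single value; the shortfall shows up as residual variation in
the outcomes.
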
Shannon was concerned with questions of efficiency in sending messages
down communication channels such as telephone lines, and he defined the
quantity of information transmitted in terms of a selection between the total
number of possible messages. This total can be characterized as the variety of
the set of messages. If the set comprised just two possible messages—say, “yes”
or “no” in answer to some question—then getting an answer one way or the
other would count as the transmission of one bit (in the technical sense) of
information in selecting between the two options. In effect, Ashby transposed
information theory from a representational idiom, having to do with mes-
sages and communication, to a performative one, having to do with machines