In Shannon's example of a telephone call, the elements of the system are the person speaking; the telephone into which they speak; the message; the telephone at the other end; and the person listening.
One of Shannon's key steps was to separate the technical problems of delivering a message from any consideration of its semantic content. This enabled engineers to concentrate on the message delivery system itself. Shannon's concerns were how to find the most efficient way of encoding what he called information in a particular coding system in a noiseless environment, and how to deal with the problem of noise when it occurred. 'Noise' was Shannon's term for the elements of a signal that are extraneous to the message being transmitted. He adopted the term 'entropy' from thermodynamics to refer to the average information content of a message source, computed from the statistical properties of that source; a source's entropy sets the limit on how efficiently its messages can be encoded and transmitted. Through these means Shannon developed a successful general theory for mathematically calculating the efficiency of a communications system, applicable to both analogue and digital systems.
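Shannon's entropy measure can be made concrete with a short calculation. The sketch below is a minimal illustration rather than anything taken from Shannon's paper: it assumes a discrete source whose symbol probabilities are known, and returns the entropy in bits per symbol, the lower bound on the average length of any lossless encoding of that source.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete source,
    in bits per symbol; zero-probability symbols contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Example: a four-symbol source with skewed statistics needs only
# 1.75 bits per symbol on average, against the 2 bits per symbol
# that a fixed-length code for four symbols would spend.
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

The more predictable the source, the lower its entropy and the more its messages can be compressed; a source whose symbols are all equally likely has maximal entropy and leaves nothing to compress.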
After the War, Shannon's theory was of great use in the burgeoning development of binary digital computers, for which his emphasis on binary logic made the application of his ideas particularly appropriate, in the expansion and technological advance of telecommunications, telegraphy, radio and television, and in servo-mechanical devices using feedback signals. The concept of redundancy was of great help in building efficient communications systems intended to operate in 'noisy' conditions, using 'redundant check bits', elements of data designed to enable double-checking of message transmission.
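The simplest such check bit is a single parity bit. The sketch below is an illustrative example only, not a scheme prescribed in Shannon's paper: the sender appends one extra bit so that the word contains an even number of 1s, and the receiver recomputes the parity to detect whether noise has flipped a bit in transit.

```python
def add_parity_bit(bits):
    """Append one redundant check bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(received):
    """True if the received word still has even parity; a single
    flipped bit makes the parity odd and exposes the error."""
    return sum(received) % 2 == 0

sent = add_parity_bit([1, 0, 1, 1])          # -> [1, 0, 1, 1, 1]
received = sent.copy()
received[2] ^= 1                             # noise flips one bit
print(parity_ok(sent), parity_ok(received))  # True False
```

A single parity bit can only detect an odd number of flipped bits; practical systems spend more check bits, as in Hamming codes or cyclic redundancy checks, to locate and correct errors as well as detect them.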
For engineers he presented an abstract model for the successful technical transmission and reception of information. Shannon was reluctant to use the word 'information' in his paper, knowing that it would cause confusion about the purely technical nature and potential applications of his ideas. His reluctance was justified as Information Theory began to be applied in areas outside electrical engineering, including cognition, biology, linguistics, psychology, economics and physics.