granted in 1940.
In 1941, Shannon obtained a job as a research mathematician at AT&T Bell
Labs in New Jersey. In 1942, he collaborated with John Riordan to publish
a paper whose topic was the number of two-terminal series-parallel networks,
which generalized seminal results published by MacMahon in 1892. By 1948
he had published the paper cited at the outset of this section, which essentially
founded Information Theory. In this paper he set forth a linear schematic
model of an information system, a revolutionary new idea that described the
measurement of information via binary digits, and in it he used the word
“bit” in print for the first time. At that time communication was via nondigital
means, the transmission of electromagnetic waves through a wire. What we
take for granted today, a continuous flow of bits through a wire, was a
revolutionary idea at that time. He also provided a rigorous mathematical
definition of information, based upon the cryptological work he accomplished
during World War II. Shannon's assumption was that information sources
generate words composed of a finite number of symbols sent over a channel.
He provided a mechanism for analyzing a sequence of error terms in a signal
to determine their inherent type, assigning them to the designed type of the
control system. He demonstrated how adding extra bits to a signal could
correct transmission errors. These notions were used and extended by engineers
and mathematicians to provide efficient and error-free transmission through
noisy channels. The development of Information Theory made possible the
development of digital systems.
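The idea of correcting errors by adding extra bits can be illustrated with the simplest such scheme, a threefold repetition code. This is only a minimal sketch of the principle, not Shannon's own construction: each data bit is sent three times, and the receiver takes a majority vote, so any single flipped bit per block is corrected.

```python
# Illustrative sketch (not Shannon's construction): a 3-fold repetition
# code shows how redundant bits let a receiver correct single-bit errors.

def encode(bits):
    """Repeat each data bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each block of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

data = [1, 0, 1, 1]
sent = encode(data)               # 12 bits on the channel instead of 4
corrupted = sent[:]
corrupted[4] ^= 1                 # the noisy channel flips one bit
assert decode(corrupted) == data  # the single error is corrected
```

The price of this reliability is rate: three channel bits carry one data bit. Shannon's theory showed that far more efficient codes exist, a result later realized by Hamming codes and their successors.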
Shannon also worked on AI problems. By 1950, he had written a computer
program that appeared in a publication entitled Programming a Computer for
Playing Chess. This publication led to the first machine chess game, played by
the Los Alamos MANIAC machine in 1956. This was also the year in which he
published an important paper demonstrating how a Turing machine can be
constructed using only two states. In 1957, he was appointed to the faculty
at MIT, but remained a consultant to AT&T Bell Labs until 1972. He received
many awards, among them the Alfred Noble Prize of the American Institute of
American Engineers in 1940, the National Medal of Science in 1966, the Audio
Engineering Society Gold Medal in 1985, and the Kyoto Prize in the same year. In
his last few years he suffered from Alzheimer's disease, and was confined to a
Massachusetts nursing home. He died at age 84 in Medford, Massachusetts, on
February 24, 2001.
Shannon was a genius in his own realm, making contributions that paved the
way for our modern digital revolution. Marvin Minsky wrote of him that “For
him, the harder the problem might seem, the better the chance to find something
new.” He certainly found many “new” concepts, without which we could not
have the world that we have today. He was an inspiration to generations of
mathematicians and computer scientists. One of Shannon's ideas, contained in
his paper [249], was a notion related to his formulation of information. We will
study this notion in the following section, for which the reader will need some
familiarity with the basic probability theory presented in Appendix E on page 543.