Table 6.18 A structured register while-program for the sum of two numbers

    input 1
    input 2
    while R2 > 0
        dec 2
        inc 1
    end
    output 1
and the order of their execution. In this way, program analysis becomes easier
and programs may be developed in a more reliable manner.
The algorithm underlying the register program of Table 6.18 is the natural way
to perform a sum, according to the finger-counting rule.
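To see the finger-counting rule at work, here is a minimal Python sketch of an interpreter for the while-program of Table 6.18; the function name and the dictionary encoding of registers are illustrative assumptions, not the book's formal machinery:

    def run_sum(a, b):
        # Registers R1 and R2 hold the two inputs (input 1, input 2).
        R = {1: a, 2: b}
        # while R2 > 0: move one unit at a time from R2 to R1,
        # exactly as in counting on one's fingers (dec 2, inc 1).
        while R[2] > 0:
            R[2] -= 1
            R[1] += 1
        # output 1: R1 now holds a + b.
        return R[1]

    print(run_sum(3, 4))   # 7

Each pass through the loop transfers one unit from the second register to the first, so the program terminates after exactly as many steps as the value of the second input.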
6.8 Information, Codes, and Entropy
In 1948 Claude Shannon published a booklet entitled "The mathematical theory of
communication" [212], a scientific milestone and the first systematic mathematical
investigation of information from a quantitative perspective.
The starting point of Shannon's approach is a probabilistic perspective based on the
concept of information source, which consists of a set $D$ of data together with a
function assigning a probability of emission $p_d$ to each element $d$ of $D$. The
(probabilistic) information measure of a datum $d$ (with respect to logarithm base
$b$) is the logarithm of $1/p_d$, that is, $-\log_b p_d$. The motivation for this
definition is the idea that the information conveyed by an event is proportional to
its rarity, and that the information conveyed by a pair of independent events is the
sum of the information conveyed by each of them. It is interesting to remark that
the same intuition was the basis of a method elaborated by the Arab cryptanalyst
Al-Kindi (ninth century A.D.) for deciphering codes based on mechanisms of letter
substitution. In fact, if in a text the letter "a" is replaced by the letter "p", then one
can guess this encoding by counting the frequency of "p" in a sufficiently long text
and comparing it with the expected frequency of "a" in the language to which the
encoding is applied.
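As a rough sketch of Al-Kindi's frequency-counting method (the function name and the use of Python's Counter are illustrative choices, not part of the historical account):

    from collections import Counter

    def letter_frequencies(text):
        # Count how often each letter occurs in a (sufficiently long) text;
        # in a substitution cipher, the counts of the ciphertext letters
        # mirror the well-known letter frequencies of the plaintext language.
        letters = [c for c in text.lower() if c.isalpha()]
        return Counter(letters).most_common()

    # If "a" has been replaced by "p", the frequency of "p" in the ciphertext
    # will be close to the expected frequency of "a" in the language.
    print(letter_frequencies("pnd p mpn pnd p plpn"))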
6.8.1 Shannon's Entropy
The crucial concept of Shannon's theory is the (information) entropy of an
(information) source, defined by:
$$-\sum_{d \in D} p_d \log_b p_d \qquad (6.1)$$
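In other words, the entropy is the average information measure of the data emitted by the source. As a check on formula (6.1), here is a minimal Python sketch (the function name and the example distributions are illustrative; taking $b = 2$ measures information in bits):

    import math

    def entropy(probs, b=2):
        # Shannon entropy: the average information measure of the source,
        # i.e. the sum over d of p_d * log_b(1 / p_d); terms with p_d = 0
        # are skipped by the usual convention.
        return sum(p * math.log(1 / p, b) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per emission
    print(entropy([0.9, 0.1]))   # biased coin: about 0.469 bits

The uniform distribution maximizes the entropy: a fair coin is as unpredictable as a two-outcome source can be, while a biased coin conveys less information per emission.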