Fig. 2.2 A possible measure of uncertainty where the unit is probability:

Uncertainty = Probability

or

Uncertainty = −log2(Probability)
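The two candidate measures above can be sketched in a few lines of Python (the function names here are my own, chosen for illustration):

```python
import math

def uncertainty_linear(p: float) -> float:
    # Candidate measure: take the uncertainty to be the probability itself.
    return p

def uncertainty_log(p: float) -> float:
    # Logarithmic measure: uncertainty (in bits) is -log2 of the probability.
    return -math.log2(p)

# For an equally likely binary choice (p = 0.5):
print(uncertainty_linear(0.5))  # 0.5
print(uncertainty_log(0.5))     # 1.0 bit
```

The logarithmic form is the one developed in the rest of the section, because (as shown below) it combines additively when a choice is broken into stages.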
On the other hand, if the digits were not equally probable, as in the string:
22222212222222422222232
then we would have a good chance of guessing that the next digit would be 2. In the extreme case, if the digit were always 2 and the system were noise-free, our chance of guessing correctly would be certain and no further information would be obtainable; that is, the information provided by each message is zero.
So we can say, by extension, that there is more information in a string of symbols where every symbol is equally probable than in a string where the symbol probabilities differ.
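This can be checked numerically by computing the average information per symbol (the entropy, H = −Σ pᵢ·log2(pᵢ)) of the skewed string above against a string where each digit is equally likely. A minimal sketch, with a helper function named by me:

```python
import math
from collections import Counter

def entropy_bits(s: str) -> float:
    # Average information per symbol: H = -sum(p_i * log2(p_i)),
    # using the observed symbol frequencies in the string.
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

skewed = "22222212222222422222232"  # mostly 2s: the next digit is easy to guess
uniform = "0123456789"              # each of the ten digits equally likely

print(entropy_bits(skewed))   # well under 1 bit per symbol
print(entropy_bits(uniform))  # log2(10), about 3.32 bits per symbol
```

The skewed string carries far less information per symbol, matching the argument above.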
If we imagine the message is coded so that the significant characteristic of the number (the symbol) is whether it is even, odd or prime, then our choice is reduced to only three symbols. In this case we would have a better chance of guessing the next symbol (even, odd or prime) than of guessing one of the ten digits.
So there is more information (uncertainty) in a string of symbols where the number of symbols to choose from is larger.
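With n equally likely symbols, the information per symbol under the logarithmic measure is log2(n), so the comparison above reduces to log2(3) versus log2(10). A quick sketch (function name is mine):

```python
import math

def bits_per_symbol(n: int) -> float:
    # With n equally likely symbols, each symbol carries log2(n) bits.
    return math.log2(n)

print(bits_per_symbol(10))  # ten digits: about 3.32 bits per symbol
print(bits_per_symbol(3))   # three classes (even / odd / prime): about 1.58 bits
```

The smaller alphabet carries less information per symbol, which is why guessing becomes easier.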
Finally, the measure of information should have additive properties with a consistent interpretation. So if there were a 1/2 chance that the number is odd, and then a 1/3 chance that the odd number is prime (say), the probabilities should combine so that there is a 1/6 chance, at the start, of guessing it to be prime.
Thus, if the choice to be made is broken down into two or more successive choices such that the final outcome has the same uncertainty, then the information should be the same.
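The logarithmic measure satisfies exactly this requirement: probabilities multiply across stages, so their logarithms add. Using the 1/2 and 1/3 figures from the example above:

```python
import math

p_odd = 1 / 2               # chance the number is odd
p_prime_given_odd = 1 / 3   # chance the odd number is prime (say)

# Information gathered in two stages...
info_staged = -math.log2(p_odd) + -math.log2(p_prime_given_odd)
# ...equals the information of the combined 1/6-probability event.
info_direct = -math.log2(p_odd * p_prime_given_odd)

print(math.isclose(info_staged, info_direct))  # True
```

This additivity is the property that singles out the −log2 form over the plain-probability measure shown in Fig. 2.2.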