We have I(X; Y | Z) = 0 if and only if X and Y are statistically independent when given Z. Furthermore, the conditional mutual information between X and Y is also symmetric, meaning that I(X; Y | Z) = I(Y; X | Z).
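The symmetry claim can be made explicit by expanding the conditional mutual information in terms of conditional entropies. The following chain of equalities is a standard information-theoretic identity (stated here as background, not quoted from this text), and the middle expression is visibly symmetric in X and Y:

```latex
\begin{aligned}
I(X;Y \mid Z) &= H(X \mid Z) - H(X \mid YZ) \\
              &= H(X \mid Z) + H(Y \mid Z) - H(XY \mid Z) \\
              &= I(Y;X \mid Z).
\end{aligned}
```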
Figure 5.3  A Venn diagram graphically representing information-theoretic quantities related to two random variables.
Let X and Y be two random variables. Then the information-theoretic quantities H(X), H(Y), H(XY), H(X | Y), H(Y | X), and I(X; Y) can be graphically represented by a Venn diagram, as shown in Figure 5.3.
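The relationships that such a Venn diagram encodes can also be written out explicitly. These are the standard identities connecting the quantities above (given here as general information-theoretic facts, not transcribed from the figure):

```latex
\begin{aligned}
H(XY)  &= H(X) + H(Y \mid X) = H(Y) + H(X \mid Y), \\
I(X;Y) &= H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) \\
       &= H(X) + H(Y) - H(XY).
\end{aligned}
```

In the diagram, H(X) and H(Y) correspond to the two circles, I(X; Y) to their intersection, and H(X | Y) and H(Y | X) to the respective non-overlapping parts.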
5.3 REDUNDANCY
If L is a natural language with alphabet Σ, then one may be interested in the entropy per letter, denoted by H_L. In the case of the English language, Σ = {A, B, ..., Z} and |Σ| = 26. If every letter occurred with the same probability and was independent of the other letters, then the entropy per letter would be

log_2 26 ≈ 4.70.

This value represents the absolute rate of the language L and is an upper bound for H_L (i.e., H_L ≤ 4.70). The actual value of H_L, however, is smaller, because one must consider the fact that letters are typically not uniformly distributed, that they occur with frequencies that depend on the language, and that they are also not independent of each other. If X is a random variable that refers to the letters of the English language (with their specific probabilities), then H(X) is an upper bound for H_L:

H_L ≤ H(X).
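To make the two bounds concrete, here is a minimal sketch (not taken from the text) that computes the absolute rate log_2 26 and estimates H(X) from the empirical single-letter frequencies of an English sample. The function names and the sample string are illustrative assumptions; with a sufficiently large corpus the second estimate approaches the single-letter entropy of English, which, as stated above, lies strictly below the absolute rate of about 4.70 bits per letter.

```python
import math
from collections import Counter

def absolute_rate(alphabet_size: int = 26) -> float:
    """Entropy per letter if all letters were equally likely and independent."""
    return math.log2(alphabet_size)

def single_letter_entropy(text: str) -> float:
    """Estimate H(X) from the empirical letter frequencies of a sample text."""
    letters = [c for c in text.upper() if "A" <= c <= "Z"]
    counts = Counter(letters)
    n = len(letters)
    # H(X) = -sum_i p_i * log2(p_i) over the observed letter probabilities
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

if __name__ == "__main__":
    sample = "THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG"  # illustrative sample only
    print(f"absolute rate log2(26) = {absolute_rate():.2f} bits/letter")            # about 4.70
    print(f"estimated H(X) from sample = {single_letter_entropy(sample):.2f} bits/letter")
```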