H(X_1 \cdots X_n) = -\sum_{(x_1,\ldots,x_n)} P_{X_1 \cdots X_n}(x_1,\ldots,x_n) \log_2 P_{X_1 \cdots X_n}(x_1,\ldots,x_n)
In this equation, P_{X_1 ··· X_n} refers to the joint probability distribution of X_1, ..., X_n. Consequently, the joint entropy of X_1, ..., X_n equals the entropy of the joint probability distribution P_{X_1 ··· X_n}:
H(X_1 \cdots X_n) = H(P_{X_1 \cdots X_n})
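To make the definition concrete, the following Python sketch (our illustration, not from the book) computes the joint entropy of a joint distribution represented as a dictionary that maps outcome tuples to probabilities; the helper name joint_entropy is hypothetical:

from math import log2

def joint_entropy(p_joint):  # hypothetical helper, not from the book
    # H(X1 ... Xn) = -sum P(x1,...,xn) * log2 P(x1,...,xn),
    # summing only over outcomes with nonzero probability
    return -sum(p * log2(p) for p in p_joint.values() if p > 0)

# Example: two fair, independent bits yield H = 2 bits.
p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(p))  # 2.0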
There is a relation between the joint entropy of n random variables X_1, ..., X_n and their individual entropies. In fact, it can be shown that
H(X_1 \cdots X_n) \leq H(X_1) + \ldots + H(X_n)
with equality if and only if X 1 ,...,X n are mutually independent.
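This subadditivity can be checked numerically. The following sketch (again our illustration, with hypothetical helpers entropy and marginal) compares H(X_1 X_2) with H(X_1) + H(X_2) for a fully dependent pair, where the inequality is strict:

from math import log2

def entropy(p):  # entropy of a distribution given as a dict
    return -sum(q * log2(q) for q in p.values() if q > 0)

def marginal(p_joint, i):  # marginal distribution of coordinate i
    m = {}
    for outcome, q in p_joint.items():
        m[outcome[i]] = m.get(outcome[i], 0.0) + q
    return m

# Fully dependent pair (X2 = X1): joint entropy is strictly smaller.
# For two independent fair bits, the two values would coincide.
p_dep = {(0, 0): 0.5, (1, 1): 0.5}
h_joint = entropy(p_dep)
h_sum = entropy(marginal(p_dep, 0)) + entropy(marginal(p_dep, 1))
print(h_joint, h_sum)  # 1.0 2.0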
5.2.2 Conditional Entropy
Equation (5.1) also covers the case where the probability distribution is conditioned on an event A with Pr[A] > 0. Consequently,

H(X \mid A) = H(P_{X|A}) = -\sum_{x \in \mathcal{X}: P_{X|A}(x) \neq 0} P_{X|A}(x) \log_2 P_{X|A}(x)
Remember from Section 4.2.3 that P_{X|A} is a regular probability distribution.
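As an illustration (not from the book), the following sketch computes H(X | A) by renormalizing P_X on the event A, which is exactly how P_{X|A} is defined; the function name is hypothetical:

from math import log2

def entropy_given_event(p_x, event):  # hypothetical helper
    # P_{X|A}(x) = P_X(x) / Pr[A] for x in A, and 0 otherwise
    pr_a = sum(p_x[x] for x in event)
    assert pr_a > 0  # conditioning requires Pr[A] > 0
    return -sum((p_x[x] / pr_a) * log2(p_x[x] / pr_a)
                for x in event if p_x[x] > 0)

# Example: X uniform on {0, 1, 2, 3}; A = "X is even".
p_x = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
print(entropy_given_event(p_x, {0, 2}))  # 1.0 bit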
Let X and Y be two random variables. If we know the event Y = y, then we can replace A with Y = y and rewrite the formula given above:

H(X \mid Y = y) = H(P_{X|Y=y}) = -\sum_{x \in \mathcal{X}: P_{X|Y=y}(x) \neq 0} P_{X|Y=y}(x) \log_2 P_{X|Y=y}(x)
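Analogously, H(X | Y = y) can be computed from a joint distribution P_{XY} by conditioning on the event Y = y. A minimal sketch, assuming the same dictionary representation as before (the helper name is again hypothetical):

from math import log2

def entropy_x_given_y(p_xy, y):  # hypothetical helper
    # condition the joint distribution on the event Y = y
    pr_y = sum(q for (_, y2), q in p_xy.items() if y2 == y)
    assert pr_y > 0  # requires Pr[Y = y] > 0
    p_cond = [q / pr_y for (_, y2), q in p_xy.items() if y2 == y and q > 0]
    return -sum(q * log2(q) for q in p_cond)

# Example: X a fair bit, Y = X flipped with probability 1/4.
p_xy = {(0, 0): 0.375, (0, 1): 0.125, (1, 0): 0.125, (1, 1): 0.375}
print(entropy_x_given_y(p_xy, 0))  # ~0.811 bits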