If $p = \sum_{i=1}^{k} p_i$ and $q = \sum_{i=1}^{l} q_i$, then the following equation holds and can be used:

$$
H([p_1,\dots,p_k,q_1,\dots,q_l]) = H([p,q]) + p \, H\!\left(\left[\frac{p_1}{p},\dots,\frac{p_k}{p}\right]\right) + q \, H\!\left(\left[\frac{q_1}{q},\dots,\frac{q_l}{q}\right]\right)
$$
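For concreteness, the following Python sketch evaluates both sides of this grouping equation numerically; the five-outcome distribution and the helper function H are illustrative assumptions rather than anything taken from the text.

```python
# Minimal numerical check of the grouping property of entropy.
# The distribution below is chosen arbitrarily for illustration.
from math import log2

def H(probs):
    """Entropy (in bits) of a probability distribution given as a list."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Split a distribution over five outcomes into two groups.
ps = [0.10, 0.20, 0.10]    # p_1, ..., p_k with p = 0.4
qs = [0.35, 0.25]          # q_1, ..., q_l with q = 0.6
p, q = sum(ps), sum(qs)

lhs = H(ps + qs)
rhs = H([p, q]) + p * H([x / p for x in ps]) + q * H([x / q for x in qs])

print(lhs, rhs)            # both print ~2.1589 bits
```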
We now turn to the problem of characterizing the uncertainty associated with
more than one random variable (associated with the same discrete probability space
or random experiment). This is where the notion of a joint entropy comes into play.
5.2.1 Joint Entropy
First of all, it is important to note that a vector of random variables (associated
with the same discrete probability space or random experiment) can always be
viewed as a single random variable. If, for example, we have two random variables
X and Y with n and m possible outcomes, then X and Y have joint probability
$P_{XY}(x_i, y_j) = \Pr[X = x_i, Y = y_j] = p(x_i, y_j) = p_{ij}$ for $i = 1,\dots,n$ and $j = 1,\dots,m$. The resulting experiment has a total of $nm$ possible outcomes, and the outcome $(X = x_i, Y = y_j)$ has probability $p_{ij} = p(x_i, y_j)$.
Against this background, the joint entropy (or joint uncertainty) of $X$ and $Y$, denoted as $H(XY)$, is defined as follows:

$$
H(XY) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(x_i, y_j)
$$
More formally, $H(XY)$ can be expressed as follows:

$$
H(XY) = -\sum_{(x,y)} P_{XY}(x,y) \log_2 P_{XY}(x,y) \tag{5.2}
$$
On the right side of (5.2), the index of the sum goes through all possible pairs $(x, y)$ with $x \in \mathcal{X}$ and $y \in \mathcal{Y}$, or, equivalently, all $(x_i, y_j)$ for $i = 1,\dots,n$ and $j = 1,\dots,m$.
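As an illustration of (5.2), the following Python sketch computes $H(XY)$ for a small joint distribution given as a 2 × 3 table; the table P_XY and its values are assumptions chosen only for this example.

```python
# Sketch of equation (5.2): joint entropy of two random variables X and Y.
# The 2x3 joint distribution below is illustrative (its entries sum to 1).
from math import log2

P_XY = [
    [0.10, 0.20, 0.10],    # Pr[X = x_1, Y = y_1], ..., Pr[X = x_1, Y = y_3]
    [0.30, 0.20, 0.10],    # Pr[X = x_2, Y = y_1], ..., Pr[X = x_2, Y = y_3]
]

# H(XY) = - sum over all pairs (x, y) of P_XY(x, y) * log2 P_XY(x, y)
H_XY = -sum(p * log2(p) for row in P_XY for p in row if p > 0)

print(round(H_XY, 4))      # joint uncertainty in bits, ~2.4464
```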
Equation (5.2) can be generalized to the joint entropy of more than two random variables. In fact, the joint entropy of $n$ random variables $X_1, X_2, \dots, X_n$ can be expressed as follows:

$$
H(X_1 X_2 \cdots X_n) = -\sum_{(x_1,\dots,x_n)} P_{X_1 X_2 \cdots X_n}(x_1,\dots,x_n) \log_2 P_{X_1 X_2 \cdots X_n}(x_1,\dots,x_n)
$$
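Along the same lines, a joint distribution over several random variables can be handled by summing over all outcome tuples. The dictionary P_X1X2X3 in the following sketch is a hypothetical three-variable example, used only to show the computation.

```python
# Sketch of the n-variable generalization: the joint distribution is stored
# as a dict mapping outcome tuples (x_1, ..., x_n) to their probabilities.
from math import log2

P_X1X2X3 = {                       # illustrative 3-variable joint distribution
    (0, 0, 0): 0.125, (0, 0, 1): 0.125,
    (0, 1, 0): 0.125, (0, 1, 1): 0.125,
    (1, 0, 0): 0.25,  (1, 1, 1): 0.25,
}

H_joint = -sum(p * log2(p) for p in P_X1X2X3.values() if p > 0)
print(H_joint)                     # 2.5 bits
```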