There are two notions of independence for more than two random variables.
Roughly speaking, pairwise independence requires two arbitrarily chosen random
variables to be independent (see Definition 4.4), whereas mutual independence
requires all random variables to be independent (see Definition 4.5). In either case,
let $X_1, \ldots, X_n$ be $n$ random variables over the same sample space $\Omega$.
Definition 4.4 (Pairwise independent random variables) $X_1, \ldots, X_n$ are pairwise independent if for every $i, j \in \{1, 2, \ldots, n\}$ with $i \neq j$, it holds that the two random variables $X_i$ and $X_j$ are independent (i.e., $P_{X_i X_j}(x_i, x_j) = P_{X_i}(x_i) \cdot P_{X_j}(x_j)$).
Definition 4.5 (Mutually independent random variables) $X_1, \ldots, X_n$ are mutually independent if for every subset of indices $I = \{I_1, \ldots, I_m\} \subseteq \{1, 2, \ldots, n\}$ with $I \neq \emptyset$, it holds that

$$P_{X_{I_1} \ldots X_{I_m}}(x_{I_1}, \ldots, x_{I_m}) = P_{X_{I_1}}(x_{I_1}) \cdot \ldots \cdot P_{X_{I_m}}(x_{I_m}) = \prod_{i=1}^{m} P_{X_{I_i}}(x_{I_i}).$$
Note that the notion of mutual independence is stronger than the notion of pairwise independence. In fact, a collection of random variables that is mutually independent is also pairwise independent, whereas the converse need not be true (i.e., a collection of random variables can be pairwise independent without being mutually independent). For example, consider the situation in which two fair coins are tossed. The random variable $X$ refers to the result of the first coin, the random variable $Y$ refers to the result of the second coin, and the random variable $Z$ refers to the sum modulo 2 of the results of the two coins. Obviously, all three random variables take values in $\{0, 1\}$. Then $X$, $Y$, and $Z$ are pairwise independent, but they are not mutually independent (because the value of $Z$ is entirely determined by the values of $X$ and $Y$).
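This example is small enough to verify by exhaustive enumeration. The following Python sketch (the names `outcomes`, `joint`, and `independent` are ours, chosen for illustration) computes the joint and marginal distributions over the four equally likely outcomes and confirms that every pair among $X$, $Y$, and $Z$ is independent while the triple is not:

```python
from itertools import product
from fractions import Fraction

# Sample space: two fair coin tosses; each of the 4 outcomes has probability 1/4.
outcomes = list(product((0, 1), repeat=2))
p = Fraction(1, 4)

# X: result of the first coin, Y: result of the second coin,
# Z: sum modulo 2 of both results.
X = lambda w: w[0]
Y = lambda w: w[1]
Z = lambda w: (w[0] + w[1]) % 2

def joint(rvs):
    """Joint pmf of a tuple of random variables over the sample space."""
    d = {}
    for w in outcomes:
        key = tuple(rv(w) for rv in rvs)
        d[key] = d.get(key, Fraction(0)) + p
    return d

def independent(rvs):
    """True iff the joint pmf factors into the product of the marginals."""
    marginals = [joint([rv]) for rv in rvs]
    j = joint(rvs)
    for vals in product((0, 1), repeat=len(rvs)):
        prod_marg = Fraction(1)
        for marginal, v in zip(marginals, vals):
            prod_marg *= marginal.get((v,), Fraction(0))
        if j.get(vals, Fraction(0)) != prod_marg:
            return False
    return True

print(independent([X, Y]))     # True
print(independent([X, Z]))     # True
print(independent([Y, Z]))     # True
print(independent([X, Y, Z]))  # False: Z is determined by X and Y
```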
Similar to the case with two random variables, one can show that if $n$ random variables $X_1, X_2, \ldots, X_n$ are mutually independent, then

$$E[X_1 \cdot X_2 \cdot \ldots \cdot X_n] = E[X_1] \cdot E[X_2] \cdot \ldots \cdot E[X_n].$$
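As a quick numerical check, the following sketch takes three fair dice as a stand-in example of mutually independent random variables and evaluates both sides of this equation by exhaustive enumeration over the joint distribution:

```python
from itertools import product
from fractions import Fraction

# Three mutually independent fair dice (an assumed stand-in example).
faces = range(1, 7)
p = Fraction(1, 6) ** 3  # probability of each point in the product space

# Left-hand side: E[X1 * X2 * X3] computed over the joint distribution.
lhs = sum(p * x1 * x2 * x3 for x1, x2, x3 in product(faces, repeat=3))

# Right-hand side: E[X1] * E[X2] * E[X3]; each die has expectation 7/2.
e = sum(Fraction(1, 6) * x for x in faces)
rhs = e ** 3

print(lhs, rhs, lhs == rhs)  # 343/8 343/8 True
```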
4.2.6 Markov's Inequality

Markov's inequality, as specified in Theorem 4.2, relates the expectation of a random variable $X$ to the probability that its value is at least a certain threshold value $k$. We state the theorem without a proof.
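In its standard form, Markov's inequality states that for a nonnegative random variable $X$ and every $k > 0$, $P[X \geq k] \leq E[X]/k$. The following sketch (using a fair die as an assumed stand-in distribution) compares the exact tail probabilities with this bound:

```python
from fractions import Fraction

# A fair die as a stand-in nonnegative random variable; E[X] = 7/2.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
ex = sum(prob * x for x, prob in pmf.items())

# Exact tail probability P[X >= k] versus the Markov bound E[X] / k.
for k in range(1, 7):
    tail = sum(prob for x, prob in pmf.items() if x >= k)
    bound = ex / k
    assert tail <= bound  # the bound holds for every threshold
    print(f"k={k}: P[X >= k] = {tail} <= E[X]/k = {bound}")
```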