The variance of a random variable is useful because it is often easy to compute, yet it still yields sometimes strong bounds on the probability that a random variable deviates far from its expectation.
The value σ[X] = √Var[X] is called the standard deviation of X. In general, one may expect the value of a random variable X to be in the interval E[X] ± σ[X].
If X is a random variable, then

Var[aX + b] = a² · Var[X]

for every a, b ∈ ℝ. Similarly, if X₁, ..., Xₙ are pairwise statistically independent random variables over the same sample space, then

Var[X₁ + ... + Xₙ] = Var[X₁] + ... + Var[Xₙ].
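These two rules are easy to check empirically. The following Python sketch (an illustration added here, not part of the original text; the choice of Gaussian distributions and the sample size are arbitrary) estimates the variances by simulation:

```python
import math
import random

def var(samples):
    # Plain sample variance, sufficient for illustration.
    m = sum(samples) / len(samples)
    return sum((x - m) ** 2 for x in samples) / len(samples)

N = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]    # Var[X] = 1

print(math.sqrt(var(xs)))                          # standard deviation σ[X] ≈ 1

a, b = 3.0, 5.0
print(var([a * x + b for x in xs]))                # ≈ a² · Var[X] = 9

ys = [random.gauss(0.0, 2.0) for _ in range(N)]    # Var[Y] = 4, independent of X
print(var([x + y for x, y in zip(xs, ys)]))        # ≈ Var[X] + Var[Y] = 5
```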
For example, let X again be the random variable that counts the number of heads in a sequence of n independent coin flips (with E[X] = n/2). Computing the variance according to the definition given earlier is difficult. If, however, we view the random variable X as the sum X₁ + ... + Xₙ (where the Xᵢ are mutually independent random variables such that each Xᵢ takes the value 1 with probability 1/2 and the value 0 with probability 1/2), then Var[Xᵢ] = E[Xᵢ²] − E[Xᵢ]² = 1/2 − 1/4 = 1/4, and hence Var[X] = n · 1/4 = n/4.
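The same decomposition can be checked by simulation. The sketch below (again an added illustration; the parameters n and trials are arbitrary choices) estimates E[X] and Var[X] for the coin-flip variable:

```python
import random

n, trials = 100, 100_000

# Number of heads in n fair coin flips, repeated over many trials.
counts = [sum(random.randint(0, 1) for _ in range(n)) for _ in range(trials)]

mean = sum(counts) / trials
variance = sum((c - mean) ** 2 for c in counts) / trials

print(mean)      # ≈ n/2 = 50
print(variance)  # ≈ n/4 = 25
```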
4.2.8 Chebyshev's Inequality
Chebyshev's inequality, as specified in Theorem 4.3, can be used to provide an upper bound for the probability that a random variable X deviates from its expectation by more than a certain threshold value k ∈ ℝ⁺. To make use of Chebyshev's inequality, the variance of X must be known.
Theorem 4.3 (Chebyshev's inequality) If X is a random variable, then

Pr[|X − E[X]| ≥ k] ≤ Var[X] / k²

holds for every k ∈ ℝ⁺.
Proof. Applying Markov's inequality to the nonnegative random variable (X − E[X])² yields

Pr[|X − E[X]| ≥ k] = Pr[(X − E[X])² ≥ k²] ≤ E[(X − E[X])²] / k² = Var[X] / k²

for every k ∈ ℝ⁺.
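As a numerical illustration (an added sketch, not from the text), one can compare the bound Var[X]/k² with the empirically observed tail probability for the coin-flip variable above, here with the arbitrary choices n = 100 and k = 10:

```python
import random

n, k, trials = 100, 10, 100_000
expectation, variance = n / 2, n / 4    # E[X] = n/2, Var[X] = n/4

counts = [sum(random.randint(0, 1) for _ in range(n)) for _ in range(trials)]
tail = sum(abs(c - expectation) >= k for c in counts) / trials

print(variance / k ** 2)  # Chebyshev bound: 25/100 = 0.25
print(tail)               # observed tail probability, typically around 0.06
```

The observed probability lies well below the bound, reflecting the fact that Chebyshev's inequality uses only the variance of X and is therefore often rather loose.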