Let S be a finite set with an implicit distribution on it (usually the uniform distribution).
In an algorithm we write s ← S to mean that s is randomly selected from S according to the distribution, i.e. s is chosen with probability Pr(s). If A, E ⊆ S and Pr(E) > 0 then the conditional probability is

Pr(A | E) = Pr(A ∩ E) / Pr(E).

If Pr(A ∩ E) = Pr(A)Pr(E) then A and E are independent events (equivalently, if Pr(E) > 0 then Pr(A | E) = Pr(A)). If S is the disjoint union E_1 ∪ E_2 ∪ ··· ∪ E_n then Pr(A) = Σ_{i=1}^{n} Pr(A | E_i) Pr(E_i).
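As a concrete illustration of these definitions (the die and the events are my own hypothetical example, not from the text), one can compute conditional probabilities and check the law of total probability exactly with rational arithmetic:

```python
from fractions import Fraction

# A fair six-sided die: S with the uniform distribution, Pr(s) = 1/6.
S = {1, 2, 3, 4, 5, 6}
Pr = {s: Fraction(1, 6) for s in S}

def prob(event):
    """Pr(E) for an event E ⊆ S."""
    return sum(Pr[s] for s in event)

def cond_prob(A, E):
    """Pr(A | E) = Pr(A ∩ E) / Pr(E); requires Pr(E) > 0."""
    return prob(A & E) / prob(E)

A = {2, 4, 6}   # "the roll is even"
E = {4, 5, 6}   # "the roll is at least 4"
print(cond_prob(A, E))   # Pr(A | E) = 2/3

# Law of total probability over the disjoint union S = E_1 ∪ E_2 ∪ E_3.
partition = [{1, 2}, {3, 4}, {5, 6}]
total = sum(cond_prob(A, Ei) * prob(Ei) for Ei in partition)
assert total == prob(A)   # both equal 1/2
```

Using Fraction keeps every probability exact, so the identity Pr(A) = Σ_i Pr(A | E_i) Pr(E_i) can be checked with equality rather than floating-point tolerance.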
Let S be a set. A random variable is a function⁴ X : S → ℝ. Write 𝒳 ⊆ ℝ for the image of X (our applications will always have 𝒳 either finite or ℕ). Then X induces a distribution on 𝒳, defined for x ∈ 𝒳 by setting Pr(X = x) to be the measure of X^{−1}({x}) (in the case where S is finite or countable, Pr(X = x) = Σ_{s ∈ X^{−1}(x)} Pr(s)). Random variables X_1 and X_2 are independent random variables if Pr(X_1 = x_1 and X_2 = x_2) = Pr(X_1 = x_1) Pr(X_2 = x_2) for all x_1 ∈ 𝒳_1 and x_2 ∈ 𝒳_2.
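A small sketch of the induced distribution (the two-dice example is a hypothetical illustration, not from the text): take S to be ordered pairs of fair die rolls with the uniform distribution, and X(s) the sum of the pair. The induced Pr(X = x) is obtained by summing Pr(s) over the fibre X^{−1}(x), and the two coordinate projections are independent random variables:

```python
from fractions import Fraction
from collections import defaultdict
from itertools import product

# S = ordered pairs of two fair die rolls, uniform distribution.
S = list(product(range(1, 7), repeat=2))
Pr = {s: Fraction(1, 36) for s in S}
X = lambda s: s[0] + s[1]

# Induced distribution on the image of X: Pr(X = x) = Σ_{s ∈ X⁻¹(x)} Pr(s).
induced = defaultdict(Fraction)
for s in S:
    induced[X(s)] += Pr[s]

print(induced[7])                  # 1/6, the most likely sum
assert sum(induced.values()) == 1  # a genuine distribution on the image

# The coordinate projections X_1(s) = s[0], X_2(s) = s[1] are independent:
# Pr(X_1 = a and X_2 = b) = Pr(X_1 = a) Pr(X_2 = b) for all a, b.
marg1 = defaultdict(Fraction)
marg2 = defaultdict(Fraction)
for s in S:
    marg1[s[0]] += Pr[s]
    marg2[s[1]] += Pr[s]
for a, b in product(range(1, 7), repeat=2):
    assert Pr[(a, b)] == marg1[a] * marg2[b]
```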
The expectation of a random variable X taking values in a finite or countable set 𝒳 ⊆ ℝ_{>0} is

E(X) = Σ_{x ∈ 𝒳} x Pr(X = x).

If 𝒳 = ℕ then E(X) = Σ_{n=0}^{∞} Pr(X > n) (this is shown in the proof of Theorem 14.1.1). Note that if 𝒳 is finite then E(X) exists, but for 𝒳 countable the expectation only exists if the sum is convergent. If X_1 and X_2 are random variables on S then E(X_1 + X_2) = E(X_1) + E(X_2). If X_1 and X_2 are independent then E(X_1 X_2) = E(X_1) E(X_2).

Example A.14.1 Consider flipping a coin, with probability p of "heads" and probability 1 − p of "tails" (where 0 < p < 1). Assume the coin flips are independent events. What is the expected number of trials until the coin lands "heads"?

Let X be the random variable with values in ℕ where Pr(X = n) is the probability that the first head is on the n-th throw. Then Pr(X > n) = (1 − p)^n and Pr(X = n) = (1 − p)^{n−1} p. This gives the geometric distribution on ℕ. One can check that Σ_{n=1}^{∞} Pr(X = n) = 1.
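The identities for the geometric distribution can be checked numerically with exact arithmetic (the particular value p = 1/3 below is my own choice for illustration):

```python
from fractions import Fraction

# Geometric distribution with a sample parameter p = 1/3:
# Pr(X = n) = (1-p)^(n-1) p  and  Pr(X > n) = (1-p)^n.
p = Fraction(1, 3)
tail = lambda n: (1 - p) ** n             # Pr(X > n)
pmf = lambda n: (1 - p) ** (n - 1) * p    # Pr(X = n)

# Consistency of tail and pmf: Pr(X > n-1) - Pr(X > n) = Pr(X = n).
for n in range(1, 50):
    assert tail(n - 1) - tail(n) == pmf(n)

# Partial sums of Pr(X = n) tend to 1: Σ_{n=1}^{N} Pr(X = n) = 1 - (1-p)^N.
N = 50
assert sum(pmf(n) for n in range(1, N + 1)) == 1 - tail(N)
```

Since the partial sum equals 1 − (1 − p)^N and (1 − p)^N → 0, this confirms Σ_{n=1}^{∞} Pr(X = n) = 1.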
The expectation of X is E(X) = Σ_{n=1}^{∞} n Pr(X = n) (the ratio test shows that this sum is absolutely convergent). Write T = Σ_{n=1}^{∞} n(1 − p)^{n−1}. Then

E(X) = pT = T − (1 − p)T = Σ_{n=1}^{∞} n(1 − p)^{n−1} − Σ_{n=2}^{∞} (n − 1)(1 − p)^{n−1} = 1 + Σ_{n=2}^{∞} (1 − p)^{n−1} = 1/p.
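The conclusion E(X) = 1/p can also be checked empirically by simulation (a Monte Carlo sketch; the parameter p = 0.25, seed, and trial count are my own choices, not from the text):

```python
import random

def first_head(p, rng):
    """Flip a coin with Pr(heads) = p until the first head; return the flip count."""
    n = 1
    while rng.random() >= p:   # probability 1 - p of "tails"
        n += 1
    return n

p = 0.25
rng = random.Random(0)         # fixed seed for reproducibility
trials = 200_000
mean = sum(first_head(p, rng) for _ in range(trials)) / trials
print(mean)                    # close to E(X) = 1/p = 4
assert abs(mean - 1 / p) < 0.05
```

The sample mean over many independent runs converges to the expectation (law of large numbers), so it should land close to 1/p = 4 here.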
To define this problem formally one should define the geometric random variable X : S → ℕ, where S is the (uncountable) set of countable length sequences of bits, such that X(s_1 s_2 ...) > n if and only if s_1 = ··· = s_n = "tails". This leads to measure-theoretic technicalities that are beyond the scope of this book, but which are well-understood in probability theory.
⁴ Technically, a random variable is defined on a probability space, not an arbitrary set, and is a measurable function; we refer to Woodroofe [569] for the details.
 