In the case of our “counting heads in a coin-flip” experiment, the expectation is
E[X] = p(hh)X(hh) + p(ht)X(ht) + p(th)X(th) + p(tt)X(tt),   (30.7)
     = (1/4) · 2 + (1/4) · 1 + (1/4) · 1 + (1/4) · 0,   (30.8)
     = 1.   (30.9)
If we in fact run the coin-flipping program many times, we'll sometimes see a
heads count of 0; sometimes 1; and sometimes 2. The average number of heads,
over many executions of the program, will be about 1.
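This behavior is easy to check empirically. The sketch below is a hypothetical stand-in for the coin-flipping program (not code from the text); it estimates the average head count over many runs:

```python
import random

def count_heads(num_flips=2):
    # Flip a fair coin num_flips times; count how many come up heads.
    return sum(random.random() < 0.5 for _ in range(num_flips))

trials = 100_000
average = sum(count_heads() for _ in range(trials)) / trials
print(average)  # close to the expectation E[X] = 1
```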
We can rewrite the expectation in Equation 30.7 by asking, for each possible
value taken on by X (i.e., 0, 1, and 2), what fraction of the items in S correspond
to that value. For the value 1, we have X ( ht )= 1 and X ( th )= 1, so two of the
four items in S , in other words, half, correspond to the value 1. We then sum these
fractions times the associated values, and thus compute
E[X] = Σ_{r=0,1,2} r · Pr{X = r},   (30.10)
     = 0 · (1/4) + 1 · (1/2) + 2 · (1/4) = 1.   (30.11)
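In this form, the expectation is a one-line computation over the values of X and their probabilities; as a sketch (the dictionary below simply records Pr{X = r} for the coin-flip example):

```python
# Pr{X = r} for r = 0, 1, 2 (number of heads in two fair flips).
pmf = {0: 1/4, 1: 1/2, 2: 1/4}

# E[X] = sum over r of r · Pr{X = r}, as in Equation 30.10.
expectation = sum(r * p for r, p in pmf.items())
print(expectation)  # 1.0
```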
This latter form is used in many applications of probability, because it depends
only on the values of X and the associated probabilities, and the probability space
S does not appear directly. In our applications, however, this form will not appear
again.
Nonetheless, the function r ↦ Pr{X = r}, and its generalization in the continuum case, is of substantial interest to us. Note that it's defined on the codomain
of X , not the domain. And for each codomain value r , it tells us (if we're thinking
of X informally as “producing a random output”) the probability that X will pro-
duce the output r . This function is called the probability mass function (or pmf)
for the random variable X, and is denoted p_X.
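As a concrete sketch (the dictionaries below encode the coin-flip space and the head-count variable X from the text; the helper name is ours), p_X can be computed by accumulating, for each value r, the probabilities of the outcomes that X maps to r:

```python
from collections import defaultdict

# Probability space S for two fair flips, and the head-count variable X.
p = {"hh": 1/4, "ht": 1/4, "th": 1/4, "tt": 1/4}
X = {"hh": 2, "ht": 1, "th": 1, "tt": 0}

def pmf_of(X, p):
    # p_X(r) = Pr{X = r}: sum the probability of every outcome s
    # with X(s) = r.
    pX = defaultdict(float)
    for s, prob in p.items():
        pX[X[s]] += prob
    return dict(pX)

pX = pmf_of(X, p)
print(pX)  # p_X(0) = 1/4, p_X(1) = 1/2, p_X(2) = 1/4
```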
One reason the pmf is interesting is that it can be generalized to apply to a
function Y : S → T, sending our probability space to any finite set T, not just the
real numbers. For such a function, we define
p_Y(t) = Pr{Y = t}   for t ∈ T.   (30.12)
This function, p_Y, is a probability mass function on T, so that (T, p_Y) becomes a probability space.
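For instance (a hypothetical Y, not one from the text), let Y label each two-flip outcome by whether the flips match; the same accumulation then gives p_Y on T = {"same", "different"}:

```python
from collections import defaultdict

p = {"hh": 1/4, "ht": 1/4, "th": 1/4, "tt": 1/4}
# Y sends each outcome to a label in T = {"same", "different"}.
Y = {"hh": "same", "ht": "different", "th": "different", "tt": "same"}

pY = defaultdict(float)
for s, prob in p.items():
    pY[Y[s]] += prob  # p_Y(t) = Pr{Y = t}, as in Equation 30.12

# p_Y is non-negative and its values sum to 1, so (T, p_Y) is itself
# a probability space.
print(dict(pY))
```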
Inline Exercise 30.3: (a) How do we know that p_Y is a probability mass function, that is, that it satisfies the two requirements of non-negativity and normality? (b) If E ⊂ T is an event, show that Equation 30.12 implies that the probability of the event E in the space (T, p_Y) is given by Pr_T{E} = Pr_S{Y⁻¹(E)}. (Here Y⁻¹(E) denotes the set of all points s ∈ S such that Y(s) ∈ E.)
We've now used the term “probability mass function” in two different ways: When we spoke of a probability space (S, p), we called p the probability mass function. But we've also described the pmf for a random variable X : S → R, or even for a function Y : S → T into an arbitrary set. We'll now show that