Example A.4.1:
Suppose in a class of 10 students the grades on the first test were

10, 9, 8, 8, 7, 7, 7, 6, 6, 2
The average value is 70/10, or 7. Now let's use the frequency of occurrence approach to estimate
the probabilities of the various grades. (Notice in this case the random variable is an identity
mapping, i.e., X(ω) = ω.) The probability estimate of the various values the random variable
can take on is

P(10) = P(9) = P(2) = 0.1,  P(8) = P(6) = 0.2,  P(7) = 0.3,
P(5) = P(4) = P(3) = P(1) = P(0) = 0
The expected value is therefore given by

E[X] = (0)(0) + (0)(1) + (0.1)(2) + (0)(3) + (0)(4) + (0)(5) + (0.2)(6)
       + (0.3)(7) + (0.2)(8) + (0.1)(9) + (0.1)(10) = 7
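The arithmetic above is easy to check in code. The following is a minimal sketch (our own illustration, not from the text) that computes both the sample average and the frequency-based expected value for the grade data, using exact rational arithmetic:

```python
from collections import Counter
from fractions import Fraction

grades = [10, 9, 8, 8, 7, 7, 7, 6, 6, 2]
n = len(grades)

# Sample average: sum of the observations divided by their count.
sample_average = Fraction(sum(grades), n)

# Frequency-of-occurrence probability estimates: P(x) = count(x) / n.
probs = {x: Fraction(c, n) for x, c in Counter(grades).items()}

# Statistical average (expected value): sum of x * P(X = x).
expected_value = sum(x * p for x, p in probs.items())

print(sample_average)   # 7
print(expected_value)   # 7
```

Because the probabilities here are exactly the relative frequencies, the two averages agree, which is the point the example makes next.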
It seems that the expected value and the average value are exactly the same! But we have
made a rather major assumption about the accuracy of our probability estimate. In general the
relative frequency is not exactly the same as the probability, and the average and expected
values are different. To emphasize this difference and similarity, the expected value is
sometimes referred to as the statistical average, while our everyday average value is referred
to as the sample average.
We said at the beginning of this section that we are often interested in things such as signal
power. The average signal power is often defined as the average of the signal squared. If
we say that the random variable is the signal value, then this means that we have to find the
expected value of the square of the random variable. There are two ways of doing this. We
could define a new random variable Y = X², then find f_Y(y) and use (A.12) to find E[Y]. An
easier approach is to use the fundamental theorem of expectation, which is

E[g(X)] = Σ_i g(x_i) P(X = x_i)    (A.13)

for the discrete case, and

E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx    (A.14)

for the continuous case.
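As a sketch of the discrete case (A.13), the following fragment (our own illustration, using the grade data from Example A.4.1) estimates the average "power" E[X²] directly from the frequency-based probabilities, without constructing the distribution of Y = X²:

```python
from collections import Counter
from fractions import Fraction

grades = [10, 9, 8, 8, 7, 7, 7, 6, 6, 2]
n = len(grades)

# Frequency-of-occurrence probability estimates: P(x) = count(x) / n.
probs = {x: Fraction(c, n) for x, c in Counter(grades).items()}

# Fundamental theorem of expectation: E[g(X)] = sum over i of g(x_i) * P(X = x_i).
def expectation(g, probs):
    return sum(g(x) * p for x, p in probs.items())

mean_square = expectation(lambda x: x * x, probs)  # E[X^2]
print(mean_square)  # 266/5, i.e., 53.2
```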
The expected value, because of the way it is defined, is a linear operator. That is,
E[αX + βY] = αE[X] + βE[Y],    α and β constants
You are invited to verify this for yourself.
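One way to do this verification numerically (a sketch of our own: α = 3 and β = −2 are chosen arbitrarily, and Y is taken to be X² purely so that X and Y are two random variables defined on the same outcomes):

```python
from fractions import Fraction

grades = [10, 9, 8, 8, 7, 7, 7, 6, 6, 2]
pairs = [(x, x * x) for x in grades]  # (X, Y) samples on the same outcomes
n = len(pairs)

# Sample-average estimate of an expectation, in exact arithmetic.
def expect(values):
    return sum(Fraction(v) for v in values) / n

alpha, beta = 3, -2
lhs = expect([alpha * x + beta * y for x, y in pairs])          # E[aX + bY]
rhs = alpha * expect([x for x, _ in pairs]) \
    + beta * expect([y for _, y in pairs])                      # aE[X] + bE[Y]
assert lhs == rhs  # linearity holds exactly
```

Because expectation is just a weighted sum, the equality holds term by term, which is why linearity does not require X and Y to be independent.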
There are several functions g(·) whose expectations are used so often that they have been
given special names.