2.2.3.1 The Mean Is an Unbiased Estimator of the Expectation Value
Assume that N measurements of a quantity of interest G have been performed,
under conditions that are assumed to be identical. The quantity of interest
is modeled as a random variable whose expectation value γ is unknown. The
result g i of measurement i can be considered as a realization of the random
variable G i . If the experiment has been soundly designed, it can reasonably
be assumed that the result of a given measurement is not affected by, and
does not affect, other measurements: then the random variables G i are mu-
tually independent, and, since the measurements were performed in identical
conditions, they have identical distributions, hence the same expectation γ .
Consider the random variable M = (G1 + G2 + ··· + GN)/N. Since the expectation of a sum of random variables is the sum of the expectations, one has E(M) = γ: the expectation value of the random variable M (the "mean") is equal to the expectation value of G; therefore the mean is an unbiased estimator of the expectation value. The quantity m = (g1 + g2 + ··· + gN)/N, which is a realization of the estimator of the expectation value of the random variable G, is an unbiased estimate of the latter.
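This unbiasedness can be illustrated numerically. The following sketch (not from the text) assumes a Gaussian measurement model with hypothetical values for the expectation γ, the noise level, and N; averaging many independent realizations of the estimator M recovers γ:

```python
import random

# Illustrative sketch, not from the text: model the N measurements g_i as
# realizations of i.i.d. Gaussian random variables of expectation gamma.
# gamma, sigma, and N below are hypothetical values chosen for the demo.
gamma = 20.0   # assumed "true" expectation value
sigma = 2.0    # assumed measurement noise standard deviation
N = 100        # measurements per experiment

random.seed(0)

def mean_estimate():
    """One realization m = (g1 + g2 + ... + gN) / N of the estimator M."""
    g = [random.gauss(gamma, sigma) for _ in range(N)]
    return sum(g) / N

# Averaging many independent realizations of M approaches E(M) = gamma,
# which is exactly what it means for the mean to be unbiased.
means = [mean_estimate() for _ in range(10_000)]
avg_of_means = sum(means) / len(means)
print(avg_of_means)  # close to gamma
```

Each individual realization m scatters around γ, but the realizations show no systematic offset in either direction.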
Consider again the example of the estimation of the temperature of a fluid; we have shown
- that the expectation value of the variable T that models the temperature is equal to the "true" temperature T0,
- that the mean is an unbiased estimator of the expectation value.
Therefore, if N temperature measurements are available, the mean of these measurements is an unbiased estimate of T0.
However, the fact that the estimate is unbiased does not tell us anything
about the accuracy of that result. If it is desirable, for instance, to know the
temperature with 10% accuracy, does the estimate comply with that require-
ment? Clearly, the answer depends on the quality of the measurements, i.e.,
on the scattering of the measurements around the true value T 0 . The concept
of variance is useful in that context.
2.2.4 Variance of a Random Variable
The variance of a random variable Y with distribution p_Y(y) is

var Y = σ² = ∫_{−∞}^{+∞} [y − E(Y)]² p_Y(y) dy.
Hence, the variance is the centered second moment of the distribution.
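The defining integral can be checked numerically. The sketch below (an assumed example, not from the text) takes p_Y to be a Gaussian density with hypothetical mean mu and standard deviation s, for which the centered second moment must come out as s²:

```python
import math

# Numerical check of var Y = integral of [y - E(Y)]^2 p_Y(y) dy, using an
# assumed Gaussian density for p_Y. mu and s are hypothetical demo values;
# for a Gaussian the variance must equal s**2.
mu, s = 1.0, 0.5

def p(y):
    """Gaussian density with mean mu and standard deviation s."""
    return math.exp(-((y - mu) ** 2) / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))

# Plain Riemann sum over a range wide enough to hold essentially all the mass.
dy = 1e-4
ys = [-10.0 + i * dy for i in range(int(20 / dy))]
expectation = sum(y * p(y) * dy for y in ys)                     # E(Y), close to mu
variance = sum((y - expectation) ** 2 * p(y) * dy for y in ys)   # var Y, close to s**2
print(variance)
```

The same computation with any other density p_Y(y) yields its centered second moment, i.e., its variance.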