Global Positioning System Reference
In-Depth Information
relatively easily and generally in software. One does not have to identify independent
loop closures, etc.
Figure 4.2 indicates some of the outcomes of the adjustment. Statistical tests are
available to validate the adjustment or to aid in discovering and removing blunders.
The adjustment provides probability regions for the estimated parameters and allows
variance-covariance propagation to determine functions of the estimated parameters
and their standard deviations. Of particular interest is the ability of the least-squares
adjustment to perform internal and external reliability analysis, in order to quantify
marginally detectable blunders and to determine their potential influence on the
estimated parameters.
Statistical concepts enter the least-squares adjustment in two distinct ways. The
actual least-squares solution merely requires the existence of the variance-covariance
matrix; there is no need to specify a particular distribution for the observations. If
statistical tests are required, then the distribution of the observations must be known.
In most cases, one indeed desires to carry out some statistical testing.
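The adjustment and the variance-covariance information it yields can be sketched in a few lines of linear algebra. The following example is not from the text; the design matrix, observations, and weights are invented purely for illustration of the standard formulas x̂ = (AᵀPA)⁻¹AᵀPl and Σ = s₀²(AᵀPA)⁻¹:

```python
# Minimal sketch of a weighted linear least-squares adjustment.
# A, l, and P are hypothetical values chosen only for illustration.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, -1.0]])          # design matrix: 3 observations, 2 parameters
l = np.array([10.02, 4.99, 5.00])    # observation vector
P = np.diag([1.0, 1.0, 0.5])         # weight matrix

N = A.T @ P @ A                      # normal matrix
x = np.linalg.solve(N, A.T @ P @ l)  # estimated parameters
v = A @ x - l                        # residuals
dof = A.shape[0] - A.shape[1]        # redundancy (degrees of freedom)
s0_sq = (v @ P @ v) / dof            # a posteriori variance of unit weight
Sigma_x = s0_sq * np.linalg.inv(N)   # variance-covariance of the estimates

print(x)
print(Sigma_x)
```

The variance-covariance matrix Σₓ is exactly the quantity the statistical tests mentioned above operate on; the residuals v feed the blunder-detection tests.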
4.3 VARIANCE-COVARIANCE PROPAGATION
The purpose of variance-covariance propagation is to compute the variances and
covariances of linear functions of random variables. Nonlinear functions must first be
linearized. Variance-covariance propagation is applicable to single random variables
or to vectors of random variables.
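For a linearized function y = f(x), the propagation law takes the form Σᵧ = J Σₓ Jᵀ, where J is the Jacobian of f. As a sketch (with invented numbers, converting an observed distance and direction into coordinate differences):

```python
# Variance-covariance propagation through a linearized nonlinear function.
# The observed values and their variances are hypothetical.
import numpy as np

s = 100.0                         # observed distance
t = np.radians(30.0)              # observed direction
Sigma_x = np.diag([0.01**2,       # variance of s (0.01 distance units)
                   np.radians(0.001)**2])  # variance of t

# Model: dx = s*cos(t), dy = s*sin(t); linearize via the Jacobian
J = np.array([[np.cos(t), -s * np.sin(t)],
              [np.sin(t),  s * np.cos(t)]])

Sigma_y = J @ Sigma_x @ J.T       # propagated var-cov of (dx, dy)
print(Sigma_y)
```

The off-diagonal terms of Σᵧ show that the derived coordinates are correlated even though the original observations were assumed uncorrelated.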
Probability Density and Accumulative Probability   For f(x) to be a probability
function of the random variable x̃, it has to fulfill certain conditions. First,
f(x) must be a nonnegative function, because there is always an outcome of an
experiment; i.e., the observation can be positive, negative, or even zero. Second,
the probability that a sample (observation) is one of all possible outcomes should
be 1. Thus the density function f(x) must fulfill the following conditions:

f(x) ≥ 0        (4.8)

∫_{−∞}^{∞} f(x) dx = 1        (4.9)
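Both conditions can be verified numerically for a concrete density. The standard normal density is used here only as a familiar example; it is not singled out by the text at this point:

```python
# Numerical check of conditions (4.8) and (4.9) for the standard normal density.
import math

def f(x):
    # standard normal probability density
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

# (4.8): f(x) >= 0 on a sample grid
assert all(f(i * 0.01) >= 0.0 for i in range(-1000, 1001))

# (4.9): trapezoidal integration over [-10, 10] approximates the full integral,
# since the tails beyond that interval are negligible
h = 0.01
xs = [-10.0 + i * h for i in range(2001)]
area = sum(0.5 * (f(a) + f(b)) * h for a, b in zip(xs, xs[1:]))
print(area)  # close to 1
```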
The integration is taken over the whole range (population) of the random variable.
Conditions (4.8) and (4.9) imply that the density function is zero at minus infinity
and plus infinity. The probability

P(x̃ ≤ x) = F(x) = ∫_{−∞}^{x} f(t) dt        (4.10)
is called the cumulative distribution function. It is a nondecreasing function because
of condition (4.8).
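A short sketch makes the definition (4.10) and the nondecreasing property concrete. The standard normal density is again an assumed example, and the lower integration limit of −10 stands in for −∞:

```python
# Cumulative distribution F(x) of (4.10) by numerical integration.
# The standard normal density is an assumed example; -10 stands in for -infinity.
import math

def f(t):
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def F(x, lo=-10.0, n_per_unit=1000):
    # trapezoidal approximation of the integral of f from lo to x
    n = max(1, int((x - lo) * n_per_unit))
    step = (x - lo) / n
    ts = [lo + i * step for i in range(n + 1)]
    return sum(0.5 * (f(a) + f(b)) * step for a, b in zip(ts, ts[1:]))

# F is nondecreasing because f >= 0 (condition (4.8))
vals = [F(x) for x in (-2.0, -1.0, 0.0, 1.0, 2.0)]
assert all(a <= b for a, b in zip(vals, vals[1:]))
print(F(0.0))  # approximately 0.5 for this symmetric density
```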