the usual way, so that
$$
\mathbf{e}^{\mathsf{T}}\mathbf{e} = (\mathbf{b} - A\mathbf{x})^{\mathsf{T}}(\mathbf{b} - A\mathbf{x})
$$
is the sum of squares of the $m$ separate errors. Minimizing this quadratic gives the normal equations
$$
A^{\mathsf{T}}\!A\,\hat{\mathbf{x}} = A^{\mathsf{T}}\mathbf{b}
\qquad\text{or}\qquad
\hat{\mathbf{x}} = (A^{\mathsf{T}}\!A)^{-1}A^{\mathsf{T}}\mathbf{b}, \qquad (8.23)
$$
and the error vector is
$$
\hat{\mathbf{e}} = \mathbf{b} - A\hat{\mathbf{x}}. \qquad (8.24)
$$
The covariance matrix for the parameters $\hat{\mathbf{x}}$ is
$$
\Sigma_{\hat{\mathbf{x}}} = \hat{\sigma}_0^{2}\,(A^{\mathsf{T}}\!A)^{-1}
\qquad\text{with}\qquad
\hat{\sigma}_0^{2} = \frac{\hat{\mathbf{e}}^{\mathsf{T}}\hat{\mathbf{e}}}{m - n}. \qquad (8.25)
$$
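As a sketch, equations (8.23)–(8.25) map directly onto a few lines of NumPy; the matrix $A$ and vector $\mathbf{b}$ below are made-up illustrative data, not values from the text:

```python
import numpy as np

# Illustrative data (not from the text): m = 6 observations, n = 2 parameters.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0],
              [1.0, 5.0]])
b = np.array([0.1, 1.1, 1.9, 3.2, 3.9, 5.1])
m, n = A.shape

# Normal equations (8.23): A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Error vector (8.24): e_hat = b - A x_hat
e_hat = b - A @ x_hat

# Variance of unit weight and parameter covariance (8.25)
sigma0_sq = (e_hat @ e_hat) / (m - n)
Sigma_x = sigma0_sq * np.linalg.inv(A.T @ A)
```

In numerical practice one solves the least-squares problem with a QR-based routine such as `np.linalg.lstsq` rather than forming $A^{\mathsf{T}}\!A$ explicitly, since the explicit product squares the condition number.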
The linearized observation equation (8.21) can be rewritten in a vector formulation:
$$
P_i^k = \rho_{i,0}^k
+ \begin{bmatrix}
-\dfrac{X^k - X_{i,0}}{\rho_{i,0}^k} &
-\dfrac{Y^k - Y_{i,0}}{\rho_{i,0}^k} &
-\dfrac{Z^k - Z_{i,0}}{\rho_{i,0}^k} & 1
\end{bmatrix}
\begin{bmatrix} x_i \\ y_i \\ z_i \\ c\,dt_i \end{bmatrix}
- c\,dt^k + T_i^k + I_i^k + e_i^k. \qquad (8.26)
$$
We rearrange this to resemble the usual formulation of a least-squares problem $A\mathbf{x} = \mathbf{b}$:
$$
\begin{bmatrix}
-\dfrac{X^k - X_{i,0}}{\rho_{i,0}^k} &
-\dfrac{Y^k - Y_{i,0}}{\rho_{i,0}^k} &
-\dfrac{Z^k - Z_{i,0}}{\rho_{i,0}^k} & 1
\end{bmatrix}
\begin{bmatrix} x_i \\ y_i \\ z_i \\ c\,dt_i \end{bmatrix}
= P_i^k - \rho_{i,0}^k + c\,dt^k - T_i^k - I_i^k - e_i^k. \qquad (8.27)
$$
A unique solution cannot be found from a single equation. Let
$$
b_i^k = P_i^k - \rho_{i,0}^k + c\,dt^k - T_i^k - I_i^k.
$$
Then the final solution comes from
$$
\underbrace{\begin{bmatrix}
-\dfrac{X^1 - X_{i,0}}{\rho_{i,0}^1} & -\dfrac{Y^1 - Y_{i,0}}{\rho_{i,0}^1} & -\dfrac{Z^1 - Z_{i,0}}{\rho_{i,0}^1} & 1 \\[4pt]
-\dfrac{X^2 - X_{i,0}}{\rho_{i,0}^2} & -\dfrac{Y^2 - Y_{i,0}}{\rho_{i,0}^2} & -\dfrac{Z^2 - Z_{i,0}}{\rho_{i,0}^2} & 1 \\[4pt]
-\dfrac{X^3 - X_{i,0}}{\rho_{i,0}^3} & -\dfrac{Y^3 - Y_{i,0}}{\rho_{i,0}^3} & -\dfrac{Z^3 - Z_{i,0}}{\rho_{i,0}^3} & 1 \\
\vdots & \vdots & \vdots & \vdots \\
-\dfrac{X^m - X_{i,0}}{\rho_{i,0}^m} & -\dfrac{Y^m - Y_{i,0}}{\rho_{i,0}^m} & -\dfrac{Z^m - Z_{i,0}}{\rho_{i,0}^m} & 1
\end{bmatrix}}_{A}
\underbrace{\begin{bmatrix} x_{i,1} \\ y_{i,1} \\ z_{i,1} \\ c\,dt_{i,1} \end{bmatrix}}_{\mathbf{x}}
= \underbrace{\begin{bmatrix} b_i^1 \\ b_i^2 \\ b_i^3 \\ \vdots \\ b_i^m \end{bmatrix}}_{\mathbf{b}}
- \underbrace{\begin{bmatrix} e_i^1 \\ e_i^2 \\ e_i^3 \\ \vdots \\ e_i^m \end{bmatrix}}_{\mathbf{e}}. \qquad (8.28)
$$
If $m \ge 4$, there is a unique solution $(x_{i,1},\, y_{i,1},\, z_{i,1},\, c\,dt_{i,1})$, for $m > 4$ in the least-squares sense. This has to be added to the approximate receiver position to get the next approximate position:
$$
X_{i,1} = X_{i,0} + x_{i,1}, \qquad
Y_{i,1} = Y_{i,0} + y_{i,1}, \qquad
Z_{i,1} = Z_{i,0} + z_{i,1}. \qquad (8.29)
$$
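One iteration of (8.26)–(8.29) can be sketched in NumPy as follows. The function name `position_step`, the pre-corrected observable `b_obs` (the pseudorange with satellite clock, troposphere, and ionosphere already applied, i.e. $P_i^k + c\,dt^k - T_i^k - I_i^k$), and the sample geometry in the usage below are all assumptions for illustration, not the text's own code:

```python
import numpy as np

def position_step(sat_pos, b_obs, x0):
    """One iteration of (8.26)-(8.29).

    sat_pos : (m, 3) satellite coordinates (X^k, Y^k, Z^k), m >= 4
    b_obs   : (m,) corrected pseudoranges P_i^k + c dt^k - T_i^k - I_i^k
    x0      : (3,) approximate receiver position (X_i0, Y_i0, Z_i0)
    Returns the next approximate position (8.29) and the receiver
    clock term c dt_i.
    """
    rho0 = np.linalg.norm(sat_pos - x0, axis=1)        # rho_{i,0}^k
    # Rows of A in (8.28): [-(X^k - X_i0)/rho0, ..., 1]
    A = np.hstack([-(sat_pos - x0) / rho0[:, None],
                   np.ones((len(rho0), 1))])
    b = b_obs - rho0                                   # rho0 moved to the RHS
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x0 + sol[:3], sol[3]                        # update (8.29)
```

Repeating the step a handful of times from any reasonable starting point converges quickly, because the pseudorange model is nearly linear over the size of each correction.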