The total differential of the quadratic form

$$\phi = x^T A x \tag{A.68}$$

is, according to (A.67),

$$d\phi = 2\, x^T A \, dx \tag{A.69}$$
The gradient of φ with respect to x is

$$\frac{\partial \phi}{\partial x} = \begin{bmatrix} \dfrac{\partial \phi}{\partial x_1} & \cdots & \dfrac{\partial \phi}{\partial x_u} \end{bmatrix}^T = 2 A x \tag{A.70}$$
Equation (A.70) can be readily verified by computing the partial derivative ∂φ/∂x_t for the t-th component,

$$\frac{\partial \phi}{\partial x_t} = \frac{\partial}{\partial x_t} \left( \sum_{i=1}^{u} \sum_{j=1}^{u} x_i x_j a_{ij} \right) = \sum_{j=1}^{u} x_j a_{tj} + \sum_{i=1}^{u} x_i a_{it} = 2 \sum_{j=1}^{u} x_j a_{tj} = \left[ 2 A x \right]_t \tag{A.71}$$

because A is symmetric.
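As a quick numerical check of (A.70) and (A.71), the sketch below (a minimal example assuming NumPy is available; the matrix and variable names are illustrative, not from the text) compares the analytic gradient 2Ax with central finite differences of φ for a random symmetric A.

```python
import numpy as np

rng = np.random.default_rng(42)
u = 4
M = rng.standard_normal((u, u))
A = (M + M.T) / 2.0           # symmetrize, since (A.70) relies on A = A^T
x = rng.standard_normal(u)

def phi(x):
    return x @ A @ x          # the quadratic form of Eq. (A.68)

# Central finite differences approximate each component d(phi)/dx_t of (A.71)
eps = 1e-6
e = np.eye(u)
grad_fd = np.array([(phi(x + eps * e[t]) - phi(x - eps * e[t])) / (2 * eps)
                    for t in range(u)])

grad_analytic = 2 * A @ x     # Eq. (A.70)
print(np.allclose(grad_fd, grad_analytic, atol=1e-6))   # True
```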
Equation (A.70) is the foundation for deriving least-squares solutions, which requires locating the stationary point (minimum) of a quadratic function. The procedure is to take the partial derivatives with respect to all variables and equate them to zero. While the details of the least-squares derivations are given in Chapter 4, the following example serves to demonstrate the principle of minimization using matrix notation.
Let B denote an n × u rectangular matrix with n > u, let ℓ be an n × 1 vector, and let P be an n × n symmetric weight matrix that can include the special case P = I. The elements of B, ℓ, and P are constants. The least-squares solution of

$$v = B x + \ell \tag{A.72}$$
requires φ(x) = v^T P v = min. First, we compute the gradient (column vector)

$$\frac{\partial \left( v^T P v \right)}{\partial x} = \frac{\partial \left[ (B x + \ell)^T P (B x + \ell) \right]}{\partial x} = \frac{\partial \left[ x^T B^T P B x + 2\, \ell^T P B x + \ell^T P \ell \right]}{\partial x} = 2 B^T P B x + 2 B^T P \ell \tag{A.73}$$

and equate it to zero.
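Setting (A.73) to zero gives the normal equations B^T P B x = −B^T P ℓ; that final step is not shown in this excerpt, but the following minimal sketch (assuming NumPy; all names are illustrative) carries the minimization through numerically and verifies that the gradient vanishes at the solution.

```python
import numpy as np

rng = np.random.default_rng(7)
n, u = 8, 3                        # n > u, as required
B = rng.standard_normal((n, u))
ell = rng.standard_normal(n)       # the n x 1 vector denoted ℓ above
P = np.eye(n)                      # special case P = I

# Solve B^T P B x = -B^T P ell instead of forming a matrix inverse
x_hat = np.linalg.solve(B.T @ P @ B, -B.T @ P @ ell)

# The gradient (A.73) should vanish at the stationary point x_hat
grad = 2 * B.T @ P @ B @ x_hat + 2 * B.T @ P @ ell
print(np.allclose(grad, np.zeros(u)))   # True
```

Using np.linalg.solve on the normal equations avoids computing (B^T P B)^{-1} explicitly, which is both cheaper and numerically safer than forming the inverse.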
 