Given n vectors x_1, x_2, ..., x_n of a real Hilbert space H, we know that their linear combinations with real coefficients provide a closed subspace K of H (a subset of H closed under the vector operations). Therefore, for a given v ∈ H, the vector of K closest to v has to minimize the norm ||v − c_1 x_1 − c_2 x_2 − ⋯ − c_n x_n|| over the coefficients c_1, ..., c_n.
According to the projection theorem, the unique minimizing vector x_0 ∈ K has to be such that v − x_0 is orthogonal to K. Therefore, for i = 1, 2, ..., n:

$$
\bigl( (v - c_1 x_1 - c_2 x_2 - \dots - c_n x_n) \mid x_i \bigr) = 0 .
$$
This means:
$$
\begin{aligned}
c_1 (x_1 \mid x_1) + c_2 (x_2 \mid x_1) + \dots + c_n (x_n \mid x_1) &= (v \mid x_1)\\
c_1 (x_1 \mid x_2) + c_2 (x_2 \mid x_2) + \dots + c_n (x_n \mid x_2) &= (v \mid x_2)\\
&\;\;\vdots\\
c_1 (x_1 \mid x_n) + c_2 (x_2 \mid x_n) + \dots + c_n (x_n \mid x_n) &= (v \mid x_n)
\end{aligned}
\tag{7.16}
$$
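For instance, in the simplest case n = 1, system (7.16) reduces to the single equation c_1 (x_1 | x_1) = (v | x_1), recovering the familiar formula for the orthogonal projection of v onto the line spanned by x_1:

$$
x_0 = \frac{(v \mid x_1)}{(x_1 \mid x_1)}\, x_1 .
$$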
The matrix G that is the transpose of the coefficient matrix of system (7.16) is called the Gram matrix of the n vectors generating the vector space K; its entries are G_ij = (x_i | x_j).
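Since the inner product of a real Hilbert space is symmetric, (x_i | x_j) = (x_j | x_i), so G coincides with its transpose and system (7.16) can be written compactly as Gc = b, with c = (c_1, ..., c_n)^T and b_i = (v | x_i):

$$
\begin{pmatrix}
(x_1 \mid x_1) & (x_2 \mid x_1) & \cdots & (x_n \mid x_1)\\
(x_1 \mid x_2) & (x_2 \mid x_2) & \cdots & (x_n \mid x_2)\\
\vdots & \vdots & \ddots & \vdots\\
(x_1 \mid x_n) & (x_2 \mid x_n) & \cdots & (x_n \mid x_n)
\end{pmatrix}
\begin{pmatrix} c_1\\ c_2\\ \vdots\\ c_n \end{pmatrix}
=
\begin{pmatrix} (v \mid x_1)\\ (v \mid x_2)\\ \vdots\\ (v \mid x_n) \end{pmatrix} .
$$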
Given a set of m equations in n unknowns that expresses a linear combination of n linearly independent vectors of ℝ^m with some (unknown) coefficients, there exists a unique n-vector of these coefficients providing the best approximation to a given m-vector v. The uniqueness of this n-vector, and its algebraic form, are given by the theorem of Table 7.5, a natural consequence of the projection theorem for Hilbert spaces, and the solution is expressed in terms of the Gram matrix of the vectors.
Table 7.5 The Least-Square Evaluation of matrix W

Least-Square-Estimate. Let W be an m × n matrix (m > n) with linearly independent column vectors, and let Wz be the matrix product of W by z ∈ ℝ^n. Then, for any m-vector v ∈ ℝ^m, there exists a unique vector z_0 ∈ ℝ^n minimizing ||v − Wz|| (the Euclidean m-dimensional norm) over all z ∈ ℝ^n. Moreover, if W^T denotes the transpose of W, this vector z_0 is given by z_0 = (W^T W)^{-1} W^T v.
The existence and uniqueness of the vector z_0, as the Least-Square-Estimate claims (see Table 7.5), follow directly from the projection theorem. Moreover, the Gram matrix corresponding to the column vectors of matrix W is easily seen to be W^T W (see the left sides of Eqs. (7.16)), and W^T v corresponds to the right-hand vector of Eqs. (7.16). Since the columns of W are linearly independent, W^T W is invertible, which justifies the formula z_0 = (W^T W)^{-1} W^T v.
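The formula of Table 7.5 is easy to check numerically. The following is a minimal Python/NumPy sketch (the matrix W and vector v are illustrative random data, not taken from the text): it computes z_0 by solving the normal equations (W^T W) z_0 = W^T v instead of forming the inverse explicitly, then verifies the result against NumPy's built-in least-squares routine and checks the orthogonality condition underlying Eqs. (7.16).

```python
import numpy as np

# Minimal sketch of the Least-Square-Estimate of Table 7.5:
# z0 = (W^T W)^{-1} W^T v for a full-column-rank m x n matrix W (m > n).
# W and v below are illustrative random data, not taken from the text.
rng = np.random.default_rng(0)
m, n = 6, 3
W = rng.standard_normal((m, n))   # columns linearly independent (almost surely)
v = rng.standard_normal(m)

G = W.T @ W                        # Gram matrix of the columns of W
z0 = np.linalg.solve(G, W.T @ v)   # solve (W^T W) z0 = W^T v

# Cross-check against NumPy's built-in least-squares solver.
z_ref, *_ = np.linalg.lstsq(W, v, rcond=None)
assert np.allclose(z0, z_ref)

# The residual v - W z0 is orthogonal to every column of W,
# which is exactly the condition expressed by Eqs. (7.16).
assert np.allclose(W.T @ (v - W @ z0), 0)
```

In practice, solving the normal equations as a linear system, or using an SVD-based routine such as np.linalg.lstsq, is numerically preferable to computing (W^T W)^{-1} explicitly.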
 