This means that we would choose the solution of (5.7) with minimum Euclidean norm. Then, given a particular solution ŵ_p, we need to search over the vectors ŵ_h ∈ N(C) that lead to ŵ = ŵ_p + ŵ_h having minimum norm. Although this seems to be difficult, the fact is that the solution to problem (5.11) is very simple and is given by

    ŵ = C† d,    (5.12)
where C† denotes the Moore-Penrose pseudoinverse of C. The pseudoinverse of a general rectangular matrix C is a generalization of the concept of inverse. In the most general case, it can be obtained from the singular value decomposition (SVD) of the matrix C [5]. The SVD is a very powerful technique to decompose a general rectangular matrix.
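As a minimal numerical sketch of Eq. (5.12) (not the book's code; NumPy is assumed here for illustration), the pseudoinverse applied to d yields the solution of the underdetermined system C w = d with minimum Euclidean norm, and adding any nonzero null-space component ŵ_h only increases the norm:

```python
import numpy as np

# Underdetermined system: 3 equations, 5 unknowns (full row rank w.h.p.).
rng = np.random.default_rng(0)
C = rng.standard_normal((3, 5))
d = rng.standard_normal(3)

# Minimum-norm solution, Eq. (5.12): w = C† d.
w_min = np.linalg.pinv(C) @ d

# A basis of the null space N(C), taken from the last rows of V^T in the SVD.
null_basis = np.linalg.svd(C)[2][3:].T          # shape (5, 2)

# Any other solution w_min + w_h, with w_h in N(C), also satisfies C w = d
# but has strictly larger Euclidean norm.
w_other = w_min + null_basis @ np.array([1.0, -0.5])

assert np.allclose(C @ w_min, d)
assert np.allclose(C @ w_other, d)
assert np.linalg.norm(w_min) < np.linalg.norm(w_other)
```

This works because ŵ_p = C† d is orthogonal to N(C), so the squared norms of the two components simply add.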
5.1.1.1 Singular Value Decomposition and Pseudoinverse
Let C ∈ R^(n×L) be an arbitrary rectangular matrix. Matrix C can be written as:

If L ≤ n:

    C = U [Σ; 0] Vᵀ,    Σ ∈ R^(L×L).    (5.13)

If L ≥ n:

    C = U [Σ 0] Vᵀ,    Σ ∈ R^(n×n).    (5.14)

Here [Σ; 0] denotes the zero block stacked below Σ, while [Σ 0] places it to the right of Σ,
with U ∈ R^(n×n) and V ∈ R^(L×L) being orthonormal matrices⁶ whose columns are the eigenvectors of CCᵀ and CᵀC, respectively. The square matrix Σ is diagonal with non-negative entries, that is:

    Σ = diag(σ₁, σ₂, …, σ_K, 0, …, 0),    (5.15)
where σᵢ², i = 1, …, K, with K = rank(C) ≤ min(n, L), are the non-null eigenvalues of either CCᵀ or CᵀC.
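The decomposition in Eqs. (5.13)-(5.15) can be checked numerically; the following is a minimal sketch (not from the book; NumPy is an assumption) that reconstructs C from its full SVD for both shapes and verifies that the squared singular values are the non-null eigenvalues of CCᵀ:

```python
import numpy as np

rng = np.random.default_rng(0)

for n, L in [(5, 3), (3, 5)]:              # L <= n, then L >= n
    C = rng.standard_normal((n, L))
    U, s, Vt = np.linalg.svd(C)            # full U: n x n, Vt: L x L

    # Pad Sigma with zeros to n x L: [Sigma; 0] when L <= n, [Sigma 0] when L >= n.
    Sigma = np.zeros((n, L))
    Sigma[:len(s), :len(s)] = np.diag(s)
    assert np.allclose(U @ Sigma @ Vt, C)  # Eqs. (5.13)/(5.14)

    # U and V are orthonormal (footnote 6).
    assert np.allclose(U.T @ U, np.eye(n))
    assert np.allclose(Vt @ Vt.T, np.eye(L))

    # sigma_i^2, i = 1..K, are the non-null eigenvalues of C C^T.
    K = np.linalg.matrix_rank(C)
    eig = np.sort(np.linalg.eigvalsh(C @ C.T))[::-1]
    assert np.allclose(eig[:K], s[:K] ** 2)
```

Note that a random Gaussian matrix has full rank with probability one, so K = min(n, L) in this sketch.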
The SVD is a very important tool in linear algebra, matrix analysis and signal processing. The reader interested in more details on the SVD can see [5, 6] and the references therein. The pseudoinverse C† can be written in terms of the SVD as [7]:

If L ≤ n:

    C† = V [Σ⁻¹ 0] Uᵀ.    (5.16)

If L ≥ n:

    C† = V [Σ⁻¹; 0] Uᵀ.    (5.17)
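A minimal sketch of Eq. (5.16) for the case L ≤ n (not the book's code; NumPy and full rank are assumptions): assembling C† from the SVD factors reproduces the pseudoinverse, and for full column rank it acts as a left inverse of C:

```python
import numpy as np

rng = np.random.default_rng(1)
n, L = 5, 3                        # L <= n; full column rank w.h.p.
C = rng.standard_normal((n, L))

U, s, Vt = np.linalg.svd(C)        # full U: n x n, Vt: L x L

# [Sigma^{-1}  0] is L x n: Sigma^{-1} on the left, zeros on the right.
Sigma_inv_block = np.zeros((L, n))
Sigma_inv_block[:L, :L] = np.diag(1.0 / s)

# Eq. (5.16): C† = V [Sigma^{-1} 0] U^T.
C_dagger = Vt.T @ Sigma_inv_block @ U.T

assert np.allclose(C_dagger, np.linalg.pinv(C))
assert np.allclose(C_dagger @ C, np.eye(L))   # left inverse when rank(C) = L
```

The case L ≥ n of Eq. (5.17) is analogous, with the zero block stacked below Σ⁻¹.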
⁶ UᵀU = UUᵀ = I_n and VᵀV = VVᵀ = I_L.