and matrices P, Q, and S are given by the singular value decomposition (SVD) of D_Φ [13]

$$D_\Phi = Q \begin{bmatrix} S & 0 \\ 0 & 0 \end{bmatrix} P^T \qquad (4.8)$$

where P, Q are orthogonal matrices, and S = diag{s_1, s_2, ..., s_r} denotes the singular values of D_Φ, assuming the rank of D_Φ is r and s_1 ≥ s_2 ≥ ... ≥ s_r > 0.
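As a quick sketch, the decomposition in (4.8) can be computed with NumPy's SVD. The matrix `D_phi` below is a made-up rank-deficient example, not a matrix from the text:

```python
import numpy as np

# Illustrative sketch of (4.8): SVD of a synthetic rank-deficient data matrix.
rng = np.random.default_rng(0)
D_phi = rng.standard_normal((6, 4))
D_phi[:, 3] = D_phi[:, 0] + D_phi[:, 1]   # force rank r = 3

# NumPy returns D_phi = Q @ diag(s) @ PT, with singular values sorted in
# decreasing order, matching the ordering s_1 >= s_2 >= ... assumed in (4.8).
Q, s, PT = np.linalg.svd(D_phi)
r = int(np.sum(s > 1e-10 * s[0]))         # numerical rank
print(r)                                  # prints 3
```

Here `Q` and `PT` play the roles of Q and P^T in (4.8), and the numerical rank is read off by thresholding the singular values relative to the largest one.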
Remarks:
The solution (4.6) includes (4.5) as a special case, and together they can be expressed as

$$W = R_\Phi^{+}\, p_\Phi \qquad (4.9)$$
In an ideal world, the designer's job is finished with ( 4.9 ), but in the real world more cau-
tion should be taken. The fundamental issues are the finite precision of our computers and the
noise—universal thermal noise, measurement errors, modeling uncertainty, etc. For example, when
the modeling uncertainty v(n) in (4.1) is white with zero mean and constant variance σ², the covariance matrix of the LS estimate (4.6) is [14]

$$\operatorname{cov}[\hat{W}] = \sigma^2\, P \begin{bmatrix} S^{-2} & 0 \\ 0 & 0 \end{bmatrix} P^T \qquad (4.10)$$
If some singular values are very small, the variance of the estimate becomes prohibitively large. At the same time, very small singular values cause catastrophic numerical errors because of the finite precision [13]. The typical symptoms of this type of ill-posedness are large-norm solutions and extreme sensitivity to small changes in the training data.
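Both symptoms can be demonstrated numerically. In the sketch below, a synthetic matrix (all values are illustrative, not from the text) is built with one near-zero singular value; the pseudoinverse LS solution then has a large norm and changes drastically under a tiny perturbation of the data:

```python
import numpy as np

# Sketch of the ill-posedness symptoms: a near-zero singular value makes the
# pseudoinverse LS solution large in norm and hypersensitive to tiny data
# perturbations. All matrices here are synthetic examples.
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))
s = np.array([1.0, 0.5, 1e-8])            # one tiny singular value
D = (U[:, :3] * s) @ V.T                  # D = U_r diag(s) V^T
y = rng.standard_normal(5)

w = np.linalg.pinv(D) @ y                 # LS solution via pseudoinverse
# Perturb y by 1e-6 along the direction paired with the tiny singular value:
w_pert = np.linalg.pinv(D) @ (y + 1e-6 * U[:, 2])

print(np.linalg.norm(w))                  # large-norm solution
print(np.linalg.norm(w - w_pert))         # ~1e-6 / 1e-8 = 100
```

The amplification factor is exactly the reciprocal of the smallest retained singular value, which is why truncating or damping those directions (as regularization does) restores stability.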
The Tikhonov regularization is widely used to address this problem. Because small norm
solutions are desirable, a regularization term is introduced in the LS cost function which penalizes
the solution norm
$$W_{\mathrm{reg}} = \arg\min_{W}\; \| y - D_\Phi W \|_2^2 + \lambda \| W \|_F^2 \qquad (4.11)$$
In the Bayesian interpretation, the squared-error term is the likelihood, and the regularization term expresses prior knowledge about the norm of the solution; the L2 norm corresponds to a Gaussian prior [14].
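This correspondence can be made explicit. Assuming Gaussian noise with variance σ² and an i.i.d. zero-mean Gaussian prior on the entries of W with variance τ² (the symbols σ², τ² here are illustrative, not fixed by the text), the negative log-posterior is

$$-\log p(W \mid y) = \frac{1}{2\sigma^2}\,\|y - D_\Phi W\|_2^2 + \frac{1}{2\tau^2}\,\|W\|_F^2 + \mathrm{const},$$

so minimizing it reproduces the cost in (4.11) with λ = σ²/τ².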
Theorem 4.3: (Tikhonov regularization [9]) The Tikhonov regularized LS solution is

$$W_{\mathrm{TR}} = \left( D_\Phi^T D_\Phi + \lambda I \right)^{-1} D_\Phi^T y \qquad (4.12)$$
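A minimal sketch of Theorem 4.3, with a synthetic `D_phi` and `y` (assumed examples, not data from the text): the direct formula (4.12) is evaluated and cross-checked against its equivalent SVD form, where each singular direction is damped by the filter factor s_i/(s_i² + λ) instead of being amplified by 1/s_i:

```python
import numpy as np

# Sketch of the Tikhonov regularized LS solution (4.12):
#   W_TR = (D^T D + lambda I)^{-1} D^T y
rng = np.random.default_rng(2)
D_phi = rng.standard_normal((8, 4))
y = rng.standard_normal(8)
lam = 0.1

# Direct evaluation of (4.12)
w_tr = np.linalg.solve(D_phi.T @ D_phi + lam * np.eye(4), D_phi.T @ y)

# Equivalent SVD form: filter factors s / (s^2 + lambda) damp the directions
# with small singular values, avoiding the 1/s blow-up of the pseudoinverse.
Q, s, PT = np.linalg.svd(D_phi, full_matrices=False)
w_svd = PT.T @ ((s / (s**2 + lam)) * (Q.T @ y))

print(np.allclose(w_tr, w_svd))           # prints True
```

The SVD form makes the regularizing effect transparent: as λ → 0 the filter factors approach 1/s_i and (4.12) reverts to the unregularized solution, while larger λ trades bias for a smaller solution norm and variance.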
 