1 TOTAL LEAST SQUARES PROBLEMS
1.1 INTRODUCTION
The problem of linear parameter estimation gives rise to an overdetermined set
of linear equations Ax ≈ b, where A is the data matrix and b is the observation
vector. In the (classical) least squares (LS) approach there is the underlying
assumption that all errors are confined to the observation vector. This assumption
is often unrealistic: the data matrix is not error-free because of sampling errors,
human errors, modeling errors, and instrument errors. Methods for estimating the
effect of such errors on the LS solution are given in [90] and [177]. The method
of total least squares (TLS) is a technique devised to compensate for data errors.
It was introduced in [74], where it was solved using the singular value
decomposition (SVD), as pointed out in [76] and more fully in [71]. Geometric
analysis of the SVD led Staar [176] to the same idea. This method of fitting has
a long history in the statistical literature, where it is known as orthogonal
regression or errors-in-variables (EIV) regression. Indeed, the univariate
line-fitting problem had already been considered in the nineteenth century [3]. Some
important contributors are Pearson [151], Koopmans [108], Madansky [126],
and York [197]. About 40 years ago, the technique was extended to multivariate
problems and later to multidimensional problems (which deal with more than one
observation vector b; see, e.g., [70,175]).
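Since the SVD-based solution is only mentioned above, a minimal sketch may help fix ideas. The following NumPy code (the function name tls and the synthetic data are illustrative, not from the source) computes the basic TLS estimate in the generic case from the right singular vector of the augmented matrix [A b] associated with its smallest singular value, and compares it with ordinary LS on data whose matrix A is also perturbed by noise.

```python
import numpy as np

def tls(A, b):
    """Basic total least squares solution of Ax ≈ b via the SVD.

    Seeks the smallest perturbation [dA | db] (in Frobenius norm)
    such that (A + dA) x = b + db is consistent.
    """
    m, n = A.shape
    C = np.column_stack([A, b])      # augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(C)      # rows of Vt are right singular vectors
    v = Vt[-1]                       # singular vector of the smallest singular value
    if np.isclose(v[n], 0.0):
        # Last component zero: the generic TLS solution does not exist.
        raise np.linalg.LinAlgError("nongeneric TLS problem")
    return -v[:n] / v[n]             # scale so that the last component equals -1

# Small comparison with ordinary LS when A itself is noisy.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])
A_exact = rng.standard_normal((100, 2))
b_exact = A_exact @ x_true
A_noisy = A_exact + 0.05 * rng.standard_normal(A_exact.shape)
b_noisy = b_exact + 0.05 * rng.standard_normal(b_exact.shape)

x_ls, *_ = np.linalg.lstsq(A_noisy, b_noisy, rcond=None)
x_tls = tls(A_noisy, b_noisy)
print("LS: ", x_ls)
print("TLS:", x_tls)
```

Both estimates recover x_true approximately; the point of TLS is that its error model treats perturbations in A and b symmetrically, whereas LS attributes all error to b.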