88. R.R. Hocking. The analysis and selection of variables in linear regression. Biometrics, 32:1-49, 1976.
89. R.R. Hocking. Developments in linear regression methodology: 1959-1982. Technometrics, 25:219-230, 1983.
90. S.D. Hodges and P.G. Moore. Data uncertainties and least squares regression. Appl. Stat., 21:185-195, 1972.
91. P. Huber. Robust Statistics. Wiley, New York, 1981.
92. S. Van Huffel. Analysis of the Total Least Squares Problem and Its Use in Parameter Estimation. Ph.D. dissertation, Department of Electrical Engineering, Katholieke Universiteit Leuven, Leuven, Belgium, 1987.
93. S. Van Huffel. On the significance of nongeneric total least squares problems. SIAM J. Matrix Anal. Appl., 13(1):20-35, 1992.
94. S. Van Huffel. TLS applications in biomedical signal processing. In S. Van Huffel, ed., Recent Advances in Total Least Squares Techniques and Errors-in-Variables Modeling. SIAM Proceedings Series. SIAM, Philadelphia, 1997.
95. S. Van Huffel and J. Vandewalle. Subset selection using the total least squares approach in collinearity problems with errors in the variables. Linear Algebra Appl., 88-89:695-714, 1987.
96. S. Van Huffel and J. Vandewalle. The partial total least squares algorithm. J. Comput. Appl. Math., 21:333-341, 1988.
97. S. Van Huffel and J. Vandewalle. Analysis and properties of the generalized total least squares problem AX ≈ B when some or all columns of A are subject to errors. SIAM J. Matrix Anal. Appl., 10:294-315, 1989.
98. S. Van Huffel and J. Vandewalle. The Total Least Squares Problem: Computational Aspects and Analysis. Frontiers in Applied Mathematics. SIAM, Philadelphia, 1991.
99. S. Van Huffel, J. Vandewalle, and A. Haegemans. An efficient and reliable algorithm for computing the singular subspace of a matrix, associated with its smallest singular values. J. Comput. Appl. Math., 19:313-330, 1987.
100. S. Van Huffel, J. Vandewalle, M.C. De Roo, and J.L. Willems. Reliable and efficient deconvolution technique based on total linear least squares for calculating the renal retention function. Med. Biol. Eng. Comput., 25:26-33, 1987.
101. R.A. Jacobs. Increased rates of convergence through learning rate adaptation. Neural Netw., 1(4):295-307, 1988.
102. C.J. Tsai, N.P. Galatsanos, and A.K. Katsaggelos. Total least squares estimation of stereo optical flow. Proceedings of the IEEE International Conference on Image Processing, Vol. II, pp. 622-626, 1998.
103. E.M. Johannson, F.U. Dowla, and D.M. Goodman. Backpropagation learning for multilayer feedforward neural networks using the conjugate gradient method. Int. J. Neural Syst., 2(4):291-301, 1992.
104. J.H. Justice and A.A. Vassiliou. Diffraction tomography for geophysical monitoring of hydrocarbon reservoirs. Proc. IEEE, 78:711-722, 1990.
105. G. Kelly. The influence function in the errors in variables problem. Ann. Stat., 12:87-100, 1984.
106. R.H. Ketellapper. On estimating parameters in a simple linear errors-in-variables model. Technometrics, 25:43-47, 1983.