$$
G(\text{answers}, X_i) \approx G(\text{guesses}, X_i) + \sum_j \frac{\partial G(\text{guesses}, X_i)}{\partial\,\text{guess}_j}\,(\text{answer}_j - \text{guess}_j),
\qquad \text{(8-35)}
$$
and the equations used for the computation now become
$$
\sum_j \frac{\partial G(\text{guesses}, X_i)}{\partial\,\text{guess}_j}\,(\text{answer}_j - \text{guess}_j) = Y_i - G(\text{guesses}, X_i).
\qquad \text{(8-36)}
$$
As before, there are as many such equations as there are experimental data points. The index $j$ takes as many values as the number of model parameters. Equation (8-7), for example, has two parameters, and, if it were being fit to 100 data points, then Eq. (8-36) would actually be 100 equations (one for each data point) in two unknowns ($K_{21}$ and $K_{22}$).
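To make the procedure concrete, the following short Python/NumPy sketch carries out Gauss-Newton iterations built from Eq. (8-36). The model function, its two parameters, and the synthetic data are illustrative assumptions only (they are not the model of Eq. (8-7)); what the sketch reproduces from the text is the structure of the calculation: one linearized equation per data point, solved in the least-squares sense for the parameter corrections.

import numpy as np

def G(params, x):
    # Hypothetical two-parameter model, used only for illustration
    # (not the model of Eq. (8-7)): an exponential decay A * exp(-k*x).
    A, k = params
    return A * np.exp(-k * x)

def dG_dparams(params, x, h=1e-6):
    # Partial derivatives dG(guesses, X_i)/d(guess_j), estimated by
    # central finite differences; column j corresponds to parameter j.
    p = np.asarray(params, dtype=float)
    cols = []
    for j in range(p.size):
        step = np.zeros_like(p)
        step[j] = h
        cols.append((G(p + step, x) - G(p - step, x)) / (2.0 * h))
    return np.column_stack(cols)

def gauss_newton_step(guesses, x, y):
    # Eq. (8-36): for every data point i,
    #   sum_j dG/dguess_j * (answer_j - guess_j) = Y_i - G(guesses, X_i).
    # With 100 data points and 2 parameters this is 100 equations in
    # 2 unknowns, solved here in the least-squares sense.
    J = dG_dparams(guesses, x)          # one row per data point, one column per parameter
    resid = y - G(guesses, x)           # right-hand side: Y_i - G(guesses, X_i)
    corrections, *_ = np.linalg.lstsq(J, resid, rcond=None)
    return guesses + corrections        # the "answers" of this iteration

# Illustrative use: 100 synthetic data points, two unknown parameters.
rng = np.random.default_rng(1)
x = np.linspace(0.1, 5.0, 100)
y = G((3.0, 1.2), x) + rng.normal(scale=0.02, size=x.size)

guesses = np.array([2.0, 0.5])
for _ in range(8):                      # iterate until the corrections become negligible
    guesses = gauss_newton_step(guesses, x, y)
print(guesses)                          # should approach the values used to generate the data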
EXERCISE 8-5
Identify matrices $P$, $Y^*$, and $e$ such that Eq. (8-36) can be written in matrix form as $P e = Y^*$.
IV. WEIGHTED LEAST-SQUARES CRITERION AND THE GAUSS-NEWTON METHODS FOR WEIGHTED LEAST SQUARES
Thus far, we have not tried to account for different measurement errors
in the data points. The most common approach to situations where the
experimental measurements are known with different degrees of
accuracy is to apply a weighted least-squares parameter estimation criterion.
Under this criterion, model parameters are calculated so they minimize
the following weighted sum of squared residuals (WSSR):
$$
\text{WSSR} = \sum_i \left( \frac{Y_i - G(\text{parameters}, X_i)}{\text{SEM}_i} \right)^2
= \sum_i \bigl[\, W_i \,( Y_i - G(\text{parameters}, X_i)) \,\bigr]^2
= \sum_i r_i^2,
\qquad \text{(8-37)}
$$
where the weights, $W_i$, are reciprocal to the measurement errors. Thus, the larger the measurement error for a particular data point, the smaller the weight $W_i$, and consequently the smaller this point's contribution to the WSSR. The residuals, $r_i$, are defined to be the weighted differences between the data points and the fitted curve.
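As a concrete illustration, here is a short Python/NumPy sketch of Eq. (8-37). The straight-line model and the numerical values are assumptions made only for the example; the relationships taken from the text are that the weights are reciprocal to the measurement errors, $W_i = 1/\text{SEM}_i$, and that the weighted residuals are $r_i = W_i(Y_i - G(\text{parameters}, X_i))$.

import numpy as np

def wssr(params, model, x, y, sem):
    # Eq. (8-37): WSSR = sum_i [ (Y_i - G(parameters, X_i)) / SEM_i ]^2
    #                  = sum_i [ W_i (Y_i - G(parameters, X_i)) ]^2 = sum_i r_i^2
    w = 1.0 / sem                       # weights W_i, reciprocal to the measurement errors
    r = w * (y - model(params, x))      # weighted residuals r_i
    return float(np.sum(r ** 2))

# Hypothetical example: a straight-line model with per-point standard errors.
def line(params, x):
    a, b = params
    return a * x + b

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
sem = np.array([0.1, 0.1, 0.5, 0.1, 0.5])   # larger SEM -> smaller weight -> smaller contribution

print(wssr((2.0, 1.0), line, x, y, sem))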