To test the regression we compute

F = 1832.9696 / 19.8240 = 92.46, with 3/24 df,

which is significant at the 0.01 level.
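As a quick check of the arithmetic, the F ratio can be recomputed from the mean squares quoted above (a minimal sketch; the two mean squares are taken directly from the worked example):

```python
# Mean squares from the worked example:
# reduction mean square with 3 df, residual mean square with 24 df.
reduction_ms = 1832.9696
residual_ms = 19.8240

F = reduction_ms / residual_ms
print(round(F, 2))  # -> 92.46
```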
Often we will want to test individual terms of the regression. In the previous example we might want to test the hypothesis that the true value of b3 is zero. This would be equivalent to testing whether the variable X3 makes any contribution to the prediction of Y. If we decide that b3 may be equal to zero, we might rewrite the equation in terms of X1 and X2 alone. Similarly, we could test the hypothesis that b1 and b3 are both equal to zero.
To test the contribution of any set of the independent variables in the presence of the remaining
variables:
1. Fit all independent variables and compute the reduction and residual sums of squares.
2. Fit a new regression that includes only the variables not being tested. Compute the reduction due to this regression.
3. The reduction obtained in the first step minus the reduction in the second step is the gain
due to the variables being tested.
4. The mean square for the gain (step 3) is tested against the mean square residual from the
first step.
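The four steps above can be sketched in Python. This is an illustrative helper only (the name `gain_f_test` is hypothetical, and numpy least squares stands in for solving the normal equations):

```python
import numpy as np

def gain_f_test(X_full, X_reduced, y):
    """F ratio for the gain due to the variables dropped from X_full.

    Follows the four steps in the text: fit both regressions, take the
    difference in reduction sums of squares as the gain, and test its
    mean square against the residual mean square from the full fit.
    """
    n = len(y)

    def reduction_ss(X):
        # Fit by least squares with an intercept column.
        Xa = np.column_stack([np.ones(n), X])
        beta, *_ = np.linalg.lstsq(Xa, y, rcond=None)
        fitted = Xa @ beta
        return np.sum((fitted - y.mean()) ** 2)

    red_full = reduction_ss(X_full)                       # step 1: reduction
    resid_full = np.sum((y - y.mean()) ** 2) - red_full   # step 1: residual
    red_reduced = reduction_ss(X_reduced)                 # step 2
    gain = red_full - red_reduced                         # step 3

    df_gain = X_full.shape[1] - X_reduced.shape[1]
    df_resid = n - X_full.shape[1] - 1
    return (gain / df_gain) / (resid_full / df_resid)     # step 4
```

The returned ratio is compared with the tabulated F value for `df_gain` and `df_resid` degrees of freedom, exactly as in the overall test of the regression.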
7.17.2.2 Coefficient of Multiple Determination
As a measure of how well the regression fits the data, it is customary to compute the ratio of the reduction sum of squares to the total sum of squares. This ratio is symbolized by R² and is sometimes called the coefficient of determination:

R² = Reduction SS / Total SS
For the regression of Y on X1, X2, and X3,

R² = 5498.9389 / 5974.7143 = 0.92

The R² value is usually referred to by saying that a certain percentage (92 in this case) of the variation in Y is associated with the regression. The square root (R) of the ratio is the multiple correlation coefficient.
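The computation of R² and R from the example's sums of squares is a one-liner (values copied from the text):

```python
# Sums of squares from the worked example
reduction_ss = 5498.9389
total_ss = 5974.7143

r_squared = reduction_ss / total_ss   # coefficient of determination
multiple_r = r_squared ** 0.5         # multiple correlation coefficient

print(round(r_squared, 2))  # -> 0.92
```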
7.17.2.3 The c-Multipliers
Putting confidence limits on a multiple regression requires computation of the Gauss or c-multipliers. The c-multipliers are the elements of the inverse of the matrix of corrected sums of squares and products as they appear in the normal equations.
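The definition translates directly into matrix terms: center each independent variable on its mean, form the corrected sums of squares and products, and invert. A minimal sketch with an entirely hypothetical data matrix:

```python
import numpy as np

# Hypothetical data matrix (three independent variables, five observations);
# the numbers are illustrative only.
X = np.array([[1.0, 2.0, 0.5],
              [2.0, 1.0, 1.5],
              [3.0, 4.0, 2.5],
              [4.0, 3.0, 3.5],
              [5.0, 6.0, 4.0]])

Xc = X - X.mean(axis=0)   # deviations from the column means
sscp = Xc.T @ Xc          # corrected sums of squares and products
c = np.linalg.inv(sscp)   # the c-multipliers

# The c-multiplier matrix is symmetric, like the SSCP matrix it inverts.
print(np.allclose(c, c.T))  # -> True
```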
7.17.3 Curvilinear Regressions and Interactions

7.17.3.1 Curves
Many forms of curvilinear relationships can be fitted by the regression methods that have been
described in the previous sections. If the relationship between height and age is assumed to be
hyperbolic so