from which we would calculate the sums of squares for the interaction term (AB), given the sums of squares for the two main factors, A and B, as:

SS_{AB|A,B} = SS_{A+B+AB} - SS_{A+B}   (9.29)
The error, or residual, sums of squares is then estimated as:

SS_{error} = SS_{Total} - SS_{A+B+AB}   (9.30)
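As an illustration of Equations 9.29 and 9.30, the required sums of squares can be obtained by fitting the nested models directly. The sketch below uses Python's statsmodels package with a small, deliberately unbalanced synthetic data set; the data frame name df, the column names Y, A, and B, and the data themselves are assumptions made only for the example.

```python
# Sketch of Eqs. 9.29 and 9.30 via nested model fits (illustrative only).
# The data frame `df`, its columns Y, A, B, and the values are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
# A deliberately unbalanced two-factor layout (unequal cell sizes)
df = pd.DataFrame({
    "A": ["a1"] * 7 + ["a2"] * 4,
    "B": ["b1", "b2"] * 5 + ["b1"],
})
df["Y"] = (rng.normal(size=len(df))
           + 1.5 * (df["A"] == "a2")
           + 0.8 * (df["B"] == "b2"))

full = smf.ols("Y ~ C(A) + C(B) + C(A):C(B)", data=df).fit()  # A + B + AB
additive = smf.ols("Y ~ C(A) + C(B)", data=df).fit()          # A + B

ss_total = full.centered_tss                # SS_Total
ss_interaction = full.ess - additive.ess    # Eq. 9.29: SS_{AB|A,B}
ss_error = ss_total - full.ess              # Eq. 9.30 (equals full.ssr)
```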
When using Type I sums of squares, the order in which factors are entered into the analysis matters because the estimate for the sums of squares explained by a factor is conditional on the estimate for the sums of squares for the factor(s) already in the model. As a result, for factor A, there are two feasible estimates for its sums of squares: SS_A and SS_{A|B}. In a three-factor model we would have even more options: SS_A, SS_{A|B}, and SS_{A|B,C}. Because the factors are confounded due to the unbalanced design, SS_A would typically contain some contribution from factor B. That contribution could be positive if A and B produce similar changes in Y, or it could be negative if A and B produce contrasting changes in Y. The term SS_{B|A} is the contribution of factor B, given that we have removed the sums of squares due to A, so it is a contingent estimate of the effects of B. The interaction term may also be altered by the correlation between A and B.
Type I sums of squares are sometimes called the “improvement sums of squares” because they are determined by the improvement in fit of the model caused by adding each term to the model. A characteristic of Type I sums of squares is that the sums of squares for all the terms sum to the total sum of squares (SS_{Total}), so Type I sums of squares yield an additive partitioning of variance, unlike the other two approaches. When computing the F-ratio, the Type I sums of squares are substituted for the sums of squares computed for a balanced design.
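To see this order dependence concretely, the brief sketch below (reusing the hypothetical unbalanced data frame df from the sketch above) fits the same interaction model with the factors entered in different orders; the Type I rows for A and B differ between the two tables, although each table still sums to SS_Total.

```python
# Sketch of the order dependence of Type I (sequential) sums of squares.
# Reuses the hypothetical unbalanced data frame `df` defined above.
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

fit_ab = smf.ols("Y ~ C(A) + C(B) + C(A):C(B)", data=df).fit()  # A entered first
fit_ba = smf.ols("Y ~ C(B) + C(A) + C(B):C(A)", data=df).fit()  # B entered first

# Each term's row is conditioned on the terms entered before it, so the
# sums of squares reported for A and B differ between the two tables;
# the sum_sq column of each table still adds up to SS_Total.
print(anova_lm(fit_ab, typ=1))
print(anova_lm(fit_ba, typ=1))
```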
Type II Sums of Squares
This method for computing sums of squares involves computing the sums of squares for the model including the factor of interest and all other factors of the same order (e.g., all other main effects, or all other pairwise interactions, or all other three-way interaction terms), and then subtracting the sums of squares for the model lacking the factor of interest from that sum. So, for this two-factor case, to find the sums of squares for factor A, we would first compute the sums of squares for the model containing both A and B (SS_{A+B}) as:

Y = A + B + ε   (9.31)
And then compute the sums of squares for the model containing only B (SS_B):

Y = B + ε   (9.32)
And we would then calculate the sums of squares for A (SS_{A(II)}) as:

SS_{A(II)} = SS_{A+B} - SS_B   (9.33)
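The following sketch (again reusing the hypothetical data frame df from the earlier sketches) computes Equation 9.33 by subtracting the explained sums of squares of the two fitted models; statsmodels' Type II ANOVA table for the additive model is printed for comparison.

```python
# Sketch of Eq. 9.33: Type II sums of squares for factor A by model subtraction.
# Reuses the hypothetical data frame `df` defined above.
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

additive = smf.ols("Y ~ C(A) + C(B)", data=df).fit()  # model of Eq. 9.31: Y = A + B + error
b_only = smf.ols("Y ~ C(B)", data=df).fit()           # model of Eq. 9.32: Y = B + error

ss_a_type2 = additive.ess - b_only.ess                # Eq. 9.33: SS_{A+B} - SS_B
print(ss_a_type2)

# For comparison, the C(A) row of the Type II table for the additive model
# reports the same quantity in its sum_sq column.
print(anova_lm(additive, typ=2))
```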