A more detailed description of the construction of displays of this type, together with techniques for more global assessment of the evidence for differences between surfaces and methods for incorporating correlated data, is provided in Bowman ( ).
9.4 Additive Models
In order to be useful statistical tools, regression models need to be able to incorporate arbitrary numbers of covariates. In principle, the local fitting approach described above could be extended to any number of covariates. However, in practice, the performance of such simultaneous estimation deteriorates rapidly as the dimensionality of the problem increases. A more parsimonious and powerful approach is offered by additive models, developed by Friedman and Stuetzle ( ), Hastie and Tibshirani ( ) and many other authors. These allow each covariate to contribute to the model in a nonparametric manner but assume that the effects of these are additive, so that a model for data $(x_{1i}, \ldots, x_{pi}, y_i)$; $i = 1, \ldots, n$ is given by

$$ y_i = \alpha + m_1(x_{1i}) + \cdots + m_p(x_{pi}) + \varepsilon_i . $$
This extends the usual linear regression model by allowing the effects of the covariates to be nonparametric in shape. To ensure that the model is identifiable, the constraint that each component function $m_j$ averages to zero across the observed covariate values can be adopted.
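As a concrete illustration, the following sketch simulates data from a two-covariate additive model of this form; the particular component functions, sample size and noise level are illustrative choices made here, not values taken from the text.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# two covariates and two (hypothetical) smooth component functions
x1 = rng.uniform(0, 1, n)
x2 = rng.uniform(0, 1, n)
m1 = np.sin(2 * np.pi * x1)        # illustrative choice of m_1
m2 = (x2 - 0.5) ** 2               # illustrative choice of m_2

# centre each component over the observed covariate values,
# matching the identifiability constraint described above
m1 -= m1.mean()
m2 -= m2.mean()

alpha = 1.0
y = alpha + m1 + m2 + rng.normal(scale=0.2, size=n)
```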
Additive models can be fitted to observed data through the backfitting algorithm,
where the vectors of estimates $\hat{m}_j = \left( \hat{m}_j(x_{j1}), \ldots, \hat{m}_j(x_{jn}) \right)$ are updated from iteration $r$ to $r+1$ as

$$ \hat{m}_j^{(r+1)} = S_j \left\{ y - \alpha - \sum_{k<j} \hat{m}_k^{(r+1)} - \sum_{k>j} \hat{m}_k^{(r)} \right\} . \qquad ( . ) $$
This applies a smoothing operation, expressed in the smoothing matrix $S_j$ for the $j$th covariate, to the partial residuals constructed by subtracting the current estimates of all the other model components from the data vector $y$. The estimate of the intercept term $\alpha$ can be held fixed at the sample mean $\bar{y}$ throughout. The identifiability constraint on each component function can be incorporated by adjusting the vectors $\hat{m}_j$ to have mean zero after each iteration.
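A minimal sketch of this backfitting iteration is given below; it uses a simple Nadaraya–Watson (local mean) smoother to play the role of $S_j$, and the choice of smoother, bandwidth and convergence tolerance are illustrative assumptions rather than prescriptions from the text.

```python
import numpy as np

def local_mean_smoother(x, h):
    """Smoothing matrix for a Nadaraya-Watson (local mean) smoother,
    evaluated at the observed covariate values x with bandwidth h."""
    d = (x[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * d ** 2)                 # Gaussian kernel weights
    return w / w.sum(axis=1, keepdims=True)   # rows sum to one

def backfit(X, y, h=0.1, max_iter=50, tol=1e-6):
    """Fit an additive model by backfitting.

    X is an (n, p) array of covariates and y an (n,) response vector.
    Returns the intercept estimate and an (n, p) array whose j-th column
    holds the estimate of m_j at the observed values of covariate j."""
    n, p = X.shape
    alpha = y.mean()                          # intercept held fixed at the sample mean
    S = [local_mean_smoother(X[:, j], h) for j in range(p)]
    m = np.zeros((n, p))                      # current component estimates

    for _ in range(max_iter):
        m_old = m.copy()
        for j in range(p):
            # partial residuals: remove the intercept and all other components
            partial = y - alpha - m[:, np.arange(p) != j].sum(axis=1)
            m[:, j] = S[j] @ partial          # smooth against covariate j
            m[:, j] -= m[:, j].mean()         # re-centre for identifiability
        if np.max(np.abs(m - m_old)) < tol:   # stop when the updates stabilise
            break
    return alpha, m
```

With the simulated data above, `alpha_hat, m_hat = backfit(np.column_stack([x1, x2]), y)` returns component estimates that can be compared with the true m1 and m2 at the observed design points.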
The backfitting algorithm described above is not the only way in which an additive model can be fitted to observed data. In particular, Mammen et al. ( ) proposed a smooth backfitting algorithm which has a number of attractive properties. Nielsen and Sperlich ( ) give a clear exposition of this approach, with practical proposals for bandwidth selection.
Hastie and Tibshirani ( ) discuss how the standard errors of the estimates can also be constructed. Computationally, the end result of the iterative scheme ( . ) can be expressed in matrix form as $\hat{y} = \sum_{j=1}^{p} P_j\, y = Py = (P_1 + \cdots + P_p)\, y$, where $P$ is filled with
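Since every step of the update ( . ) is linear in $y$, the fitted values depend linearly on the data, which is what makes a matrix representation of this kind possible. As a rough numerical illustration (an assumption-laden sketch, not a construction given in the text), the overall matrix can be recovered by applying the backfitting sketch above to the columns of the identity matrix, with a fixed number of iterations so that the same linear map is applied throughout.

```python
import numpy as np

def fitted_values(X, y, h=0.1, n_iter=50):
    # run the backfit() sketch above with a fixed iteration count (tol=0)
    # so that the fit is exactly the same linear function of y each time
    alpha, m = backfit(X, y, h=h, max_iter=n_iter, tol=0.0)
    return alpha + m.sum(axis=1)

def overall_smoother_matrix(X, h=0.1, n_iter=50):
    """Numerically assemble the matrix P with y_hat = P @ y by fitting
    the additive model to each unit vector in turn."""
    n = X.shape[0]
    I = np.eye(n)
    return np.column_stack([fitted_values(X, I[:, i], h, n_iter) for i in range(n)])
```

As a check, `np.allclose(overall_smoother_matrix(X) @ y, fitted_values(X, y))` should hold, confirming numerically that the fitted values are a linear transformation of the data.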