Figure …: Data points and regression lines in the leaf nodes of the Boston data tree. The blue and red colors correspond to those in Fig. …
7.3 Extension to GUIDE
The basic GUIDE procedure for fitting piecewise constant and piecewise multiple linear models is described in Loh (…). We present here an extension to fit piecewise simple linear models. The same ideas apply to Poisson regression and to piecewise linear two-predictor models, where the two predictors are chosen at each node via stepwise regression, subject to the standard F-to-enter and F-to-remove threshold values of … (Miller, …). Our extension comprises four algorithms, starting with Algorithm 1.
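To make the stepwise step concrete, the following is a minimal Python sketch of forward selection of at most two predictors using an F-to-enter test. It is not GUIDE's implementation: the deletion (F-to-remove) pass is omitted, the threshold F_ENTER = 4.0 is an assumed stand-in for the value cited above, and the names rss and forward_two_predictors are invented for illustration.

import numpy as np

F_ENTER = 4.0  # assumed threshold; the value cited in the text is not legible here

def rss(X, y):
    # Residual sum of squares of a least-squares fit of y on X plus an intercept.
    A = np.ones((len(y), 1)) if X is None else np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def forward_two_predictors(X, y):
    # Greedily add up to two columns of X while the F-to-enter statistic
    # of the best remaining candidate exceeds F_ENTER.
    n, p = X.shape
    selected = []
    current = rss(None, y)
    while len(selected) < 2:
        best = None
        for j in (j for j in range(p) if j not in selected):
            new = rss(X[:, selected + [j]], y)
            df = n - (len(selected) + 2)       # residual df: n - (intercept + fitted terms)
            f_stat = (current - new) / (new / df)
            if best is None or f_stat > best[1]:
                best = (j, f_stat, new)
        if best is None or best[1] < F_ENTER:
            break                              # no candidate clears the F-to-enter bar
        selected.append(best[0])
        current = best[2]
    return selected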
Algorithm 1: Tree construction. These steps are applied recursively to each node of the tree, starting with the root node that holds the whole dataset (a code sketch of the first three steps follows the algorithm).
1. Let t denote the current node. Fit a simple linear regression to each predictor variable in the data in t. Choose the predictor yielding the smallest residual mean squared error and record its model R².
2. Stop if R² > 0.… or if the number of observations is less than n_0, where n_0 is a small user-specified constant. Otherwise, go to the next step.
3. For each observation associated with a positive residual, define the class variable Z = 1; else define Z = 0.
4. Use Algorithm 2 to find a variable X to split t into left and right subnodes t_L and t_R.
   a) If X is ordered, search for a split of the form X ≤ x. For every x such that t_L and t_R contain at least n_0 observations each, find S, the smallest total sum …
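To illustrate steps 1-3 above, here is a minimal Python sketch that regresses the response on each predictor separately, keeps the predictor with the smallest residual mean squared error together with its R², and codes the class variable Z from the residual signs. The function names best_simple_regressor and residual_classes are invented for illustration, and the sketch covers neither the stopping test of step 2 nor the split-point search of step 4(a).

import numpy as np

def best_simple_regressor(X, y):
    # Step 1: regress y on each predictor alone (with an intercept), keep the
    # predictor with the smallest residual mean squared error, and return its
    # column index, model R-squared, and residuals.
    n, p = X.shape
    tss = float(((y - y.mean()) ** 2).sum())
    best = None
    for j in range(p):
        A = np.column_stack([np.ones(n), X[:, j]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        mse = float(resid @ resid) / (n - 2)   # residual mean squared error
        if best is None or mse < best[0]:
            best = (mse, j, 1.0 - float(resid @ resid) / tss, resid)
    _, j, r2, resid = best
    return j, r2, resid

def residual_classes(resid):
    # Step 3: class variable Z = 1 for a positive residual, else Z = 0
    # (coding follows step 3 above; the printed values are partly lost).
    return (resid > 0).astype(int)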