a grid (e.g., for images), the initial bandwidth h^(0) is chosen as the distance between neighboring pixels. The bandwidth is increased at each iteration by a default factor c_h = 1.25^(1/d).
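The geometric bandwidth schedule described above can be sketched as follows. The initial bandwidth h0 = 1 (unit grid spacing) and the growth factor c_h = 1.25^(1/d) are assumptions chosen for illustration, matching common defaults for this class of algorithms.

```python
# Sketch of the iterative bandwidth schedule: h^(k) = h0 * c_h**k,
# with c_h = 1.25**(1/d).  h0 = 1 (the distance between neighboring
# grid points) is an assumed default.

def bandwidth_sequence(k_max, d=1, h0=1.0):
    """Return the bandwidths h^(0), ..., h^(k_max) for a d-dimensional grid."""
    c_h = 1.25 ** (1.0 / d)
    return [h0 * c_h ** k for k in range(k_max + 1)]
```

In d = 1 each bandwidth is 25% larger than the previous one; in higher dimensions the exponent 1/d keeps the growth of the *number* of points inside the bandwidth roughly constant per iteration.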
8.3 An Illustrative Univariate Example
We use a simple example to illustrate the behavior of the algorithm. The data in the upper left of Fig. 8.x follow the univariate regression model

Y_i = θ(X_i) + ε_i .
The unknown parameter (i.e., the regression function θ) is piecewise constant, the errors ε_i are i.i.d. N(0, σ²), and the observed X_i form a univariate grid. In this situation the statistical penalty takes the form
s_ij^(k) = N_i^(k−1) ( θ̂_i^(k−1) − θ̂_j^(k−1) )² / (λ σ²) ,
where σ² denotes the variance of the errors. A robust estimate of the variance is obtained from the data using the interquartile range (IQR) of the successive differences,

σ̂² = ( IQR(Y_{i+1} − Y_i ; i = 1, …, n−1) / 1.908 )² ,

and this is used as a plug-in for σ². The propagation condition suggests a value for λ. We employ a value of τ = ∞, disabling the adaptive control step.
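The plug-in variance estimate can be sketched directly. The function name `robust_sigma2` is ours; the constant 1.908 = 1.349 · √2 follows because the differences Y_{i+1} − Y_i have variance 2σ² and the IQR of a standard normal variable is about 1.349.

```python
import numpy as np

def robust_sigma2(y):
    """Robust plug-in variance estimate from the IQR of successive differences.

    For i.i.d. N(0, sigma^2) errors, Y_{i+1} - Y_i ~ N(0, 2*sigma^2), and the
    IQR of N(0, 1) is ~1.349, hence the normalizing constant 1.349*sqrt(2) ~ 1.908.
    Differencing removes the (piecewise constant) signal almost everywhere.
    """
    d = np.diff(y)                           # Y_{i+1} - Y_i, i = 1, ..., n-1
    q75, q25 = np.percentile(d, [75, 25])
    return ((q75 - q25) / 1.908) ** 2
```

Because the estimate uses quantiles of the differences, the few differences that straddle a jump of θ have essentially no influence on it.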
We have four regions, differing in size and contrast, where the function θ is constant. The regression function is displayed as a black line in the upper right of Fig. 8.x.
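Data of the kind just described can be simulated as follows. The region sizes, levels, and noise standard deviation below are assumptions for illustration; the values used for the actual figure are not given here.

```python
import numpy as np

def simulate_example(n=250, sigma=0.5, seed=0):
    """Hypothetical version of the example: four constant regions of
    differing size and contrast on a univariate grid, plus i.i.d. Gaussian
    noise.  All numeric choices here are assumed, not taken from the text."""
    x = np.arange(n)                               # univariate grid X_i
    theta = np.zeros(n)                            # piecewise constant theta
    theta[40:80] = 2.0                             # wide region, high contrast
    theta[120:140] = 1.0                           # narrow region, low contrast
    theta[180:240] = -1.5                          # fourth region
    rng = np.random.default_rng(seed)
    y = theta + rng.normal(0.0, sigma, n)          # Y_i = theta(X_i) + eps_i
    return x, theta, y
```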
The lower part of Fig. 8.x illustrates the evolution of the weights w_ij^(k) as the number of iterations increases. The horizontal and vertical axes correspond to the indices i and j, respectively. The upper row provides the location kernel values K_loc(l_ij^(k)) for four iterations k with the corresponding bandwidths h^(k). The central row shows the corresponding values K_stat(s_ij^(k)). The grayscale ranges from black for 0 to white for 1. The weights w_ij^(k) (lower row) used in the algorithm are the products of both terms.
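One iteration of this scheme, with the weights formed as the product of the two kernel terms, can be sketched as follows. The specific kernel shapes (a triangular K_loc and K_stat(s) = exp(−s)) are our assumptions for illustration; the chapter's actual kernel choices may differ.

```python
import numpy as np

def aws_iteration(x, y, theta, n_i, h, lam, sigma2):
    """One adaptive-weights update: w_ij = K_loc(l_ij) * K_stat(s_ij).

    Assumed kernels: K_loc(l) = max(1 - l, 0) (triangular) and
    K_stat(s) = exp(-s).  theta and n_i come from the previous iteration.
    """
    l = np.abs(x[:, None] - x[None, :]) / h           # location term l_ij
    k_loc = np.clip(1.0 - l, 0.0, None)               # location kernel values
    s = n_i[:, None] * (theta[:, None] - theta[None, :]) ** 2 / (lam * sigma2)
    k_stat = np.exp(-s)                               # statistical kernel values
    w = k_loc * k_stat                                # product of both terms
    n_new = w.sum(axis=1)                             # N_i for the next iteration
    theta_new = (w * y[None, :]).sum(axis=1) / n_new  # local weighted means
    return theta_new, n_new
```

Within a homogeneous region the estimates agree, so s_ij ≈ 0 and the weights are governed by K_loc alone; across a jump s_ij is large and K_stat drives the weights to zero, which is exactly the separation behavior the figure illustrates.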
The left column corresponds to the initialization step. Here the location penalty effectively restricts the local model in X_i to the point X_i itself. If computed, the stochastic penalty would contain some weak information about the structure of the regression function. When we reach a later step k, the location penalty allows for positive weights for a growing number of observations, and therefore less variable estimates. At this stage the test based on the statistical penalty shows a significant difference between estimates at points within the third homogeneous interval and estimates at locations outside this interval. This is reflected in the statistical penalty and therefore in the weights. In a subsequent step the second interval is also clearly identified. The last column, referring to the final iteration and bandwidth, shows the final situation, where the statistical penalty