The kernel function $K_{\mathrm{stat}}$ is monotonically nonincreasing over the interval $[0, 1)$. The bandwidth $h$ is increased by a constant factor with each iteration $k$. The test statistic for ( . ) is

$$ T_{ij}^{(k)} = N_i^{(k-1)}\, K\big(\hat\theta_i^{(k-1)}, \hat\theta_j^{(k-1)}\big) \qquad ( . ) $$

This term effectively measures the statistical difference between the current estimates in $X_i$ and $X_j$. In ( . ) the term $N_i^{(k-1)}$, with $N_i = \sum_j w_{ij}$, is used to specify the penalty $s_{ij}^{(k)}$. $K(\theta, \theta')$ denotes the Kullback-Leibler distance of the probability measures $P_\theta$ and $P_{\theta'}$.
Additionally, we can introduce a kind of memory into the procedure, which ensures that the quality of estimation will not be lost over the iterations. This basically means that we compare a new estimate $\tilde\theta_i^{(k)} = \tilde\theta^{(k)}(X_i)$ with the previous estimate $\hat\theta_i^{(k-1)}$ in order to define a memory parameter $\eta_i = K_{\mathrm{mem}}\big(m_i^{(k)}\big)$ using a kernel function $K_{\mathrm{mem}}$ and
$$ m_i^{(k)} = \tau^{-1} \sum_j K_{\mathrm{loc}}\big(l_{ij}^{(k)}\big)\, K\big(\tilde\theta_i^{(k)}, \hat\theta_i^{(k-1)}\big) \qquad ( . ) $$
This leads to an estimate

$$ \hat\theta_i^{(k)} = \eta_i\, \tilde\theta_i^{(k)} + (1 - \eta_i)\, \hat\theta_i^{(k-1)} \qquad ( . ) $$
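The convex combination above can be sketched as follows, assuming the memory penalty $m$ is already computed; the exponential form of $K_{\mathrm{mem}}$ is an illustrative assumption:

```python
import numpy as np

def memory_update(theta_new, theta_prev, m, K_mem=lambda x: np.exp(-x)):
    # Convex combination of the new and the previous estimate.
    # eta = K_mem(m) is the memory parameter; K_mem here is an
    # illustrative exponential kernel, not the book's choice.
    eta = K_mem(m)
    return eta * theta_new + (1.0 - eta) * theta_prev
```

For $m = 0$ we get $\eta_i = 1$ and the new estimate is kept unchanged; for large $m$ (a big discrepancy between $\tilde\theta_i^{(k)}$ and $\hat\theta_i^{(k-1)}$) the procedure falls back toward the previous estimate.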
8.2.1 Adaptive Weights Smoothing
We now formally describe the resulting algorithm.
Initialization: Set the initial bandwidth $h^{(0)}$, $k = 0$, and compute, for every $i$, the statistics

$$ N_i^{(0)} = \sum_j w_{ij}^{(0)}, \qquad S_i^{(0)} = \sum_j w_{ij}^{(0)} Y_j \qquad ( . ) $$

and the estimates

$$ \hat\theta_i^{(0)} = S_i^{(0)} \big/ N_i^{(0)} \qquad ( . ) $$

using $w_{ij}^{(0)} = K_{\mathrm{loc}}\big(l_{ij}^{(0)}\big)$. Set $k = 1$ and $h^{(1)} = c\, h^{(0)}$.
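The initialization step can be sketched as follows, assuming the location penalty is the scaled distance $|X_i - X_j|/h^{(0)}$ and taking a triangular kernel for $K_{\mathrm{loc}}$ (both illustrative assumptions):

```python
import numpy as np

def initialize(X, Y, h0, K_loc=lambda x: np.maximum(1.0 - x, 0.0)):
    # Nonadaptive initialization: weights depend on location only.
    # X: (n,) design points, Y: (n,) observations, h0: initial bandwidth.
    # K_loc is an illustrative triangular kernel.
    l = np.abs(X[:, None] - X[None, :]) / h0   # location penalties l_ij^(0)
    w = K_loc(l)                               # w_ij^(0) = K_loc(l_ij^(0))
    N = w.sum(axis=1)                          # N_i^(0) = sum_j w_ij^(0)
    S = w @ Y                                  # S_i^(0) = sum_j w_ij^(0) Y_j
    theta = S / N                              # theta_i^(0) = S_i^(0) / N_i^(0)
    return w, N, S, theta
```

At this stage no statistical penalty enters: the initial estimates are ordinary kernel-weighted local averages.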
Adaptation: For every pair $i, j$, compute the penalties

$$ l_{ij}^{(k)} = l\big(h^{(k)}, X_i, X_j\big) \qquad ( . ) $$

$$ s_{ij}^{(k)} = \lambda^{-1} T_{ij}^{(k)} = \lambda^{-1} N_i^{(k-1)}\, K\big(\hat\theta_i^{(k-1)}, \hat\theta_j^{(k-1)}\big) \qquad ( . ) $$
Now compute the weights $w_{ij}^{(k)}$ as

$$ w_{ij}^{(k)} = K_{\mathrm{loc}}\big(l_{ij}^{(k)}\big)\, K_{\mathrm{stat}}\big(s_{ij}^{(k)}\big) \qquad ( . ) $$

and specify the local model by $W_i^{(k)} = \big(w_{i1}^{(k)}, \ldots, w_{in}^{(k)}\big)$.
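One adaptation step can be sketched as follows, for a Gaussian local model with unit variance, so that $K(\theta, \theta') = (\theta - \theta')^2/2$; the triangular kernels and the distance form of the location penalty are illustrative assumptions:

```python
import numpy as np

def aws_step(X, Y, theta_prev, N_prev, h, lam,
             K_loc=lambda x: np.maximum(1.0 - x, 0.0),
             K_stat=lambda x: np.maximum(1.0 - x, 0.0)):
    # One adaptation step. lam is the adaptation parameter lambda;
    # theta_prev, N_prev are the estimates and local sample sizes
    # from iteration k-1. Kernel choices are illustrative.
    l = np.abs(X[:, None] - X[None, :]) / h                      # l_ij^(k)
    kl = (theta_prev[:, None] - theta_prev[None, :]) ** 2 / 2.0  # K(theta_i, theta_j)
    s = N_prev[:, None] * kl / lam                               # s_ij^(k) = T_ij^(k)/lambda
    w = K_loc(l) * K_stat(s)                                     # w_ij^(k)
    N = w.sum(axis=1)                                            # N_i^(k)
    theta = (w @ Y) / N                                          # new estimates
    return w, N, theta
```

When neighboring estimates differ strongly, $K_{\mathrm{stat}}$ drives the corresponding weights to zero, so the procedure smooths within homogeneous regions but not across discontinuities.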