We choose $(h, \lambda)$ by minimizing the cross-validation function given by

$$CV(h, \lambda) = n^{-1}\sum_{i=1}^{n}\bigl(Y_i - \hat{g}_{-i}(X_i)\bigr)^{2}\,\omega(X_i), \qquad (6)$$
where

$$\hat{g}_{-i}(X_i) = \frac{\sum_{j \ne i}^{n} Y_j\, K_c(X_i, X_j)\, K_d(X_i, X_j)}{\sum_{j \ne i}^{n} K_c(X_i, X_j)\, K_d(X_i, X_j)} = \frac{\sum_{j \ne i}^{n} Y_j\, K(X_i, X_j)}{\sum_{j \ne i}^{n} K(X_i, X_j)} \qquad (7)$$

is the leave-one-out kernel estimator of $E(Y_i \mid X_i)$, and $\omega(X_i)$ is a weight function which serves to avoid difficulties caused by dividing by zero or by the slower convergence rate arising when $X_i$ lies near the boundary of the support of $X$.
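For concreteness, a minimal Python/NumPy sketch of (6) and (7) follows; the function names, the uniform weight $\omega(X_i) \equiv 1$, and the use of the Gaussian continuous kernel and the $\lambda^{\mathbf{1}(\cdot \ne \cdot)}$ discrete kernel that appear later in (8) and (9) are assumptions made for illustration, not part of the original derivation.

import numpy as np

def mixed_kernel_weights(Xc, Xd, h, lam):
    # Product kernel K(X_i, X_j) = K_c(X_i, X_j) * K_d(X_i, X_j) for every pair (i, j).
    # Xc: (n, p) continuous regressors, Xd: (n, q) discrete regressors,
    # h:  (p,) bandwidths,              lam: (q,) smoothing parameters in [0, 1].
    dc = (Xc[:, None, :] - Xc[None, :, :]) / h            # (n, n, p) scaled differences
    Kc = np.exp(-0.5 * dc ** 2).prod(axis=2)              # product Gaussian kernel
    mismatch = Xd[:, None, :] != Xd[None, :, :]           # (n, n, q) indicator 1(X_is != X_js)
    Kd = np.where(mismatch, lam, 1.0).prod(axis=2)        # product of lam_s ** 1(X_is != X_js)
    return Kc * Kd

def cv_score(h, lam, Y, Xc, Xd, weight=None):
    # CV(h, lam) = n^{-1} sum_i (Y_i - g_{-i}(X_i))^2 w(X_i), as in eq. (6).
    K = mixed_kernel_weights(Xc, Xd, np.asarray(h, float), np.asarray(lam, float))
    np.fill_diagonal(K, 0.0)                              # leave-one-out: drop the j = i term
    denom = K.sum(axis=1)
    g_loo = (K @ Y) / np.where(denom > 0, denom, np.nan)  # eq. (7)
    w = np.ones_like(Y, dtype=float) if weight is None else weight
    return float(np.nanmean(w * (Y - g_loo) ** 2))

Minimizing cv_score over $(h, \lambda)$, for example with scipy.optimize.minimize, yields the data-driven smoothing parameters whose limiting behaviour is described next.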
When the sample size is large enough, Hall et al. (2007) (Theorem 2.11) proved the following results:
$$\hat{h}_s \xrightarrow{\;p\;} 0 \ \text{ for } 1 \le s \le p_1, \qquad \hat{h}_s \xrightarrow{\;p\;} +\infty \ \text{ for } p_1 + 1 \le s \le p,$$

$$\hat{\lambda}_s \xrightarrow{\;p\;} 0 \ \text{ for } 1 \le s \le q_1, \qquad \hat{\lambda}_s \xrightarrow{\;p\;} 1 \ \text{ for } q_1 + 1 \le s \le q.$$

That is, the cross-validated smoothing parameters of the relevant regressors (the first $p_1$ continuous and the first $q_1$ discrete components) shrink toward zero, while those of the irrelevant regressors are driven to their upper extremes, so that these regressors are smoothed out of the estimate.
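In practice the theorem is read as a screening rule: after cross-validation, a bandwidth $\hat{h}_s$ far beyond the spread of $X_s$, or a $\hat{\lambda}_s$ close to its upper bound 1, marks the regressor as irrelevant. The short sketch below illustrates such a check; the threshold values (twice the sample standard deviation, 0.95) are illustrative assumptions only, not taken from the source.

import numpy as np

def flag_irrelevant(h_hat, x_std, lam_hat, h_ratio=2.0, lam_cut=0.95):
    # h_hat:   (p,) cross-validated bandwidths of the continuous regressors
    # x_std:   (p,) sample standard deviations of those regressors
    # lam_hat: (q,) cross-validated parameters of the discrete regressors
    # A bandwidth blown up past the data spread (h_s -> +infinity in the theorem)
    # or a lambda_s near its upper bound 1 means the regressor is smoothed out.
    irrelevant_continuous = np.asarray(h_hat) > h_ratio * np.asarray(x_std)
    irrelevant_discrete = np.asarray(lam_hat) > lam_cut
    return irrelevant_continuous, irrelevant_discrete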
The result of Hall et al. (2007) provides a sound theoretical basis. This paper attempts to apply the theory to identify the decision variables of tax revenue. Compared with linear regression and stepwise regression, nonparametric kernel estimation gives a better model fit and, more importantly, it can automatically remove the irrelevant regressors from the model in a data-driven way, thereby explaining the problems noted above that the parametric methods cannot explain. Here we identify the decision variables of tax revenue with the nonparametric method, in order to provide a basis for a decision support system for tax revenue.
4 Empirical Results of the Nonparametric Selection
We use the Gaussian kernel function for the continuous variables:

$$K(x, X_i, h) = \frac{1}{\sqrt{2\pi}\,h}\exp\left\{-\frac{1}{2}\left(\frac{X_i - x}{h}\right)^{2}\right\}. \qquad (8)$$
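A direct transcription of (8) as a hypothetical helper function is given below; note that the constant factor $1/(\sqrt{2\pi}\,h)$ cancels between numerator and denominator of the ratio estimator in (9) that follows, so only the exponential part affects the fit.

import numpy as np

def gauss_kernel(x, Xi, h):
    # Eq. (8): K(x, X_i, h) = exp{-((X_i - x)/h)^2 / 2} / (sqrt(2*pi) * h)
    u = (Xi - x) / h
    return np.exp(-0.5 * u ** 2) / (np.sqrt(2.0 * np.pi) * h)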
Replacing $l(X_{is}, x_s, \lambda_s)$ and $K(x, X_i, h)$ by their specific expressions, model (3.1) becomes
$$\hat{Y} = \frac{\sum_{i=1}^{110} Y_i \prod_{s=1}^{9} \exp\left\{-\frac{1}{2}\left(\frac{X_{is} - x_s}{h_s}\right)^{2}\right\} \lambda_{10}^{\mathbf{1}(X_{i,10} \ne x_{10})}}{\sum_{i=1}^{110} \prod_{s=1}^{9} \exp\left\{-\frac{1}{2}\left(\frac{X_{is} - x_s}{h_s}\right)^{2}\right\} \lambda_{10}^{\mathbf{1}(X_{i,10} \ne x_{10})}}. \qquad (9)$$
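As a sketch of how the fitted model (9) would be evaluated at a query point $x$, assume the 110 observations are stored as arrays of nine continuous regressors and one discrete regressor, and that cross-validated values of $h_1, \dots, h_9$ and $\lambda_{10}$ are already available; the array and function names are placeholders, not from the source.

import numpy as np

def predict_tax_revenue(x_cont, x_disc, Y, Xc, Xd, h, lam10):
    # Evaluate the local-constant estimator of eq. (9) at one query point.
    # Y:      (110,)   observed tax revenues
    # Xc:     (110, 9) continuous regressors, Xd: (110,) discrete regressor
    # h:      (9,)     cross-validated bandwidths, lam10: scalar in [0, 1]
    # x_cont: (9,)     continuous coordinates of the query point, x_disc: its discrete value
    cont = np.exp(-0.5 * ((Xc - x_cont) / h) ** 2).prod(axis=1)  # product Gaussian kernel, (110,)
    disc = np.where(Xd != x_disc, lam10, 1.0)                    # lam10 ** 1(X_{i,10} != x_10)
    w = cont * disc
    return (w @ Y) / w.sum()

The prediction is thus a weighted average of the 110 observed tax revenues; a regressor whose smoothing parameter has been pushed to its upper extreme contributes an almost constant factor to every weight and is effectively ignored.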
 