$C(x, y) = (x - y)^2$ or $C(x, y) = |x - y|$, and

$$\Phi(f) = \|Pf\|^2$$

with $Pf = \nabla f$ or $Pf = \Delta f$, with $\nabla$ denoting the gradient and $\Delta$ the Laplace operator. The value $\lambda$ can be chosen according to cross-validation techniques or to some other principle. Note that we find exactly this type of formulation in the case $d = 2, 3$ in many scattered data approximation methods (see [ADT95, HL92]), where the regularization term is usually physically motivated.

Now, we assume that we have a basis of $V$ given by $\{\varphi_j(x)\}_{j=1}^{\infty}$. Let also the constant function be in the span of the functions $\varphi_j$. We then can express a function $f \in V$ as
$$f(x) = \sum_{j=1}^{\infty} \alpha_j \varphi_j(x)$$

with associated degrees of freedom $\alpha_j$. In the case of a regularization term of the type
$$\Phi(f) = \sum_{j=1}^{\infty} \frac{\alpha_j^2}{\lambda_j},$$

where $\{\lambda_j\}_{j=1}^{\infty}$ is a decreasing positive sequence, it is easy to show that, independent of the function $C$, the solution of the variational problem (7.1) always has the form
$$f(x) = \sum_{j=1}^{M} \alpha_j K(x; x_j).$$
Here, $K$ is the symmetric kernel function

$$K(x; y) = \sum_{j=1}^{\infty} \lambda_j \varphi_j(x) \varphi_j(y),$$
which can be interpreted as the kernel of a
Reproducing Kernel Hilbert Space
(RKHS). In other words, if certain functions
$K(x, x_j)$ are used in an approximation scheme which are centered in the location of the data points $x_j$, then the approximation solution is a finite series and involves only $M$ terms. Many approximation
schemes like radial basis functions, additive models, several types of neural networks, and support vector machines (SVMs) can be derived by a specific choice of the regularization operator (see [EPP00, GJP93, GJP95]).
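The finite kernel expansion $f(x) = \sum_{j=1}^{M} \alpha_j K(x; x_j)$ can be illustrated numerically. The sketch below is a minimal example under stated assumptions, not the text's own method: the Gaussian kernel, its width, the sample data, and the candidate $\lambda$ values are all choices made here for illustration. It fits the expansion to $M$ scattered data points by solving the regularized linear system $(K + \lambda M I)\alpha = y$, and picks $\lambda$ on a held-out set as a simple stand-in for the cross-validation techniques mentioned above.

```python
import numpy as np

def gaussian_kernel(x, y, width=0.1):
    """Symmetric kernel K(x, y); the Gaussian choice and width are assumptions."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2.0 * width ** 2))

def fit_expansion(x, y, lam):
    """Solve (K + lam * M * I) alpha = y for the coefficients alpha_j."""
    M = len(x)
    K = gaussian_kernel(x, x)
    return np.linalg.solve(K + lam * M * np.eye(M), y)

def evaluate(x_train, alpha, x_new):
    """f(x) = sum_j alpha_j K(x, x_j): a finite series with only M terms."""
    return gaussian_kernel(x_new, x_train) @ alpha

rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0.0, 1.0, 20))
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(20)
x_val = rng.uniform(0.0, 1.0, 50)
y_val = np.sin(2 * np.pi * x_val)

# Hold-out selection of lambda, a simple stand-in for cross-validation.
errors = {}
for lam in (1e-6, 1e-4, 1e-2, 1.0):
    alpha = fit_expansion(x_train, y_train, lam)
    errors[lam] = np.mean((evaluate(x_train, alpha, x_val) - y_val) ** 2)
best_lam = min(errors, key=errors.get)
print("chosen lambda:", best_lam, "validation MSE:", errors[best_lam])
```

Note that the fitted function is evaluated through the $M$ kernel translates $K(\cdot, x_j)$ alone, mirroring the representer form above; only the choice of kernel distinguishes, e.g., a radial-basis-function scheme from other regularization-derived methods.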