Digital Signal Processing Reference
Δw_i = ε e^(−k_i(x, w_i)/λ) (x − w_i),   i = 1, ···, N,   (6.19)

where i = 1, ···, N. The parameters have the time dependencies
λ(t) = λ_i (λ_f/λ_i)^(t/t_max)   and   ε(t) = ε_i (ε_f/ε_i)^(t/t_max).
Increment the time parameter t by 1.

5. Continuation: Go to step 2 until the maximum iteration number t_max is reached.
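The annealed update rule (6.19) with the decay schedules above can be sketched as follows. This is a minimal illustration, not the book's implementation: the rank-based choice of k_i, the random initialization, and all parameter values are assumptions made for the example.

```python
import numpy as np

def annealed_update(X, N=4, t_max=1000,
                    eps_i=0.5, eps_f=0.01, lam_i=2.0, lam_f=0.1):
    """Sketch of the update Δw_i = ε e^(−k_i/λ)(x − w_i), Eq. (6.19).

    Assumption: k_i(x, w_i) is the distance rank of w_i for the current
    input x. ε(t) and λ(t) decay as ε_i (ε_f/ε_i)^(t/t_max) and
    λ_i (λ_f/λ_i)^(t/t_max), as in the schedules above.
    """
    rng = np.random.default_rng(0)
    # Initialize the N weight vectors from randomly chosen data points.
    W = X[rng.choice(len(X), N, replace=False)].astype(float)
    for t in range(t_max):
        lam = lam_i * (lam_f / lam_i) ** (t / t_max)  # λ(t)
        eps = eps_i * (eps_f / eps_i) ** (t / t_max)  # ε(t)
        x = X[rng.integers(len(X))]                   # draw a training vector
        # Rank 0 = closest weight vector, rank N-1 = farthest.
        ranks = np.argsort(np.argsort(np.linalg.norm(W - x, axis=1)))
        # Apply Eq. (6.19) to every weight vector at once.
        W += eps * np.exp(-ranks / lam)[:, None] * (x - W)
    return W
```

With this schedule the step size and neighborhood range both shrink toward ε_f and λ_f as t approaches t_max, so early updates move many weight vectors and late updates fine-tune only the winner.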
6.4 Radial-Basis Neural Networks (RBNN)
Radial-basis neural networks
implement a hybrid learning mechanism.
They are feedforward neural networks with only one hidden layer; their
neurons in the hidden layer are locally tuned; and their responses to an
input vector are the outputs of radial-basis functions. The radial-basis
functions process the distance between the input vector (activation)
and its center (location). The hybrid learning mechanism combines an
unsupervised adaptation of the radial-basis functions' parameters with
a supervised adaptation of the output weights using a gradient-descent
method.
The design of a neural network based on radial-basis functions is
equivalent to modeling nonlinear relationships and solving an interpo-
lation problem in a high-dimensional space. Thus, learning is equivalent
to determining an interpolating surface that provides the best match to
the training data. To be specific, let us consider a system with
n inputs and m outputs, and let {x_1, ···, x_n} be an input vector and
{y_1, ···, y_m} the corresponding output vector describing the system's
answer to that specific input. During the training, the system learns
the input and output data distribution, and when this is completed, it
is able to find the correct output for any input. Learning can be
described as finding the “best” approximation function f(x_1, ···, x_n)
of the actual input-output mapping function [70, 208].
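The interpolation view can be made concrete with a small sketch. Assuming Gaussian radial-basis functions and one center per training point (both choices are illustrative, not prescribed by the text), the output weights follow from a single linear solve, and the resulting surface passes through every training point exactly.

```python
import numpy as np

def rbf_interpolate(X, y, sigma=1.0):
    """Exact RBF interpolation sketch.

    Assumptions: centers are the training inputs themselves, and the
    basis is Gaussian, φ(r) = exp(-r² / (2σ²)). The weights w solve
    Φ w = y, so f matches every training sample exactly.
    """
    # Pairwise squared distances between all training inputs.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * sigma**2))   # interpolation matrix
    w = np.linalg.solve(Phi, y)          # output-layer weights

    def f(x):
        """Evaluate the interpolating surface at a new input x."""
        r2 = ((X - x) ** 2).sum(-1)
        return np.exp(-r2 / (2 * sigma**2)) @ w

    return f
```

In a practical RBNN the number of basis functions is far smaller than the number of training samples and their centers are adapted in an unsupervised fashion, so the exact-interpolation system above is replaced by a least-squares fit of the output weights.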
In the following, we will describe the mathematical framework for
solving the approximation problem based on radial-basis neural net-
works. In this context, we will present the concept of interpolation net-
works and how any function can be approximated arbitrarily well, based
on radial-basis functions under some restrictive conditions.