Digital Signal Processing Reference
Figure 6.10 Approximation network: the inputs x_1, ..., x_n feed a layer of radial-basis functions h_1, ..., h_n; their outputs, weighted by the coefficients c_1, ..., c_n, are summed in the output layer to produce F.
rons represents a universal approximator, based on the Stone-Weierstrass theorem [209]. In essence, every multivariate, nonlinear, and continuous function can be approximated to arbitrary accuracy.
2. The interpolation network with radial-basis functions has the best-approximation property compared to other neural networks, such as the three-layer perceptron. The sigmoid function is not a translation- and rotation-invariant function, as the radial-basis function is. Thus, for every unknown nonlinear function f there exists a choice of coefficients that approximates f better than any other choice.
3. The interpolation problem can be solved even more simply by choosing radial-basis functions of the same width $\sigma_i = \sigma$, as shown in [197]:

   $$F(x) = \sum_{i=1}^{N} c_i \, g\left(\frac{\|x - m_i\|}{\sigma}\right) \qquad (6.22)$$

   In other words, Gaussian functions of the same width can approximate any given function.
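Equation (6.22) can be sketched numerically. The following is a minimal illustration, not the book's implementation: it assumes a Gaussian basis g(u) = exp(-u^2/2) and, for simplicity, takes the training inputs themselves as the centers m_i, so the coefficients c_i follow from solving the linear interpolation system.

```python
import numpy as np

def gaussian(r, sigma):
    """g(||x - m|| / sigma) with Gaussian g(u) = exp(-u^2 / 2)."""
    return np.exp(-r**2 / (2.0 * sigma**2))

def fit_rbf(x_train, y_train, sigma):
    """Solve G c = y for the coefficients c_i of equation (6.22),
    using the training inputs as centers m_i (illustrative choice)."""
    d = np.abs(x_train[:, None] - x_train[None, :])  # ||x_j - m_i||
    G = gaussian(d, sigma)
    return np.linalg.solve(G, y_train)  # exact interpolation at the centers

def predict_rbf(x, centers, c, sigma):
    """Evaluate F(x) = sum_i c_i g(||x - m_i|| / sigma)."""
    d = np.abs(x[:, None] - centers[None, :])
    return gaussian(d, sigma) @ c

# Toy example: approximate f(x) = sin(x) on [0, 2*pi] from 10 samples,
# all basis functions sharing the same width sigma.
x_train = np.linspace(0.0, 2.0 * np.pi, 10)
y_train = np.sin(x_train)
sigma = 0.8
c = fit_rbf(x_train, y_train, sigma)

x_test = np.linspace(0.0, 2.0 * np.pi, 200)
err = np.max(np.abs(predict_rbf(x_test, x_train, c, sigma) - np.sin(x_test)))
print(f"max approximation error: {err:.4f}")
```

Note the hybrid character already visible here: the centers are fixed without using the target values, while the coefficients c_i are fitted in a supervised way against y.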
Data processing in radial-basis function networks
Radial-basis neural networks implement a hybrid learning algorithm: supervised learning for the output weights is combined with unsupervised learning for the radial-basis neurons. The ac-