Fig. 12.1 (a) Feed-forward neural network; (b) quantum neural network with Hermite basis functions
Truncation of the series yields the sum

$$S_M(x) = \sum_{k=1}^{M} a_k \phi_k(x) \qquad (12.2)$$

If the coefficients $a_k$ are taken to be equal to the generalized Fourier coefficients, i.e. when $a_k = c_k = \int_a^b \psi(x)\,\phi_k(x)\,dx$, then Eq. (12.2) is a mean-square optimal approximation of $\psi(x)$.
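As a numerical sketch of Eq. (12.2), the snippet below expands an example target function in orthonormal Hermite basis functions $\phi_k$ and evaluates the truncated sum $S_M$. The choice of target $\psi(x) = e^{-x^2}$, the integration interval, and the grid are illustrative assumptions, not values from the text:

```python
import math
import numpy as np

def hermite_function(k, x):
    """Orthonormal Hermite basis function phi_k(x) (unit L2 norm on the real line)."""
    # Physicists' Hermite polynomials via the recurrence H_{n+1} = 2x H_n - 2n H_{n-1}
    h_prev, h = np.ones_like(x), 2.0 * x
    if k == 0:
        h = h_prev
    else:
        for n in range(1, k):
            h_prev, h = h, 2.0 * x * h - 2.0 * n * h_prev
    norm = math.sqrt(2.0**k * math.factorial(k) * math.sqrt(math.pi))
    return h * np.exp(-x**2 / 2.0) / norm

# Truncated generalized Fourier series S_M(x) = sum_{k<M} a_k phi_k(x),
# with a_k = c_k approximated by quadrature of psi(x) phi_k(x) on a grid.
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
psi = np.exp(-x**2)   # example target function (an assumption)
M = 8
coeffs = [dx * np.sum(psi * hermite_function(k, x)) for k in range(M)]
S_M = sum(a * hermite_function(k, x) for k, a in enumerate(coeffs))
```

Because the example target is even, the coefficients of the odd basis functions vanish, and only a few even-order terms are needed for a close fit.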
Unlike generalized Fourier series, in FNN the basis functions are not necessarily orthogonal. The hidden units in an FNN usually have the same activation functions, often selected as sigmoidal functions or Gaussians. A typical feed-forward neural network consists of $n$ inputs $x_i$, $i = 1, 2, \ldots, n$, a hidden layer of $m$ neurons with activation function $h: \mathbb{R} \to \mathbb{R}$, and a single output unit (see Fig. 12.1a). The FNN's output is given by
$$\hat{\psi}(x) = \sum_{j=1}^{m} c_j\, h\!\left(\sum_{i=1}^{n} w_{ji} x_i + b_j\right) \qquad (12.3)$$
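A minimal sketch of the forward pass in Eq. (12.3), assuming a tanh activation $h$ and randomly drawn illustrative weights (all assumptions, not values from the text):

```python
import numpy as np

def fnn_output(x, W, b, c, h=np.tanh):
    """Single-hidden-layer FNN output:
    psi_hat(x) = sum_{j=1}^{m} c_j * h( sum_{i=1}^{n} W[j, i] * x[i] + b[j] )."""
    x, W, b, c = (np.asarray(a, dtype=float) for a in (x, W, b, c))
    return float(c @ h(W @ x + b))

# Example with n = 2 inputs and m = 2 hidden neurons (illustrative weights)
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))   # hidden-layer weights w_ji
b = rng.normal(size=2)        # hidden-layer biases b_j
c = rng.normal(size=2)        # output weights c_j
y = fnn_output([0.5, -1.0], W, b, c)
```

With a bounded activation such as tanh, the output magnitude can never exceed $\sum_j |c_j|$, which is one reason sigmoidal hidden units give well-behaved approximators.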
The root mean square error in the approximation of function $\psi(x)$ by the FNN is given by

$$E_{RMS} = \sqrt{\frac{1}{N} \sum_{k=1}^{N} \left(\psi(x_k) - \hat{\psi}(x_k)\right)^2} \qquad (12.4)$$
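Eq. (12.4) translates directly into a few lines; the sample values used below are made up for illustration:

```python
import numpy as np

def rms_error(psi_true, psi_hat):
    """Root mean square error (Eq. 12.4) between target values psi(x_k)
    and FNN outputs psi_hat(x_k) over N sample points."""
    psi_true = np.asarray(psi_true, dtype=float)
    psi_hat = np.asarray(psi_hat, dtype=float)
    return float(np.sqrt(np.mean((psi_true - psi_hat) ** 2)))

# Illustrative call: targets psi(x_k) vs. FNN outputs psi_hat(x_k)
err = rms_error([0.0, 0.0], [3.0, 4.0])   # sqrt((9 + 16) / 2)
```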