FNN that use the eigenstates of the quantum harmonic oscillator (Hermite basis functions) demonstrate the particle-wave nature of information as described by Schrödinger's diffusion equation [34, 186]. Attempts to enhance connectionist neural models with quantum mechanics properties can also be found in [96, 135-137]. The proposed FNNs extend previous results on neural structures compatible with the postulates of quantum mechanics, given in [149, 163, 164, 168].
Through the analysis given in this chapter, it is shown that the input variable $x$ of the neural network can be described not only by crisp values (particle equivalent) but also by the normal modes of a wave function (wave equivalent). Since the basis functions of the proposed FNN are the eigenstates of the quantum harmonic oscillator, the FNN's output will be the weighted sum $\psi(x) = \sum_{k=1}^{N} w_k \psi_k(x)$, where $|\psi(x)|^2$ is the probability that the input of the neural network (quantum particle equivalent) is found between $x$ and $x + \Delta x$. Thus, the weight $w_k$ provides a measure of the probability to find the input of the neural network in the region associated with the eigenfunction $\psi_k(x)$.
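As a minimal numerical sketch of this superposition (in Python with NumPy/SciPy; the function names, the number of basis functions, and the weights $w_k$ below are illustrative choices, not values from the text), the QHO eigenstates can be built from the physicists' Hermite polynomials and combined into the network output:

    import numpy as np
    from math import factorial, pi
    from scipy.special import eval_hermite

    def qho_eigenstate(k, x):
        # k-th eigenstate of the quantum harmonic oscillator:
        # psi_k(x) = (2^k k! sqrt(pi))^(-1/2) H_k(x) exp(-x^2/2)
        norm = 1.0 / np.sqrt(2.0**k * factorial(k) * np.sqrt(pi))
        return norm * eval_hermite(k, x) * np.exp(-x**2 / 2.0)

    def fnn_output(weights, x):
        # psi(x) = sum_k w_k psi_k(x): the network output as a weighted
        # superposition of the orthonormal QHO eigenstates
        return sum(w * qho_eigenstate(k, x) for k, w in enumerate(weights))

    x = np.linspace(-6.0, 6.0, 1201)
    w = [0.8, 0.5, 0.3]                 # illustrative weights w_k
    psi = fnn_output(w, x)
    density = np.abs(psi)**2            # |psi(x)|^2: probability density of the input
    print(np.trapz(density, x))         # ~ sum of w_k^2 by orthonormality (0.98 here)

By orthonormality of the eigenstates, the integral of $|\psi(x)|^2$ equals $\sum_k w_k^2$, which is what the final line checks numerically.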
Furthermore, issues related to the uncertainty principle are examined in the case of the QHO-based neural network. An expression of the uncertainty principle for Hermite basis functions is given. The uncertainty principle is a measure of the time-frequency localization of the activation functions in the QHO-based neural network and evaluates the degradation of localization when successive elements of these orthonormal basis functions are considered. It is shown that the Hermite basis functions as well as their Fourier transforms cannot be uniformly concentrated in the time-frequency plane [89, 141]. Simulation results support this argument.
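A short numerical check of this localization argument (a sketch, not the chapter's own expression; the grid and the range of orders are illustrative) computes the spatial spread of the first few Hermite basis functions and compares it with the known analytic value $\sigma_x^2 = (2k+1)/2$. Since the Hermite functions are eigenfunctions of the Fourier transform, the frequency-domain spread is identical, so the time-frequency product grows linearly with the order $k$, consistent with the degradation of localization noted above:

    import numpy as np
    from math import factorial, pi
    from scipy.special import eval_hermite

    def qho_eigenstate(k, x):
        # orthonormal Hermite basis function of order k
        norm = 1.0 / np.sqrt(2.0**k * factorial(k) * np.sqrt(pi))
        return norm * eval_hermite(k, x) * np.exp(-x**2 / 2.0)

    x = np.linspace(-20.0, 20.0, 8001)
    for k in range(6):
        p = np.abs(qho_eigenstate(k, x))**2   # probability density of the k-th basis function
        var_x = np.trapz(x**2 * p, x)         # spatial variance; the mean is 0 by symmetry
        print(k, var_x, (2*k + 1) / 2.0)      # numeric spread vs. the analytic value (2k+1)/2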
12.2 Feed-Forward Neural Networks
FNN serve as powerful computational tools in a variety of applications, including function approximation, image compression, and system fault diagnosis. When equipped with procedures for learning from measurement data, they can generate models of unknown systems. FNN are the most popular artificial neural structures due to their structural flexibility, good representational capabilities, and the availability of a large number of training algorithms.
The idea of function approximation with the use of FNN comes from generalized Fourier series. It is known that any function $\psi(x)$ in an $L^2$ space can be expanded in a generalized Fourier series in a given orthonormal basis $\{\phi_k(x)\}$, i.e.

$$\psi(x) = \sum_{k=1}^{\infty} c_k \phi_k(x), \quad a \le x \le b \qquad (12.1)$$
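To make (12.1) concrete, the following sketch (the target function, the interval, and the truncation order N are illustrative assumptions) computes the coefficients $c_k = \langle \psi, \phi_k \rangle$ by numerical integration against the Hermite basis and reconstructs $\psi(x)$ from the truncated sum:

    import numpy as np
    from math import factorial, pi
    from scipy.special import eval_hermite

    def basis(k, x):
        # orthonormal Hermite basis functions (the QHO eigenstates)
        norm = 1.0 / np.sqrt(2.0**k * factorial(k) * np.sqrt(pi))
        return norm * eval_hermite(k, x) * np.exp(-x**2 / 2.0)

    x = np.linspace(-10.0, 10.0, 4001)
    psi = np.exp(-(x - 1.0)**2)                  # an example L^2 function to expand
    N = 20                                       # illustrative truncation order
    c = [np.trapz(psi * basis(k, x), x) for k in range(N)]   # c_k = <psi, phi_k>
    psi_hat = sum(ck * basis(k, x) for k, ck in enumerate(c))
    print(np.max(np.abs(psi - psi_hat)))         # truncation error of the partial sum

As the truncation order N grows, the reconstruction error of the partial sum shrinks, which is the property the FNN exploits when its weights play the role of the Fourier coefficients.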