parameters that are relevant to the nonlinearity are the translations and the
dilations of the wavelets [Benveniste 1994; Oussar 2000].
1.1.1.4 Recurrent (Feedback) Neural Networks
General Form
The present section is devoted to a presentation of the most general neural
network architecture: recurrent neural networks, whose connection graph
exhibits cycles. In that graph, there exists at least one path that, following
the connections, leads back to the starting vertex (neuron); such a path is
called a cycle. Since the output of a neuron cannot be a function of itself,
such an architecture requires that time be explicitly taken into account: the
output of a neuron cannot be a function of itself at the same instant of time,
but it can be a function of its past value(s).
At present, the vast majority of neural network applications are implemented
as digital systems (either standard computers or special-purpose digital
circuits for signal processing): therefore, discrete-time systems are the
natural framework for investigating recurrent networks, which are described
mathematically by recurrent equations (hence the name of those networks).
Discrete-time (or recurrent) equations are the discrete-time equivalents of
continuous-time differential equations.
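As a minimal illustration of such a recurrent equation (the first-order form and the tanh nonlinearity below are assumptions for the sketch, not the book's example), a discrete-time recurrence y(k) = f(y(k−1), u(k)) can be iterated step by step:

```python
import math

def step(y_prev, u_k):
    # One step of an assumed first-order recurrence:
    # y(k) = tanh(0.5 * y(k-1) + u(k))
    return math.tanh(0.5 * y_prev + u_k)

# Iterate the recurrence from y(0) = 0 with a constant input u(k) = 1;
# the output at step k depends on the output at step k-1, never on itself.
y = 0.0
for k in range(1, 6):
    y = step(y, 1.0)
```

Each iteration uses only the previous output, which is exactly what distinguishes a recurrent equation from an (acausal) instantaneous self-dependence.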
Therefore, each connection of a recurrent neural network is assigned a delay
(possibly equal to zero), in addition to being assigned a parameter as in
feedforward neural networks. Each delay is an integer multiple of an
elementary time that is taken as the time unit. By causality, a quantity at
a given time cannot be a function of itself at the same time: therefore, the
sum of the delays along any cycle of the connection graph must be nonzero.
A discrete-time recurrent neural network obeys a set of nonlinear
discrete-time recurrent equations, obtained through the composition of the
functions of its neurons and through the time delays associated with its
connections.
Property. For causality to hold, each cycle of the connection graph must have
at least one connection with a nonzero delay.
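This property can be checked mechanically: every cycle has nonzero total delay if and only if the subgraph formed by the zero-delay connections alone is acyclic. The following sketch (the dict-based graph representation is an assumption for illustration) tests that condition with a depth-first search:

```python
def cycle_delays_causal(edges):
    """Check that every cycle has a nonzero total delay.

    edges: dict mapping a neuron to a list of (successor, delay) pairs.
    Returns True iff the subgraph of zero-delay connections is acyclic,
    i.e. every cycle contains at least one nonzero-delay connection.
    """
    # Keep only zero-delay connections; a cycle made entirely of such
    # connections would violate causality.
    zero = {u: [v for (v, d) in succ if d == 0] for u, succ in edges.items()}

    WHITE, GREY, BLACK = 0, 1, 2
    color = {u: WHITE for u in zero}

    def has_cycle(u):
        color[u] = GREY
        for v in zero.get(u, []):
            if color.get(v, WHITE) == GREY:
                return True          # back edge: a zero-delay cycle exists
            if color.get(v, WHITE) == WHITE and has_cycle(v):
                return True
        color[u] = BLACK
        return False

    return not any(color[u] == WHITE and has_cycle(u) for u in list(zero))
```

For the network of Fig. 1.5, the cycle 3 → 4 → 3 has a unit delay on the connection from 4 to 3, so `cycle_delays_causal({3: [(4, 0)], 4: [(3, 1)]})` returns True; setting that delay to zero would make the check fail.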
Figure 1.5 shows an example of a recurrent neural network. The digits in the
boxes are the delays attached to the connections, expressed as integer
multiples of a time unit (or sampling period) T. The network features a
cycle, from neuron 3 back to neuron 3 through neuron 4; since the connection
from 4 to 3 has a delay of one time unit, the network is causal.
Further Details
At time kT, the inputs of neuron 3 are u1(kT), u2[(k − 1)T], and
y4[(k − 1)T] (where k is a positive integer and y4(kT) is the output of
neuron 4 at time kT).
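Under these timing relations, one time step of the network can be sketched as follows. The activation function and the unit weights are assumptions (the book specifies only the connections and their delays, not the neurons' functions):

```python
import math

def simulate(u1, u2, steps):
    # Hypothetical activation and unit weights; only the connection
    # delays below are taken from the example network of Fig. 1.5.
    f = math.tanh
    y3, y4 = 0.0, 0.0              # initial conditions at k = 0
    for k in range(1, steps + 1):
        y4_prev = y4               # y4[(k-1)T], via the unit-delay connection
        # Neuron 3 receives u1(kT), u2[(k-1)T], and y4[(k-1)T].
        y3_new = f(u1(k) + u2(k - 1) + y4_prev)
        # Neuron 4 receives y3(kT) through a zero-delay connection.
        y4 = f(y3_new)
        y3 = y3_new
    return y3, y4
```

Note that y4 is read before it is updated within the loop: that ordering is precisely what the unit delay on the connection from neuron 4 to neuron 3 enforces, and it is what keeps the cycle causal.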