Figure 5.3 Schematic representation of the Gamma memory: Z⁻¹ is a delay, μ is an interpolation weight, and Σ sums the inputs (reprinted from Petrović et al., 2009; with permission from Elsevier)
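As a sketch of what the figure depicts, the fragment below implements the usual Gamma memory recursion x_k(t) = (1 − μ)·x_k(t−1) + μ·x_{k−1}(t−1), in which each tap blends its own delayed value with the previous tap's delayed value through the interpolation weight μ. The function and its names are illustrative assumptions, not code from the cited work.

```python
import numpy as np

def gamma_memory(signal, order, mu):
    """Pass a signal through a Gamma memory of the given order.

    Each tap k updates as x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1):
    a cascade of leaky delays (Z^-1) blended by the interpolation weight mu;
    downstream neurons (the figure's sigma) sum the taps.
    """
    taps = np.zeros(order + 1)            # tap 0 carries the raw input
    history = []
    for u in signal:
        prev = taps.copy()                # tap values from time t-1
        taps[0] = u
        for k in range(1, order + 1):
            taps[k] = (1.0 - mu) * prev[k] + mu * prev[k - 1]
        history.append(taps.copy())
    return np.array(history)

# A unit impulse is smeared over time by a third-order memory
impulse = [1.0] + [0.0] * 9
print(gamma_memory(impulse, order=3, mu=0.5))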
Depending on the way that signals are fed back in the neural network, the recurrence is either partial or full. Partial recurrence occurs, for example, when recurrent connections run from the hidden layer back to itself. In fully recurrent networks, the final output of the network is fed back into the network. If the connections between neurons are set up to have a memory, then the order of that memory states by how many time steps the signal is delayed (the default is one time step, i.e. Z⁻¹), as sketched below. Over time, the network stores long-term memory structures in its feedback (recurrent) and regular connections, whose weights are adjusted during training (Samarasinghe, 2006).
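A connection memory of order n can be pictured as a small buffer that releases each value n time steps later. The following minimal Python sketch (names are illustrative) demonstrates the default order-1 case, i.e. the Z⁻¹ element of Figure 5.3.

```python
from collections import deque

def make_delay(order=1):
    """Connection with memory of the given order: each input value
    re-emerges only after `order` time steps (order 1 is the Z^-1 default)."""
    buffer = deque([0.0] * order, maxlen=order)
    def delay(x):
        delayed = buffer[0]               # value from `order` steps ago
        buffer.append(x)                  # a full deque drops its oldest entry
        return delayed
    return delay

z1 = make_delay(order=1)
print([z1(x) for x in [1.0, 2.0, 3.0]])   # -> [0.0, 1.0, 2.0]
```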
The Elman Neural Network (ENN) is considered a special kind of feed-forward network with additional memory neurons and local feedback (Koker, 2006); it is therefore a simple DNN (Figure 5.4). Recurrent links are used to provide the network with a dynamic memory.
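A minimal sketch of this topology, assuming tanh hidden units and a simple copy-back of the hidden state into the context (memory) neurons; the class and variable names are illustrative, not taken from the cited sources.

```python
import numpy as np

rng = np.random.default_rng(0)

class ElmanNetwork:
    """Feed-forward layers plus context neurons that keep a copy of the
    previous hidden state and feed it back locally at the next step."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
        self.context = np.zeros(n_hidden)          # the memory neurons

    def step(self, x):
        # The hidden layer sees the current input and the stored context
        h = np.tanh(self.W_in @ x + self.W_ctx @ self.context)
        self.context = h.copy()                    # copy-back is the dynamic memory
        return self.W_out @ h

net = ElmanNetwork(n_in=2, n_hidden=4, n_out=1)
outputs = [net.step(x) for x in np.eye(2)]          # a short two-step sequence
print(outputs)
```

The copy-back in `step` is what distinguishes the ENN from a plain feed-forward network: the context units give the hidden layer access to its own previous activation without any connections skipping forward in time.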
Figure 5.4 Topology of the ENN