An Elman network (Figure 3.6) is a four-layer network consisting of an input layer, a hidden layer, an output layer, and a context layer whose nodes are one-step delay elements embedded in local feedback paths. In the network, neighbouring layers are interconnected by adjustable weights.
Elman originally proposed his simple recurrent network for speech processing. Nevertheless, owing to its eminent dynamic characteristics, the network was widely adopted for system identification and control (Sastry et al., 1994). This was followed by applications in function approximation and in time series prediction.
Figure 3.6. Configuration of the Elman network: input neurons 1 to n receiving X(t) to X(t-(n-1)), hidden neurons 1 to h, and output neurons 1 to m producing y_1 to y_m, connected by the weights w^1 and w^2, with context units fed through one-step delay elements (z^-1)
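To make the configuration concrete, the following is a minimal Python sketch of an Elman forward pass, assuming the standard arrangement in which the context units hold the one-step-delayed hidden activations; the class name, weight names, and toy sequence are illustrative, not taken from the text.

```python
import numpy as np

class ElmanNetwork:
    """Minimal Elman network: n inputs, h hidden units, m outputs."""

    def __init__(self, n, h, m, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.1, size=(h, n))  # input -> hidden weights (w^1)
        self.Wc = rng.normal(scale=0.1, size=(h, h))  # context -> hidden weights
        self.W2 = rng.normal(scale=0.1, size=(m, h))  # hidden -> output weights (w^2)
        self.context = np.zeros(h)                    # context units: delayed hidden state

    def step(self, x):
        # The hidden layer sees the current input and, via the context
        # units (one-step delay z^-1), its own previous activation.
        hidden = np.tanh(self.W1 @ x + self.Wc @ self.context)
        self.context = hidden                         # stored for the next time step
        return self.W2 @ hidden                       # output layer

net = ElmanNetwork(n=3, h=5, m=2)
for t in range(4):
    y = net.step(np.sin(np.arange(3) + t))            # process a toy sequence
```

Because the context values are simply the previous hidden activations treated as additional inputs, such a network can be trained with standard backpropagation applied at each time step.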
Independently, Hopfield (1982) reported to the US National Academy of Sciences on neural networks with emergent collective computational abilities. In a subsequent paper, Hopfield (1984) presented neurons with graded response and their collective computational properties. He also presented some applications in neurobiology and described an electric circuit that closely reflects the dynamic behaviour of neurons, now known as the Hopfield network (see Figure 3.7).
The Hopfield network is a single-layer, fully interconnected recurrent network with a symmetric weight matrix, i.e. with elements w_ij = w_ji and zero diagonal elements (w_ii = 0). As shown in Figure 3.7, the output of each neuron is fed back via a delay unit to the inputs of all neurons of the layer except its own. This provides the network with auto-associative capabilities: by learning, following the Hebbian law or the delta rule, the network can store a number of prototype patterns, called fixed-point attractors, at locations determined by the weight matrix. The stored patterns can then be retrieved by associative recall: when asked to recall one of the stored patterns, the network repeatedly feeds the output signals back to the neuron inputs until it reaches a stable state.
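As a minimal sketch of this storage-and-recall mechanism, the following Python code, assuming bipolar (+1/-1) patterns and the Hebbian outer-product rule, builds a symmetric zero-diagonal weight matrix and then iterates the feedback loop until a stable state is reached; the function names and the toy patterns are illustrative.

```python
import numpy as np

# Hebbian (outer-product) storage: symmetric weights, zero diagonal.
def train_hopfield(patterns):
    num, n = patterns.shape
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)            # Hebbian law: w_ij += p_i * p_j
    np.fill_diagonal(W, 0.0)           # no self-feedback (w_ii = 0)
    return W / num

# Associative recall: feed outputs back until a fixed point is reached.
def recall(W, state, max_iters=100):
    state = state.copy()
    for _ in range(max_iters):
        prev = state.copy()
        for i in range(len(state)):    # asynchronous neuron updates
            state[i] = 1.0 if W[i] @ state >= 0 else -1.0
        if np.array_equal(state, prev):
            break                      # stable state reached
    return state

patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1, -1, -1, -1]], dtype=float)
W = train_hopfield(patterns)
noisy = patterns[0].copy()
noisy[0] = -noisy[0]                   # corrupt one bit of the first pattern
print(recall(W, noisy))                # converges back to the stored attractor
```

Starting from the corrupted input, the repeated feedback drives the state to the nearest stored fixed-point attractor, which is the associative recall described above.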
The capability of recurrent networks to retain past events and to use them in further computations is an advantage that feedforward networks do not possess.