states of the network dynamics. To obtain that behavior, Hopfield proposed
that the connection matrix should be equal to the correlation matrix of the
stored items. More precisely, assume that the network has N neurons and
that p items should be encoded and stored. Those items are encoded by binary
vectors ξ^i = (ξ^i_j), j = 1, ..., N. The weight matrix is denoted by w = (w_jl), with
w_jl = (1/p) Σ_{i=1}^{p} ξ^i_j ξ^i_l if j ≠ l, and w_jj = 0. Note that the connection matrix
is symmetric. That learning rule is a simplified version of Hebb's rule, which
was originally proposed by Hebb to model some biological learning processes.
Later on, other learning rules were proposed to guarantee that any set of
vectors (with cardinality smaller than N/2), or any state sequence, can be
stored as a fixed point or as a cycle of the network dynamics.
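The storage rule above can be sketched in a few lines of code. This is a minimal sketch, assuming NumPy, patterns with entries in {−1, +1}, and a synchronous update of the dynamics; the function names are illustrative:

```python
import numpy as np

def hebb_weights(patterns):
    """Hebbian storage: w_jl = (1/p) * sum_i xi^i_j * xi^i_l, with w_jj = 0.

    patterns: array of shape (p, N) with entries in {-1, +1}.
    """
    p, _ = patterns.shape
    w = patterns.T @ patterns / p      # correlation matrix of the stored items
    np.fill_diagonal(w, 0.0)           # no self-connections: w_jj = 0
    return w                           # symmetric by construction

def recall(w, state, max_steps=20):
    """Iterate the network dynamics until a fixed point is reached."""
    state = state.copy()
    for _ in range(max_steps):
        new = np.where(w @ state >= 0, 1, -1)   # threshold update
        if np.array_equal(new, state):          # fixed point reached
            break
        state = new
    return state
```

Starting the dynamics from a corrupted version of a stored pattern, the network relaxes to the stored item, which is how it operates as a content-addressable memory.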
Twenty years after J. J. Hopfield's seminal paper, the following conclusions
may be drawn:
- From the point of view of biological modeling, the merit of the Hopfield
model is to emphasize the role of dynamics in the cognitive functions of
biological neural networks, and to exhibit the connection between learning
and correlation that is established by Hebb's rule. Some older, less popular
models had already emphasized those points. Later on, more biologically
plausible models integrated new properties: temporal coding of information
by action potentials (spikes), sparseness and asymmetry of the connections.
Those properties made the Hopfield model, however rich and innovative,
outdated.
- As content-addressable memories (CAM), Hopfield models perform rather
poorly. Improvements were developed in the eighties (mean-field Hopfield
networks with continuous activation functions, stochastic Hopfield networks
and Boltzmann machines), and a large literature was published. Nevertheless,
applied research on those topics was progressively abandoned, especially in
the fields of pattern recognition and error correction.
- A close connection was soon established between the Hopfield model and
the simulated annealing algorithm, which was discovered at about the same
time by Kirkpatrick, Gelatt and Vecchi [Kirkpatrick 1983], drawing inspiration
from statistical physics. A new research direction originated from that
connection: the application of neural networks to optimization. That approach
is within the scope of Chap. 8 of this book.
4.5.4 Canonical Form for Recurrent Networks
The examples of recurrent neural networks provided in the previous
paragraphs show that those networks are inherently dynamical systems.
Considered as dynamical systems, they are driven by input signals and
provide output signals. Consequently, it is convenient to express them in a
state-space representation. That state representation