8.6.5 Improvements of Hopfield Neural Networks
Numerous studies are in progress to overcome the limitations of recurrent
neural networks without training for optimization. Not all of them can be
covered in this chapter, but a few recently explored directions are detailed
in the following.
8.6.5.1 Improvements of the Encoding of the Energies to Minimize
To avoid being trapped in high-energy local minima corresponding to unac-
ceptable solutions, it is often fruitful to develop alternative encoding schemes
for the problem. Given a problem, different energy functions can be defined,
each of them being characterized by a more or less complex solution space
to explore. Along those lines, in the case of the TSP, Brandt suggested that,
in order to avoid being trapped in a high-energy local minimum during the
convergence, the following energy should be minimized [Brandt et al. 1988]:
$$
E = F_c + \frac{\gamma}{2}\left[\sum_{i=1}^{N}\left(\sum_{j=1}^{N} y_{i,j} - 1\right)^{2} + \sum_{j=1}^{N}\left(\sum_{i=1}^{N} y_{i,j} - 1\right)^{2}\right].
$$
With the same goal, Szu, in 1988 [Szu 1988], proposed another energy
function,
$$
E = F_c + a_1 F_1 + a_2 F_2 + a_3\left[\sum_{i=1}^{N}\left(\sum_{j=1}^{N} y_{i,j} - 1\right)^{2} + \sum_{j=1}^{N}\left(\sum_{i=1}^{N} y_{i,j} - 1\right)^{2}\right].
$$
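Both encodings share the same constraint-penalty term, which vanishes exactly when the assignment matrix is a permutation matrix, i.e., when it encodes a valid tour. As a minimal sketch (the function name is hypothetical, and NumPy is assumed), it can be computed as follows:

```python
import numpy as np

def constraint_penalty(y):
    """Penalty term common to the Brandt and Szu energies:
    sum over rows and columns of (row/column sum - 1)^2.
    It is zero exactly when y is a permutation matrix."""
    row = (y.sum(axis=1) - 1.0) ** 2   # each city visited once
    col = (y.sum(axis=0) - 1.0) ** 2   # each position filled once
    return row.sum() + col.sum()

# A permutation matrix (a valid tour encoding) gives zero penalty:
perm = np.eye(4)
print(constraint_penalty(perm))        # 0.0

# A constraint-violating matrix is penalized:
print(constraint_penalty(np.ones((3, 3))))   # 24.0
```

Because the penalty is a sum of squares, it is non-negative, and any gradient descent on the total energy pushes the network towards configurations that satisfy the tour constraints.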
8.6.5.2 Analog Hopfield Networks with Annealing
As explained briefly in previous sections, a first solution consists in varying
the temperature τ i (the inverse of the slope at the origin of the activation
function of neuron i ) in the analog Hopfield networks. As with simulated an-
nealing, it is initialized to a high value, and it decreases during convergence.
At high temperatures, the system behaves like a quasi-linear system, because
the activation functions are quasi-linear over a wide range of potential val-
ues. Therefore, neuron outputs vary between −1 and +1. By decreasing the
temperature, neuron outputs tend to the values −1 or +1, which code for
a solution. During convergence, a critical temperature T c can be observed;
below that temperature, the system starts to freeze, i.e., the neuron outputs
significantly evolve towards +1 or −1 [Herault et al. 1989]. That temperature
can be estimated theoretically for some optimization problems [Peterson et
al. 1989]. When that is the case, it is not necessary to decrease the temperature
gradually: it is sufficient to let the system converge towards an equilibrium at
the critical temperature, and then to quench the system by bringing it to a
temperature close to 0. At the end of this second step, all neurons are almost
binary.