state, i.e., the vector whose components are the (binary) states of the neurons
of the network, can be considered as the binary code of a piece of information.
Moreover, it can be shown that there exists a function, called the Liapunov
function (or energy function), which always decreases during the spontaneous
evolution of the state of the network; hence the stable states are the minima
of the Liapunov function.
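As a minimal illustration (in Python with NumPy; the choice of language, and of a Hopfield-type network with symmetric, zero-diagonal weights, is ours rather than the text's), the following sketch shows the energy E(s) = -(1/2) s^T W s decreasing monotonically under asynchronous updates until a stable state is reached:

import numpy as np

def energy(W, s):
    # Liapunov (energy) function E(s) = -1/2 * s^T W s,
    # for binary states s_i in {-1, +1}.
    return -0.5 * s @ W @ s

def update_one(W, s, rng):
    # Asynchronous update of one randomly chosen neuron; with W
    # symmetric and zero-diagonal, each update cannot increase E.
    i = rng.integers(len(s))
    s = s.copy()
    s[i] = 1 if W[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(0)
n = 6
A = rng.normal(size=(n, n))
W = (A + A.T) / 2          # symmetric weights
np.fill_diagonal(W, 0.0)   # no self-connections

s = rng.choice([-1, 1], size=n)
for _ in range(100):
    s_next = update_one(W, s, rng)
    assert energy(W, s_next) <= energy(W, s) + 1e-12
    s = s_next
print("stable state:", s, " energy:", energy(W, s))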
Now consider the inverse problem: in a combinatorial optimization prob-
lem, it is desired to find the minimum (or at least a good minimum) of a
function (cost function) of binary variables. If there exists a recurrent neural
network whose Liapunov function is identical to the cost function of the op-
timization problem, then the fixed points of the spontaneous dynamics of
the recurrent neural network are solutions of the combinatorial optimization
problem. If such a network can be constructed, then it will find a solution of
the problem by evolving, under its spontaneous dynamics, from an arbitrary
initial state.
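As a concrete example of such an identification (our own illustration, not one taken from the text), consider the max-cut problem: for a graph with adjacency matrix A, the size of the cut defined by states s_i in {-1, +1} is an affine function of the energy E(s) = -(1/2) s^T W s of a network with weights W = -A, so that maximizing the cut amounts to minimizing the network energy:

import numpy as np

# Illustrative mapping: cut(s) = sum_{i<j} A_ij (1 - s_i s_j) / 2
#                              = (m - E(s)) / 2,
# where m is the number of edges and E(s) = -1/2 s^T W s with W = -A.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W = -A
s = np.array([1, -1, 1, -1])

m = A.sum() / 2                      # number of edges
E = -0.5 * s @ W @ s                 # energy of the associated network
cut = sum(A[i, j] * (1 - s[i] * s[j]) / 2
          for i in range(4) for j in range(i + 1, 4))
print(cut, (m - E) / 2)              # both equal 3.0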
Therefore, the resolution of a combinatorial optimization problem with a
recurrent neural network requires
•  finding a recurrent neural network whose energy function is identical to
   the cost function of the optimization problem,
•  finding the parameters of that network,
•  controlling the dynamics of the network so as to make sure that it
   evolves to a good minimum of the cost function, for instance by taking
   advantage of stochastic methods such as simulated annealing, as
   sketched below.
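A minimal sketch of that last step (assuming the same quadratic energy as above; the step count and cooling schedule are illustrative choices, not prescriptions of the text): uphill single-neuron flips are accepted with probability exp(-dE/T), which lets the dynamics escape poor local minima while the temperature T is gradually lowered.

import numpy as np

def anneal(W, n_steps=20000, T0=2.0, alpha=0.9995, seed=0):
    # Simulated annealing on E(s) = -1/2 s^T W s
    # (W symmetric with zero diagonal).
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    s = rng.choice([-1, 1], size=n)
    T = T0
    for _ in range(n_steps):
        i = rng.integers(n)
        dE = 2.0 * s[i] * (W[i] @ s)     # energy change if s[i] flips
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
        T *= alpha                        # cooling schedule
    return s, -0.5 * s @ W @ s

rng = np.random.default_rng(1)
n = 10
A = rng.normal(size=(n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)
print(anneal(W))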
This powerful technique, together with some of its applications, will be described in Chap. 8 of the present book.
1.2 When and How to Use Neural Networks with
Supervised Training
In the previous sections, we presented the theoretical arguments that support
the use of neural networks in modeling applications. In the present section,
we address the practical questions raised by the design and training of a neural
model. First, we will explain when neural networks can advantageously be
used—and when they should not be used. In the subsequent section, we will
emphasize how to use neural networks. An in-depth treatment of these im-
portant questions will be given in the next chapters.
1.2.1 When to Use Neural Networks?
We have shown earlier that the fundamental property of neural networks
with supervised training is the parsimonious approximation property, i.e.,
their ability to approximate any sufficiently regular function with arbitrary
accuracy.
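For instance (a sketch of ours, using scikit-learn's MLPRegressor, which the text does not prescribe), a network with a single small hidden layer of tanh neurons approximates a smooth one-dimensional function closely with few parameters:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(x).ravel()

# A parsimonious model: a handful of tanh neurons suffices
# for a smooth one-dimensional target.
net = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                   max_iter=5000, random_state=0)
net.fit(x, y)

x_test = np.linspace(-np.pi, np.pi, 9).reshape(-1, 1)
print(np.c_[np.sin(x_test).ravel(), net.predict(x_test)])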