Analyze the complexity of the problem:
1. theoretical complexity, in order to determine whether a classical approach is sufficient: intrinsic complexity, number of variables, number of constraints, type of costs to minimize;
2. practical complexity: computation time required for evaluating a can-
didate solution, constraints on the global resolution time, requirements
on the quality of the solution.
Define how the method is to be used (automatic, semi-automatic with
tunable parameters, etc.) and assess the degree of skill of the users.
Assess precisely the requirements on the quality of the solution; for instance, if the problem data are corrupted by noise, it is not necessary to get as close as possible to the optimum.
Assess the available development time.
If the requirement on the quality of the solution is demanding, and if automatic operation is required, annealing algorithms are powerful. Tabu search generally requires a longer development time, but in some cases it can provide better results than simulated annealing. Genetic algorithms require a very long development time, and their internal parameters must be tuned anew for each problem instance. Recurrent neural networks are better suited to medium-size problems, where the resolution time matters more than the quality of the solution.
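To make the comparison above concrete, the following is a minimal sketch of simulated annealing with a geometric cooling schedule; the cost function, neighborhood move, and all parameter values (initial temperature, cooling factor, iteration count) are illustrative assumptions, not prescriptions from this chapter.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, alpha=0.95, iters=2000):
    """Minimize `cost` starting from `x0`.

    cost:     maps a candidate solution to a real-valued cost
    neighbor: returns a random neighbor of a candidate solution
    t0:       initial temperature (assumed value, problem-dependent)
    alpha:    geometric cooling factor, 0 < alpha < 1
    """
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x)
        fy = cost(y)
        # Always accept an improvement; accept a degradation with
        # the Boltzmann probability exp(-(fy - fx) / t).
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha  # geometric cooling schedule
    return best, fbest

# Toy continuous example: minimize f(x) = (x - 3)^2.
random.seed(0)
sol, val = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    x0=0.0,
)
```

On a real combinatorial problem, `neighbor` would perform an elementary move (e.g. swapping two elements of a permutation), and the tuning of `t0` and `alpha` is precisely the development effort the comparison above refers to.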
To reduce the resolution time while producing good solutions, a hybrid approach is often the best choice, but at the cost of a development time that may be substantial.
To summarize, when facing a combinatorial problem, the comparison of the performances of the different metaheuristics is difficult and must be performed accurately and rigorously. A presentation of the frequently encountered pitfalls, and a sound evaluation methodology, are provided in [Barr et al. 1995; Hooker 1995; Rardin et al. 2001].
References
1. Aarts E., Korst J. [1989], Simulated Annealing and Boltzmann Machines: a Stochastic Approach to Combinatorial Optimization and Neural Computing, John Wiley & Sons, 1989
2. Aarts E., Lenstra J.K. [1997], Local Search in Combinatorial Optimization, John Wiley & Sons, 1997
3. Ackley D.H., Hinton G.E., Sejnowski T.J. [1985], A learning algorithm for Boltzmann machines, Cognitive Science, 9, pp 147-169, 1985
4. Asai H., Onodera K., Kamio T., Ninomiya H. [1995], A study of Hopfield neural networks with external noise, 1995 IEEE International Conference on Neural Networks Proceedings, New York, USA, vol. 4, pp 1584-1589
5. Ayer S.V.B. et al. [1990], A theoretical investigation into the performance of the Hopfield model, IEEE Transactions on Neural Networks, vol. 1, pp 204-215, June 1990