1. Make the Differential Evolution program adaptive, that is, allow algorithm parameters
themselves to be modified during the course of a run. This might make results
less sensitive to tuning of parameters such as CrossProbability. (A sketch of one
such scheme appears after this list.)
2. Alternatively, develop a better understanding of how to select algorithm parameters
in a problem-specific manner. Our experience has been that the cross probability
should usually be set around 0.9 (which is quite high compared to what is typical
for continuous optimization). It would be useful to have a more refined understanding
of this and other tuning issues. (A small experiment along these lines is sketched
below.)
3. Figure out how to sensibly alter parameters over the course of the algorithm, not by
evolution but rather by some other measure, say iteration count. For example, one
might do well to start off with a fairly even crossover (near 0.5, that is), and have
it either rise toward 1 or drop toward 0 as the algorithm progresses. Obviously
it is not hard to code Differential Evolution to do this (see the sketch below). What
might be interesting research is to better understand when and how such a progression
of algorithm parameters could improve performance.
4. Implement a two-level version of Differential Evolution, wherein several short runs
are used to generate initial values for a longer run (sketched below).
5. Use Differential Evolution in a hybridized form, say, with intermediate steps of
local improvement. This would involve modifying chromosomes “in place”, so
that improvements are passed along to subsequent generations. We showed a very
basic version of this, but surely there are improvements to be found. (A sketch
follows the list.)
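
For item 1, the following is a minimal sketch, in Mathematica, of the kind of self-adaptation studied by Brest et al. [1]: each population member carries its own scaling factor and cross probability, both are re-randomized with small probability, and new settings survive only when the trial vector they produce wins. The function name, the constants, and the test function in the usage line are illustrative choices of ours, not code from the text.

jDEMinimize[f_, dim_, {lo_, hi_}, np_, maxGen_] :=
 Module[{pop, vals, fs, crs, tau = 0.1},
  pop = RandomReal[{lo, hi}, {np, dim}];
  fs = RandomReal[{0.1, 0.9}, np];  (* per-member scaling factors *)
  crs = RandomReal[{0, 1}, np];     (* per-member cross probabilities *)
  vals = f /@ pop;
  Do[
   Module[{fNew, crNew, a, b, c, jr, trial, tval},
    (* occasionally re-randomize this member's parameters *)
    fNew = If[RandomReal[] < tau, RandomReal[{0.1, 0.9}], fs[[i]]];
    crNew = If[RandomReal[] < tau, RandomReal[], crs[[i]]];
    {a, b, c} = RandomSample[Delete[Range[np], i], 3];
    jr = RandomInteger[{1, dim}];  (* position of forced crossover *)
    trial = Table[
      If[RandomReal[] < crNew || j == jr,
       Clip[pop[[a, j]] + fNew (pop[[b, j]] - pop[[c, j]]), {lo, hi}],
       pop[[i, j]]],
      {j, dim}];
    tval = f[trial];
    (* the adapted parameters persist only if the trial wins *)
    If[tval <= vals[[i]],
     pop[[i]] = trial; vals[[i]] = tval;
     fs[[i]] = fNew; crs[[i]] = crNew]],
   {gen, maxGen}, {i, np}];
  {Min[vals], pop[[First@Ordering[vals, 1]]]}]

jDEMinimize[Total[#^2] &, 5, {-5., 5.}, 30, 200]  (* 5D sphere test *)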
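
For item 2, problem-specific tuning can at least be probed empirically. Below is a small experiment comparing settings of the CrossProbability option of NMinimize's built-in DifferentialEvolution method on the two-dimensional Rastrigin function; the sweep values, the seed, and the choice of test function are our assumptions.

rastrigin[x_, y_] := 20 + x^2 + y^2 - 10 Cos[2 Pi x] - 10 Cos[2 Pi y];
(* compare the minima found under three crossover settings *)
Table[{cp,
   First[NMinimize[{rastrigin[x, y], -5 <= x <= 5, -5 <= y <= 5}, {x, y},
     Method -> {"DifferentialEvolution", "CrossProbability" -> cp,
       "RandomSeed" -> 0}]]},
  {cp, {0.1, 0.5, 0.9}}]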
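
For item 3, here is a sketch of a deterministic schedule driven purely by the iteration count: the cross probability ramps linearly from an even 0.5 up toward 0.9 over the run. The ramp endpoints and direction are illustrative assumptions; a decreasing schedule is the obvious variant to try.

scheduledDE[f_, dim_, {lo_, hi_}, np_, maxGen_] :=
 Module[{pop, vals, cr, fscale = 0.7},
  pop = RandomReal[{lo, hi}, {np, dim}];
  vals = f /@ pop;
  Do[
   (* the schedule: CR depends only on the generation counter *)
   cr = 0.5 + 0.4 (gen - 1)/(maxGen - 1);
   Do[
    Module[{a, b, c, jr, trial, tval},
     {a, b, c} = RandomSample[Delete[Range[np], i], 3];
     jr = RandomInteger[{1, dim}];
     trial = Table[
       If[RandomReal[] < cr || j == jr,
        Clip[pop[[a, j]] + fscale (pop[[b, j]] - pop[[c, j]]), {lo, hi}],
        pop[[i, j]]],
       {j, dim}];
     tval = f[trial];
     If[tval <= vals[[i]], pop[[i]] = trial; vals[[i]] = tval]],
    {i, np}],
   {gen, maxGen}];
  {Min[vals], pop[[First@Ordering[vals, 1]]]}]

scheduledDE[Total[#^2] &, 5, {-5., 5.}, 30, 200]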
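
For item 4, NMinimize's DifferentialEvolution method accepts an "InitialPoints" option, which makes a two-level scheme straightforward to prototype: several short runs from different random seeds supply candidate points, and these seed one longer run. The run lengths, the seeds, and the random padding of the initial population are illustrative assumptions.

f[x_, y_] := 20 + x^2 + y^2 - 10 Cos[2 Pi x] - 10 Cos[2 Pi y];
(* level one: ten short, cheap runs; Quiet suppresses the expected
   warnings about hitting the iteration limit *)
shortBests = Table[
   {x, y} /. Last[Quiet[NMinimize[{f[x, y], -5 <= x <= 5, -5 <= y <= 5},
       {x, y}, Method -> {"DifferentialEvolution", "RandomSeed" -> s},
       MaxIterations -> 10]]],
   {s, 0, 9}];
(* level two: a longer run seeded with the winners, padded with
   random points to give a reasonably sized starting population *)
NMinimize[{f[x, y], -5 <= x <= 5, -5 <= y <= 5}, {x, y},
 Method -> {"DifferentialEvolution",
   "InitialPoints" -> Join[shortBests, RandomReal[{-5, 5}, {20, 2}]]}]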
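
For item 5, the sketch below interleaves plain DE generations with occasional local polishing: every polishEvery generations the current best chromosome is refined with FindMinimum and written back into the population in place, so the improvement is inherited by later generations (the memetic idea surveyed by Krasnogor and Smith [6]). The polish interval and other constants are illustrative, and the objective must tolerate symbolic arguments so that FindMinimum can work with it.

memeticDE[f_, dim_, {lo_, hi_}, np_, maxGen_, polishEvery_] :=
 Module[{pop, vals, vars, z, cr = 0.9, fscale = 0.7},
  vars = Array[z, dim];
  pop = RandomReal[{lo, hi}, {np, dim}];
  vals = f /@ pop;
  Do[
   Do[(* one standard DE/rand/1/bin update *)
    Module[{a, b, c, jr, trial, tval},
     {a, b, c} = RandomSample[Delete[Range[np], i], 3];
     jr = RandomInteger[{1, dim}];
     trial = Table[
       If[RandomReal[] < cr || j == jr,
        Clip[pop[[a, j]] + fscale (pop[[b, j]] - pop[[c, j]]), {lo, hi}],
        pop[[i, j]]],
       {j, dim}];
     tval = f[trial];
     If[tval <= vals[[i]], pop[[i]] = trial; vals[[i]] = tval]],
    {i, np}];
   (* in-place local improvement of the current best member *)
   If[Mod[gen, polishEvery] == 0,
    Module[{best, fm, rules},
     best = First@Ordering[vals, 1];
     {fm, rules} = FindMinimum[f[vars], Transpose[{vars, pop[[best]]}]];
     If[fm < vals[[best]],
      pop[[best]] = Clip[vars /. rules, {lo, hi}];
      vals[[best]] = f[pop[[best]]]]]],
   {gen, maxGen}];
  {Min[vals], pop[[First@Ordering[vals, 1]]]}]

memeticDE[Total[#^2 - 10 Cos[2 Pi #] + 10] &, 5, {-5.12, 5.12}, 30, 100, 10]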
We remark that some ideas related to item 2 above are explored in [5]. Issues of
self-adaptive tuning of Differential Evolution are discussed in some detail in [1]. A nice
exposition of early efforts along these lines, for genetic algorithms, appears in [3].
References
1. Brest, J., Greiner, S., Boskovic, B., Mernik, M., Zumer, V.: Self-adapting control parame-
ters in differential evolution: a comparative study on numerical benchmark problems. IEEE
Trans. Evol. Comput. 10, 646-657 (2006)
2. Gisvold, K., Moe, J.: A method for nonlinear mixed-integer programming and its application
to design problems. J. Eng. Ind. 94, 353-364 (1972)
3. Goldberg, D.: Genetic Algorithms in Search, Optimization and Machine Learning. Addison-
Wesley Longman Publishing Co., Inc., Boston (1989)
4. Goldberg, D., Lingle, R.: Alleles, loci, and the traveling salesman problem. In: Proceedings
of the 1st International Conference on Genetic Algorithms, pp. 154-159. Lawrence Erlbaum
Associates, Inc., Mahwah (1985)
5. Jacob, C.: Illustrating Evolutionary Computation with Mathematica. Morgan Kaufmann Pub-
lishers Inc., San Francisco (2001)
6. Krasnogor, N., Smith, J.: A tutorial for competent memetic algorithms: model, taxonomy
and design issues. IEEE Trans. Evol. Comput. 9, 474-488 (2005)
7. Lichtblau, D.: Discrete optimization using Mathematica. In: Proceedings of the World Con-
ference on Systemics, Cybernetics, and Informatics. International Institute of Informatics
and Systemics, vol. 16, pp. 169-174 (2000)