4 Biased Mutation for Evolution Strategies
The Gaussian mutation operator of ES produces unbiased mutations. But biasing mutations toward particular directions can be a useful degree of freedom, in particular when controlled self-adaptively. This chapter introduces our biased mutation operator (BMO) [77], [78] with its self-adaptive bias control, together with a couple of variants. Among the variants is the descent direction mutation operator (DMO), which is based on the descent direction between the centers of two successive populations. We prove that biased Gaussian mutation is superior to unbiased mutation on monotone functions as long as the bias points into the descent direction. Various tests revealed successful results, in particular on ridge functions and in constrained search domains, but also on some multimodal problems. In the past, various self-adaptive mutation operators have been proposed for ES, i.e. for optimization in numerical search domains. They range from self-adaptive uncorrelated isotropic Gaussian mutation [113], [131] to the derandomized step size control of the covariance matrix adaptation [54].
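To make the claim about monotone functions tangible, the following minimal sketch compares unbiased Gaussian mutations with mutations shifted by a bias vector pointing into the descent direction on the linear function f(x) = sum(x). The function, the bias magnitude, and all parameter names are illustrative assumptions, not the operator definition given later in this chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
N, samples, sigma = 10, 100_000, 0.1

def f(x):
    # monotone (linear) test function; the descent direction is -1 in every coordinate
    return np.sum(x, axis=-1)

x = np.zeros(N)                                # current parent
z = rng.normal(0.0, sigma, size=(samples, N))  # unbiased Gaussian mutations
b = -0.5 * sigma * np.ones(N)                  # hypothetical bias pointing into the descent direction

unbiased = f(x + z)
biased = f(x + z + b)

# biased offspring are better on average and improve the parent more often
print("mean f, unbiased:", unbiased.mean())
print("mean f, biased:  ", biased.mean())
print("P(improvement), unbiased:", np.mean(unbiased < f(x)))
print("P(improvement), biased:  ", np.mean(biased < f(x)))
```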
This chapter is structured as follows. First, we recapitulate standard mutation operators and some well-known variants for ES in section 4.1. Afterwards, section 4.2 introduces the BMO, a self-adaptive biased mutation operator for ES, with the coefficient vector ξ and its self-adaptation mechanism. We introduce a number of variants: the sphere BMO (sBMO) with N bias coefficients but only one step size, and the cube BMO (cBMO) with one bias coefficient and N step sizes. The notion of a constant direction of a population on monotone parts of a function leads to the idea of adapting the bias to successive population centers: the DMO estimates the descent direction from the centers of two successive populations and aligns the bias with it.
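As a rough sketch of how such a biased mutation and its self-adaptive bias control could look, the snippet below assumes a log-normal step size update, a coefficient vector ξ ∈ [-1, 1]^N scaled by the step sizes, and a DMO-style bias taken from the difference of two successive population centers. The learning rates and the clipping rule are assumptions for illustration, not the exact definitions of section 4.2.

```python
import numpy as np

rng = np.random.default_rng(1)

def bmo_mutation(x, sigma, xi, tau=None, gamma=0.1):
    """Sketch of a biased mutation step: self-adapt step sizes and bias
    coefficients, then mutate x with a Gaussian shifted by the bias sigma * xi."""
    N = len(x)
    tau = tau or 1.0 / np.sqrt(N)                                  # assumed learning rate
    sigma_new = sigma * np.exp(tau * rng.normal(size=N))           # log-normal step size update
    xi_new = np.clip(xi + gamma * rng.normal(size=N), -1.0, 1.0)   # mutate bias coefficients
    bias = xi_new * sigma_new                                      # bias vector b = xi * sigma
    x_new = x + sigma_new * rng.normal(size=N) + bias              # biased Gaussian mutation
    return x_new, sigma_new, xi_new

def dmo_bias(center_prev, center_curr):
    """DMO-style bias: descent direction estimated from the difference
    of two successive population centers."""
    return center_curr - center_prev
```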
All variants are tested and evaluated on a set of test problems in section 4.5. The biased mutation achieves the greatest improvements on the tested ridge functions and on constrained problems; on multimodal problems it yields only slight improvements.
4.1 Mutation Operators for Evolution Strategies
For a comprehensive introduction to ES, we refer to chapter 2.2 and to Beyer and Schwefel [18]. Here it is important to keep in mind that an individual consists of the vector of objective variables and a set of strategy parameters, i.e. the step sizes of the Gaussian mutation.
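A minimal sketch of the standard self-adaptive Gaussian mutation recapitulated here, with an individual carrying object variables x and step sizes sigma and the usual log-normal update, is given below. The choice of learning rates follows common recommendations and is an assumption of this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

def gaussian_mutation(x, sigma):
    """Standard self-adaptive Gaussian mutation: mutate the step sizes
    log-normally, then mutate the object variables with the new step sizes."""
    N = len(x)
    tau0 = 1.0 / np.sqrt(2.0 * N)             # global learning rate (assumed setting)
    tau1 = 1.0 / np.sqrt(2.0 * np.sqrt(N))    # coordinate-wise learning rate (assumed setting)
    global_factor = np.exp(tau0 * rng.normal())
    sigma_new = sigma * global_factor * np.exp(tau1 * rng.normal(size=N))
    x_new = x + sigma_new * rng.normal(size=N)
    return x_new, sigma_new
```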
 