FIGURE 7.4 Example of mutation in an EA.
However, the introduction of mutation can push EAs towards a global solution. Through the alteration of one or more parts of the chromosome, mutation introduces diversity into the selected population that can potentially breed fitter solutions and allow the EA to find the global solution. Figure 7.4 presents an example of mutation applied to a bitstring, which is a typical representation used in EAs; other data representations require different mutation operators, so see Michalewicz (1996) for more details.
In practice, the mutation rate is generally a probability that has been determined by initial experimentation. Too high a mutation rate will introduce too much variability, although this can produce good results quickly. Too low a rate may be insufficient for breeding a fit solution unless the individual is already well adapted.
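To make the role of the mutation rate concrete, a minimal Python sketch of bit-flip mutation on a bitstring chromosome is given below; the function name, the example chromosome and the rate of 0.1 are purely illustrative and are not drawn from any particular EA library.

    import random

    def bit_flip_mutation(chromosome, mutation_rate=0.01):
        # Flip each bit independently with probability mutation_rate.
        # A low rate leaves most of the parent intact; a high rate
        # introduces much more variability into the offspring.
        return [1 - gene if random.random() < mutation_rate else gene
                for gene in chromosome]

    # Example: mutate a 10-bit individual with a rate of 0.1
    parent = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    offspring = bit_flip_mutation(parent, mutation_rate=0.1)
    print(parent, offspring)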
7.4.6 Single- versus Multiple-Objective Optimisation
As mentioned previously, EAs are often used to calibrate models by searching for optimal model parameters. If model performance is determined through a single goodness-of-fit measure, the calibration is called a single-objective optimisation problem. However, in many situations, a single performance measure is not sufficient, and model performance is judged by multiple criteria that may reflect conflicting objectives. These are termed multi-objective optimisation problems, and there is no unique solution that simultaneously optimises all objectives. Thus, the resulting outcome is a set of optimal solutions that have varying degrees of trade-off between the different objectives. Graphically, these optimal solutions lie on a curve called the Pareto-optimal front (Deb 2009). They are also referred to as non-dominated solutions because all of the solutions on the front are equally optimal. One of the most common ways of handling multi-objective optimisation is to apply a weight to each individual objective function and then to combine them additively, thereby transforming the problem into a single-objective optimisation problem. More information on solving multi-objective optimisation problems can be found in Abraham et al. (2006) and Deb (2009).
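As a concrete illustration of the weighted-sum approach mentioned above, the short Python sketch below combines two goodness-of-fit measures into a single score; the objective names, values and equal weights are hypothetical, and the objectives are assumed to be on comparable scales (or normalised beforehand).

    def weighted_sum(objective_values, weights):
        # Combine several objective values into one score by weighting
        # and adding them, so that a standard single-objective EA can
        # optimise the combined measure.
        return sum(w * f for w, f in zip(weights, objective_values))

    # Example: two conflicting error measures combined with equal weights
    rmse, volume_error = 0.42, 0.10   # hypothetical goodness-of-fit values
    combined_score = weighted_sum([rmse, volume_error], weights=[0.5, 0.5])
    print(combined_score)

Different weight choices favour different trade-offs between the objectives, so the weights are usually set to reflect the relative importance attached to each criterion.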
7.5 EA RESOURCES AND SOFTWARE
There are large amounts of information available via online resources, academic journals and ref-
erence topics (see the reference list) for learning more about EAs and for writing your own code
or high-level programs. A basic introduction to GAs with interactive Java applets can be found at
http://www.obitko.com/tutorials/genetic-algorithms/ (Obitko 1998), while a field guide to GP, which
contains useful information and links, can be found at http://dces.essex.ac.uk/staff/rpoli/gp-field-guide/ (Poli et al. 2008). There are freely available off-the-shelf packages such as the Java Genetic Algorithms Package (JGAP) (Meffert and Rotstan 2012), or for those interested in programming, a useful starting point is http://sourceforge.net/directory/development/algorithms/genetic-algorithms/ (SOURCEFORGE 2014), which provides a list of programs in different coding environments. Open-source Java software for assessing EAs with a focus on data mining can be found at http://www.keel.es/ (KEEL 2004), while an object-orientated platform (in Java) for implementing EAs can be found at http://watchmaker.uncommons.org/ (Dyer 2006). If you are fluent in MATLAB, a GA toolbox developed by the University of Sheffield is available at http://codem.group.shef.ac.uk/index.php/ga-toolbox (CoDeM 2013), while there is also a GA package available for R (Scrucca 2013). Finally, the JNEAT (NeuroEvolution of Augmenting Topologies) modelling platform, which can be used to evolve neural networks using EAs, is freely available to download from http://www.cs.utexas.edu/users/ai-lab/?jneat (Stanley and Vierucci 2002).