The most evident difference between omni-aiNet and DT omni-optimizer is
that the latter works with a population of fixed size, while omni-aiNet can ad-
just the number of individuals according to each problem and to the suppression
threshold defined by the user. This characteristic gives the search engine more flexibility, since the algorithm automatically adjusts the number of individuals, leading to a better allocation of computational resources.
Although both algorithms use polynomial mutation as one of their mechanisms of genetic variability, the probability of activating this mechanism is much smaller in the DT omni-optimizer than in omni-aiNet, since the latter adopts polynomial hypermutation as its main mechanism of genetic variability. Also, omni-aiNet automatically determines the parameter η according to the ranking of each individual, while in the DT algorithm this parameter is defined by the user. The remaining mechanisms of genetic variability also differ: the DT algorithm applies crossover between individuals in the population, while omni-aiNet applies gene duplication.
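As a concrete illustration of the variation operator that both algorithms share, the following is a minimal sketch of polynomial mutation for a single real-valued gene. The function name and bounds handling are our own illustrative choices, not code taken from either implementation; only the distribution-index formula follows the standard operator.

```python
import random

def polynomial_mutation(x, lower, upper, eta):
    """Polynomial mutation of one real-valued gene.

    eta is the distribution index: larger values of eta keep the
    mutated value closer to the parent value x.
    """
    u = random.random()
    if u < 0.5:
        delta = (2.0 * u) ** (1.0 / (eta + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta + 1.0))
    y = x + delta * (upper - lower)
    # Clip the mutant back into the variable bounds.
    return min(max(y, lower), upper)
```

The point of contrast in the text is who sets eta: in omni-aiNet it would be derived from the individual's ranking, while in the DT omni-optimizer it is a user-supplied parameter.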
The last main difference between the two algorithms lies in the way each one treats the diversity and spacing of solutions in both the variable and objective spaces. While omni-aiNet provides the mechanisms of Suppression, Random Insertion and Grid, described in Sections 4.2 to 4.4, the DT algorithm uses a Crowding Distance metric to select the individuals with the greatest distances to their neighbours in variable and objective space (further information about this metric and procedure can be found in [8]).
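For reference, the crowding-distance computation can be sketched as below. This is the standard NSGA-II-style calculation over a set of objective vectors, written by us for illustration; it is not code from [8], and the DT omni-optimizer additionally applies the same idea in variable space.

```python
def crowding_distance(points):
    """Crowding distance for each point in a list of objective vectors.

    Boundary points on each objective receive infinite distance;
    interior points accumulate the normalized gap between their
    two neighbours along every objective.
    """
    n = len(points)
    if n == 0:
        return []
    m = len(points[0])
    dist = [0.0] * n
    for k in range(m):
        # Indices sorted by the k-th objective value.
        order = sorted(range(n), key=lambda i: points[i][k])
        fmin = points[order[0]][k]
        fmax = points[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float('inf')
        if fmax == fmin:
            continue
        for j in range(1, n - 1):
            gap = points[order[j + 1]][k] - points[order[j - 1]][k]
            dist[order[j]] += gap / (fmax - fmin)
    return dist
```

Individuals with larger crowding distance lie in sparser regions of the front and are preferred during selection, which is how the DT algorithm maintains spacing without omni-aiNet's suppression and grid mechanisms.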
Both algorithms expose the same number of parameters to be adjusted by the user. omni-aiNet demands the proper tuning of the initial population size, number of generations, number of generations between suppressions, number of clones per individual, number of randomly generated individuals, suppression threshold and δ; while the DT omni-optimizer requires the definition of the initial population size, number of generations, distribution index for crossover, probability of crossover, distribution index for mutation, probability of mutation and δ. More information about these parameters can be found in [8].
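The two parameter lists can be summarized side by side as plain configuration records. The identifiers and values below are purely illustrative placeholders chosen by us; neither implementation uses these names or defaults.

```python
# Hypothetical parameter sets mirroring the seven user-tuned
# parameters each algorithm requires (values are placeholders).
OMNI_AINET_PARAMS = {
    "initial_population_size": 100,
    "generations": 500,
    "generations_between_suppressions": 5,
    "clones_per_individual": 10,
    "random_individuals": 4,
    "suppression_threshold": 0.05,
    "delta": 0.001,
}

DT_OMNI_OPTIMIZER_PARAMS = {
    "initial_population_size": 100,
    "generations": 500,
    "crossover_distribution_index": 15,
    "crossover_probability": 0.9,
    "mutation_distribution_index": 20,
    "mutation_probability": 0.1,
    "delta": 0.001,
}
```

Both records hold seven entries, matching the text's observation that the two algorithms impose the same tuning burden on the user.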
6 Experimental Results
This section presents the results of preliminary experiments with the omni-aiNet algorithm. Special attention is devoted to multi-objective problems; single-objective instances (uni-global and multi-global) are included only to illustrate the ability to perform omni-optimization.
For the multi-objective problems, the omni-aiNet algorithm was compared to the original version of the omni-optimizer algorithm proposed by Deb and Tiwari [8] (DT omni-optimizer) and kindly provided by the authors; the comparative results are presented in Subsections 6.3 and 6.4. Since Deb and Tiwari's software package does not report the number of fitness evaluations per iteration, the comparison is based on the capability to reproduce the Pareto front; whenever an equivalence exists between parameters, they receive the same settings. For the non-equivalent parameters, the DT omni-optimizer