Setting Up Particle Swarm Optimization
by Decision Tree Learning Out of Function Features
Tjorben Bogon¹, Georgios Poursanidis², Andreas D. Lattner², and Ingo J. Timm¹
¹ Business Information System I, University of Trier, Trier, Germany
² Information Systems and Simulation, Goethe University Frankfurt, Frankfurt, Germany
bogon@uni-trier.de
Abstract. This work describes an approach for computing function features from optimization functions in order to train a decision tree. This decision tree is used to identify adequate parameter settings for Particle Swarm Optimization (PSO). The function features describe different characteristics of the fitness landscape of the underlying function. We distinguish three types of features: the first type provides a short overview of the whole search space, the second gives a more detailed view of a specific range of the search space, and the remaining features test an artificial PSO behavior on the function. With these features it is possible to classify fitness functions and to identify a parameter set which leads to an equal or better optimization process compared to the standard parameter set for Particle Swarm Optimization.
Keywords. Particle swarm optimization, Machine learning, Swarm intelligence,
Parameter configuration, Objective function feature computation.
1 Introduction
Metaheuristics based on stochastic local search are used for numerical optimization problems in high-dimensional spaces. Depending on the type of mathematical function, different optimization techniques vary in their optimization behavior [16]. A key characteristic of these metaheuristics is their parameter configuration [6]. These parameters are essential for an efficient optimization behavior of the metaheuristic, but they also depend on the objective function. An efficient parameter set influences both the speed and the performance of the optimization: if a good parameter set is selected, an adequate solution is found faster than with a poor configuration of the metaheuristic. The choice of parameters is typically based on the user's experience and domain knowledge or on empirical results reported in the literature. Such parameter settings, called standard configurations, yield adequate, though not optimal, optimization behavior for most objective functions. One example of such a metaheuristic is Particle Swarm Optimization (PSO).
PSO was introduced by [5] and is a population-based optimization technique for continuous, high-dimensional search spaces. PSO consists of a swarm of particles which "fly" through the search space and update their positions by taking into account their own best position found so far and, depending on the topology, the best position found in their neighborhood or by the entire swarm.
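To make this update rule concrete, the following minimal Python sketch shows one iteration of the canonical global-best PSO variant. It is not the implementation used in this work; the parameter values w = 0.7298 and c1 = c2 = 1.4962 are only one frequently cited standard configuration and are assumed here purely for illustration.

import numpy as np

def pso_step(positions, velocities, personal_best, global_best,
             w=0.7298, c1=1.4962, c2=1.4962):
    # One iteration of global-best PSO (illustrative sketch, not the
    # authors' implementation). positions, velocities and personal_best
    # have shape (n_particles, dim); global_best has shape (dim,).
    n, dim = positions.shape
    r1 = np.random.rand(n, dim)  # stochastic weights for the cognitive term
    r2 = np.random.rand(n, dim)  # stochastic weights for the social term

    # Velocity update: inertia + attraction to the particle's own best
    # position + attraction to the best position found by the swarm.
    velocities = (w * velocities
                  + c1 * r1 * (personal_best - positions)
                  + c2 * r2 * (global_best - positions))

    # Position update: particles "fly" to their new locations.
    positions = positions + velocities
    return positions, velocities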