by other particles. PSO is an instance of the parameter configuration problem: if the parameters are well chosen, the swarm finds an adequate minimum and focuses on this solution, reducing its velocity while it searches the continuous space around the found solution for better values. With a poorly chosen parameter set, this exploitation may take place at a local optimum from which the swarm cannot escape. Conversely, if the particles move too fast and never focus, they will never find the global optimum either. This swarm behavior depends mainly on the chosen parameters and leads to solutions of differing quality.
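The following minimal sketch shows where these parameters enter, assuming the common textbook inertia-weight formulation of global-best PSO (the parameter values are conventional defaults, not the configurations studied in this work): the inertia weight w scales the current velocity, while c1 and c2 weight the pull toward each particle's own best position and the swarm's best position.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update of global-best PSO.

    x, v      : (n_particles, dim) positions and velocities
    pbest     : (n_particles, dim) each particle's best position so far
    gbest     : (dim,) best position found by the whole swarm
    w, c1, c2 : inertia weight and cognitive/social coefficients;
                a large w keeps particles fast (exploration), a small w
                lets the swarm slow down and focus (exploitation).
    """
    r1 = np.random.rand(*x.shape)  # per-particle random scaling
    r2 = np.random.rand(*x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```

In this formulation the trade-off described above is visible directly: shrinking w (or c2 relative to c1) makes the swarm contract around the best known solution, while large values keep the particles moving through the search space.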
One problem in choosing the right parameters without knowledge of the objective function is to identify relevant characteristics of the function that can be used for a comparison among functions. The underlying assumption is that, e.g., a function f1(x) = x^2 and a function f2(x) = 3x^2 + 2 exhibit similar optimization behavior if the same parameter set for a Particle Swarm Optimization is used; since f2 is a monotone affine transformation of f1, both share the same minimizer. In order to choose a promising parameter set, functions must be comparable with respect to certain objective function characteristics.
We describe an approach to computing features of the objective function by observing the swarm behavior. For each function we seek a parameter set that performs better than the standard configuration and provide this set as the output class for supervised learning. These data allow us to train a C4.5 decision tree [11] as a classifier that computes an adequate configuration for the Particle Swarm Optimization from the function features. Experimental trials show that our decision tree classifies functions into the correct classes in many cases. This classification can be used to select promising parameter sets for which the Particle Swarm Optimization is expected to perform better than with the standard configuration.
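To illustrate this pipeline, the sketch below trains a decision tree on per-function feature vectors and uses it to pick a parameter class for an unseen function. It uses scikit-learn's DecisionTreeClassifier (a CART implementation) as a readily available stand-in for C4.5; the feature values and class labels are invented placeholders, not data from this work.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: one row of swarm-behavior features per
# benchmark function (the concrete features are defined in Section 3),
# labelled with the parameter set that beat the standard configuration.
X = [
    [0.12, 3.4, 0.81],   # features observed on function 1
    [0.55, 1.1, 0.23],   # features observed on function 2
    [0.09, 3.9, 0.77],   # features observed on function 3
]
y = ["paramset_A", "paramset_B", "paramset_A"]  # output classes

clf = DecisionTreeClassifier().fit(X, y)

# For an unseen objective function, its observed features select a
# promising PSO parameter set:
print(clf.predict([[0.11, 3.5, 0.80]]))  # e.g. ["paramset_A"]
```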
This work is structured as follows: Section 2 describes other approaches, points out the problem of computing good parameter sets for a metaheuristic, and explains Particle Swarm Optimization. Section 3 describes how to compute the features of a function and thereby make functions comparable; after the feature computation, we describe our experimental setup and the way the decision tree is built. Section 4 contains our experimental results for building the parameter classes used to select promising parameter sets in PSO. The last section discusses our results and describes issues for future work.
2 Parameter Settings in Metaheuristics
The main difference between solving a problem with exact methods and with metaheuristics is the quality of the solution. Metaheuristics, for example nature-inspired metaheuristics [2], offer no guarantee of finding the global optimum. They focus on the point in the multidimensional search space that yields the best fitness value according to the experience gathered during the optimization so far; this point can also be a local optimum. The advantage of a metaheuristic, however, is that it finds an adequate solution to a multidimensional continuous optimization problem in reasonable time [13]. This performance depends on the configuration of the metaheuristic, which makes parameter settings an important aspect of using metaheuristics. One group of metaheuristics are the population-based metaheuristics: [13] defines them as nature-inspired heuristics that handle more than one solution at a time. With every iteration, all solutions are recomputed.
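To make this definition concrete, the following is a minimal, generic sketch of such a loop, not any specific algorithm from the literature: a whole population of candidate solutions is perturbed and re-evaluated in every iteration. The Gaussian perturbation stands in for the algorithm-specific variation step (velocity updates in PSO, recombination and mutation in evolutionary algorithms).

```python
import numpy as np

def population_metaheuristic(f, dim, n_pop=30, n_iter=100, bounds=(-5.0, 5.0)):
    """Generic skeleton of a population-based metaheuristic."""
    lo, hi = bounds
    pop = np.random.uniform(lo, hi, (n_pop, dim))  # initial population
    best = min(pop, key=f)
    for _ in range(n_iter):
        # Placeholder variation step; concrete algorithms differ in how
        # new candidates are generated from the current population.
        pop = np.clip(pop + np.random.normal(0.0, 0.1, pop.shape), lo, hi)
        cand = min(pop, key=f)           # re-evaluate all solutions
        if f(cand) < f(best):
            best = cand.copy()
    return best

# Example: minimize the sphere function f(x) = sum(x_i^2) in 2 dimensions.
best = population_metaheuristic(lambda x: float(np.sum(x**2)), dim=2)
```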
 