Usually, only a few lines of program code are needed to define the algorithm and the search objectives.
Initialization of the particle swarm algorithm starts with the random generation of particles, which at this stage represent potential solutions to the problem. During the search for the final optimal solution, these initial solutions are improved by updating the particle values in each generation, without using evolutionary operators such as crossover and mutation. Throughout the search process, the particles fly through the solution space towards the current pbest, changing their velocities after each evaluation step.
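The update scheme described above can be sketched in a few lines of code. The following is a minimal, illustrative implementation; the objective function, swarm size, and coefficient values (inertia weight and acceleration constants) are assumptions chosen for the example, not taken from the text, and the sketch tracks both a personal best (pbest) and a global best (gbest) as in the standard formulation of the algorithm.

```python
import random

def sphere(x):
    """Example objective to minimize: f(x) = sum(x_i^2), optimum at the origin."""
    return sum(xi * xi for xi in x)

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Random initialization: each particle is a potential solution.
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # best position found by each particle
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia plus pulls towards pbest and gbest.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:         # improved personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:        # improved global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(sphere)
print(best_val)  # typically very close to 0 for the sphere function
```

Note that the only "evolutionary" mechanism here is the velocity update itself: no crossover or mutation operators appear, which is the point made in the text.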
It is interesting to add that the concept of particle swarm optimization was worked out by its inventors through observation of bird flocking and fish schooling behaviour, in an attempt to simulate birds seeking food through the social cooperation of neighbouring birds.
Presently, the main application of particle swarm optimization is in solving constrained optimization problems, such as the optimization of nonlinear functions (Hu and Eberhart, 2002a), multiobjective optimization (Hu and Eberhart, 2002b), dynamic tracking, etc. He et al. (1998) have even shown how to extract rules from fuzzy-neural networks using the particle swarm optimization approach. In the meantime, the term swarm engineering, dealing with multi-agent systems, was also coined (Kazadi, 2000).
Finally, some useful information about the development trends in this area of
research can be found in the special issue on particle swarm optimization, IEEE
Transactions on Evolutionary Computation (June, 2004).
10.2 Support Vector Machines
Over the last decade or so, increased attention has been paid to support vector machines, which are based on the computational approach termed the principle of structural risk minimization, formulated by Vapnik (1992). This principle is of fundamental relevance to statistical learning theory and represents an innovative methodology for the development of neural networks (Vapnik, 1995, 1998) for applications in function approximation, regression estimation, and signal processing (Vapnik et al., 1996). The applications have also been extended to include pattern recognition (Burges, 1998), time series forecasting (Cao, 2003), and prediction (Muller et al., 1997).
Originally, support vector machines were designed for solving pattern recognition problems by determining a hyperplane that separates positive and negative examples, maximizing the separation margin between them. This approach is grounded in the method of structural risk minimization and statistical learning theory, in which the error rate on test data is bounded by the training error rate and a term that depends on the Vapnik-Chervonenkis dimension (Vapnik and Chervonenkis, 1968).
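The bound referred to here can be stated explicitly. In a standard formulation of the result (the symbols below are chosen for this illustration rather than taken from the text), with probability at least $1-\eta$ the expected risk $R(\alpha)$ on test data is bounded in terms of the empirical (training) risk $R_{\mathrm{emp}}(\alpha)$, the VC dimension $h$, and the number of training samples $l$:
\[
R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha) + \sqrt{\frac{h\bigl(\ln(2l/h)+1\bigr)-\ln(\eta/4)}{l}} .
\]
Minimizing the right-hand side over a nested family of hypothesis classes, rather than the training error alone, is what structural risk minimization means in practice.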
The fundamental concept of a support vector machine relies on Cover's theorem (Cover, 1965), which states that the mapping of an input vector x into a sufficiently high-dimensional space, called a feature space, using a nonlinear