3 Other Metaheuristics
3.1 Metaheuristics
Heuristic algorithms typically aim to find a good solution to an optimization prob-
lem by 'trial and error' within a reasonable amount of computing time. Here 'heuristic'
means to 'find' or 'search' by trial and error. There is no guarantee of finding the best
or optimal solution, though the result may well be better than an educated guess.
Broadly speaking, heuristic methods are local search methods because their searches
focus on local variations, and the optimal or best solution may lie outside this local
region. However, in many practical optimization problems, a high-quality feasible
solution in the local region of interest is usually accepted as a good solution if time
is the major constraint.
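The trial-and-error character of a local search can be sketched in a few lines. The following is a minimal, illustrative hill-climbing routine (not any specific published algorithm): it proposes random moves in a small neighbourhood and keeps only those that improve the objective, so it may settle in a local optimum far from the global one.

```python
import random

random.seed(42)  # fixed seed so the illustrative run is reproducible

def hill_climb(f, x0, step=0.1, max_iters=1000):
    """Greedy trial-and-error local search: accept a random neighbour
    only if it improves f. Parameter values are illustrative."""
    x, fx = x0, f(x0)
    for _ in range(max_iters):
        # propose a trial point in the local neighbourhood of x
        candidate = x + random.uniform(-step, step)
        fc = f(candidate)
        if fc < fx:  # keep the trial only if it improves the objective
            x, fx = candidate, fc
    return x, fx

# Example: minimize a simple one-dimensional function with optimum at x = 2
best_x, best_f = hill_climb(lambda x: (x - 2.0) ** 2, x0=0.0)
```

Because every move is restricted to the neighbourhood of the current point, the search is purely local; the metaheuristics discussed next add higher-level strategies on top of this basic mechanism.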
Metaheuristic algorithms are advanced heuristic algorithms. Because ' meta -'
means 'beyond' or 'higher-level', a metaheuristic literally finds solutions using
higher-level techniques, though certain trial-and-error processes are still involved.
Broadly speaking, metaheuristics are higher-level strategies that combine lower-level
techniques and tactics for exploration and exploitation of a huge solution space. In
recent years, the word ' metaheuristics ' has come to refer to all modern higher-level
algorithms, including Evolutionary Algorithms (EA) such as Genetic Algorithms
(GA), Simulated Annealing (SA), Tabu Search (TS), Ant Colony Optimization
(ACO), Particle Swarm Optimization (PSO), Bee Algorithms (BA), Firefly
Algorithms (FA), and certainly Harmony Search [4].
There are two important components in modern metaheuristics: intensification and
diversification. These terms derive from Tabu Search [5]. For an algorithm to be
efficient and effective, it must be able to generate a diverse range of solutions,
including the potentially optimal ones, so as to explore the whole search space
effectively, while intensifying its search around the neighbourhood of an optimal
or nearly optimal solution. For this to work, every part of the search space must be
accessible, though not necessarily visited, during the search. Diversification often
takes the form of randomization, with a random component attached to a determin-
istic component so as to explore the search space effectively and efficiently, while
intensification is the exploitation of past solutions, selecting potentially good
solutions via elitism, use of memory, or both [4-6].
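These two components can be made concrete in a small sketch. The routine below is an illustrative construction, not an algorithm from [4-6]: diversification is a uniform random sample of the whole space, intensification is a small perturbation of the best-so-far (elite) solution, and the 50/50 mixing rule between them is an arbitrary choice for demonstration.

```python
import random

random.seed(0)  # fixed seed for a reproducible illustration

def search(f, bounds, n_iters=2000):
    """Toy global search mixing diversification (uniform sampling of the
    whole space) with intensification (perturbing the elite solution)."""
    lo, hi = bounds
    best = random.uniform(lo, hi)
    best_f = f(best)
    for _ in range(n_iters):
        if random.random() < 0.5:
            # diversification: any point of the space is reachable
            x = random.uniform(lo, hi)
        else:
            # intensification: exploit the neighbourhood of the elite point
            x = best + random.gauss(0.0, 0.05 * (hi - lo))
        fx = f(x)
        if fx < best_f:  # elitism: keep a memory of the best solution found
            best, best_f = x, fx
    return best, best_f

# Example: minimize a function whose optimum is at x = -1
best, best_f = search(lambda x: (x + 1.0) ** 2, bounds=(-5.0, 5.0))
```

Because the uniform component gives every region a nonzero sampling probability, the whole space stays accessible, while the elite-centred moves concentrate effort where good solutions have already been found.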
Any successful metaheuristic algorithm requires a good balance between these two
important, seemingly opposite, components [6]. If intensification is too strong, only
a fraction of the solution space may be visited, and there is a risk of being trapped
in a local optimum; this is often the case for gradient-based techniques such as the
Newton-Raphson method. Conversely, if diversification is too strong, the algorithm
converges too slowly because solutions jump around the potentially optimal ones.
Typically, a search starts from randomly generated (or educated-guess) solutions
and gradually reduces diversification while increasing intensification.
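One common way to realize this gradual shift from diversification to intensification is a simulated-annealing-style temperature schedule. The sketch below is a minimal illustration with arbitrarily chosen parameter values: early on, the temperature t is high, so moves are large and worse solutions are often accepted (diversification); as t decays, the search settles around good solutions (intensification).

```python
import math
import random

random.seed(1)  # fixed seed for a reproducible illustration

def anneal(f, x0, t0=1.0, alpha=0.995, n_iters=3000):
    """Simulated-annealing-style search with a geometric cooling schedule.
    The schedule t <- alpha * t reduces diversification over time."""
    x, fx = x0, f(x0)
    best, best_f = x, fx
    t = t0
    for _ in range(n_iters):
        cand = x + random.gauss(0.0, 1.0) * t  # move size shrinks with t
        fc = f(cand)
        # Metropolis rule: always accept improvements; accept worse moves
        # with probability exp(-(fc - fx) / t), which vanishes as t -> 0
        if fc < fx or random.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = x, fx
        t *= alpha  # gradually reduce diversification
    return best, best_f

# Example: minimize a function whose optimum is at x = 3
best, best_f = anneal(lambda x: (x - 3.0) ** 2, x0=0.0)
```

At the start the near-unit acceptance of worse moves lets the trajectory wander widely; by the end the schedule has made the search almost purely greedy around the best region found.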
Another important feature of modern metaheuristics is that an algorithm is either
trajectory-based or population-based. For example, simulated annealing is a good
example of a trajectory-based algorithm because the path of the active search point
(or agent) forms a Brownian-motion-like trajectory with its movement towards some