value. In these algorithms, the design space must be converted to a genetic space; genetic algorithms therefore work with series of coded variables. An advantage of working with coded variables is that the coding can convert a continuous space into a discrete one. Another interesting point is that the principles of GA are based on random processing, so the random operators explore the search space comprehensively.
The main steps of a genetic algorithm are: initialization; selection of chromosomes for reproduction; crossover between chromosomes to produce the next generation; mutation, to search other parts of the problem space (and so prevent premature convergence); and insertion of the offspring into the new population. In recent years, the application of genetic algorithms has grown steadily as their capability, flexibility, and speed have become increasingly apparent.
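The steps listed above can be sketched as a minimal genetic algorithm. The binary coding, tournament selection, one-point crossover, bit-flip mutation, and the toy objective below are all illustrative choices, not prescribed by the text:

```python
import random

random.seed(0)

N_BITS, POP_SIZE, GENERATIONS = 16, 30, 40
P_CROSSOVER, P_MUTATION = 0.9, 0.02

def decode(bits):
    """Map a binary chromosome (genetic space) to x in [0, 1] (design space)."""
    return int("".join(map(str, bits)), 2) / (2**N_BITS - 1)

def fitness(bits):
    """Toy objective to maximize: f(x) = x * (1 - x), optimum at x = 0.5."""
    x = decode(bits)
    return x * (1.0 - x)

def tournament(pop):
    """Selection: return the fitter of two randomly chosen chromosomes."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """One-point crossover producing two children."""
    if random.random() < P_CROSSOVER:
        cut = random.randrange(1, N_BITS)
        return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
    return p1[:], p2[:]

def mutate(bits):
    """Bit-flip mutation: searches other parts of the space,
    helping to prevent premature convergence."""
    return [b ^ 1 if random.random() < P_MUTATION else b for b in bits]

# Initialization
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    children = []
    while len(children) < POP_SIZE:
        c1, c2 = crossover(tournament(pop), tournament(pop))
        children += [mutate(c1), mutate(c2)]
    pop = children[:POP_SIZE]  # insertion of children into the new population

best = max(pop, key=fitness)
print(round(decode(best), 2), round(fitness(best), 3))
```

Run repeatedly with different seeds, the best decoded variable settles near 0.5, where the toy objective is maximal.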
The main purpose of single-objective optimization problems is to find the values of the design parameters at which one objective function is optimal. In multi-objective optimization (also called vector optimization), by contrast, the problem is to find the optimum of more than one objective function; in engineering optimization problems these objectives are usually in conflict with each other, so that improving one worsens the others. Multi-objective optimization therefore yields an optimal set of solutions rather than a single optimal response. Within this set we cannot find any solution that dominates the others. These optimal solutions are called Pareto points, and together they form the Pareto front (Atashkari et al., 2005).
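The dominance relation behind the Pareto set can be made concrete. In this sketch (for a minimization problem, over a made-up list of objective vectors) a point survives only if no other point dominates it:

```python
def dominates(a, b):
    """a dominates b (minimization): a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical objective vectors (f1, f2), both to be minimized.
points = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 3.5)]
print(pareto_front(points))  # → [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
```

Note the conflict between objectives on the surviving points: moving along the front, f1 improves only at the expense of f2.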
A routine approach to solving multi-objective optimization problems is to convert the multiple objective functions into a single objective function. Different methods for this purpose have been presented in the literature, of which the most widely used are the weighted-sum approach, ε-perturbation, min-max, and the non-dominated sorting genetic algorithm. Genetic algorithms perform well in solving multi-objective optimization problems. Srinivas and Deb (1994) developed a new algorithm, based on genetic algorithms, for this purpose. This method, called the non-dominated sorting genetic algorithm (NSGA), is more powerful than the previous algorithms in multi-objective optimization. Because of difficulties this method exhibits in solving optimization problems, a modified algorithm called NSGA-II was introduced by Deb a few years later, which finds the non-dominated solutions better and faster (Deb
et al., 2002). In multi-objective optimization, the aim is to find a design vector X = [x1, x2, ..., xn]^T that optimizes k objective functions, fi, subject to m inequality constraints and p equality constraints. The multi-objective optimization problem can be briefly expressed as:
find X*
to optimize F(X)
subject to:
gi(X) ≤ 0,  i = 1, 2, ..., m
hj(X) = 0,  j = 1, 2, ..., p    (1)
where X ∈ ℜ^n is the vector of design variables; F(X) = [f1(X), f2(X), ..., fk(X)]^T is the vector of objective functions, so that F(X) ∈ ℜ^k; and gi(X) and hj(X) are the inequality and equality constraints, respectively.
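The weighted-sum conversion mentioned earlier can be sketched for an unconstrained two-objective instance of problem (1). The two objective functions, the weights, and the crude grid search below are illustrative assumptions only:

```python
# Weighted-sum approach: combine k objectives into one scalar objective.
# The objectives here are two conflicting quadratics (both minimized).

def f1(x):
    return x**2            # pulls the optimum toward x = 0

def f2(x):
    return (x - 2.0)**2    # pulls the optimum toward x = 2

def weighted_sum(x, w1=0.5, w2=0.5):
    """Scalarized objective: w1*f1(x) + w2*f2(x), with w1 + w2 = 1."""
    return w1 * f1(x) + w2 * f2(x)

# Crude grid search over the single combined objective.
xs = [i / 100.0 for i in range(-100, 301)]
best = min(xs, key=weighted_sum)
print(best)  # → 1.0 (equal weights put the optimum midway)

# Sweeping the weights traces out different Pareto-optimal trade-offs:
for w in (0.25, 0.5, 0.75):
    x_star = min(xs, key=lambda x: weighted_sum(x, w, 1.0 - w))
    print(w, round(x_star, 2))
```

Each choice of weights yields one point of the Pareto front (here x* = 2(1 - w1)), which is why the weighted-sum method reduces a vector optimization to a family of single-objective problems.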
In optimization studies that include multi-
objective optimization problems, the main objec-
tive is to find the global Pareto optimal solutions,
representing the best possible objective values.
However, in practice, users may not always be
interested in finding the global best solutions,
particularly if these solutions are very sensitive to
variable perturbations. In such cases, practitioners
are interested in finding robust solutions that are
less sensitive to small changes in variables (Deb
& Gupta 2004; Branke 1998).
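One simple way to formalize this robustness notion is to replace f(x) with its mean over small perturbations of x, in the spirit of the effective-objective idea of Deb & Gupta (2004). The objective function, perturbation size, and sample count below are illustrative assumptions, not taken from the text:

```python
import random

random.seed(1)

def f(x):
    """Illustrative objective (minimize): a deep but narrow valley near x = 0
    and a shallower but wide, flat valley near x = 4."""
    narrow = -2.0 if abs(x) < 0.05 else 0.0
    flat = -1.0 if abs(x - 4.0) < 1.0 else 0.0
    return narrow + flat

def robust_f(x, delta=0.2, samples=200):
    """Effective objective: mean of f over uniform perturbations in x +/- delta."""
    return sum(f(x + random.uniform(-delta, delta)) for _ in range(samples)) / samples

print(f(0.0), f(4.0))                 # → -2.0 -1.0 (x = 0 is the global optimum)
print(robust_f(0.0) > robust_f(4.0))  # → True (under perturbation, x = 4 wins)
```

The narrow global optimum loses most of its value once the variable is perturbed, while the flat valley keeps its value, which is exactly why a practitioner may prefer the robust solution over the global one.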