Digital Signal Processing Reference
where f(x) is an objective function, x denotes the decision solution vector, p is
the number of inequality constraints, and q is the number of equality constraints
(in both cases, constraints could be linear or nonlinear). Feasible individuals satisfy
all constraints while infeasible individuals do not satisfy at least one constraint.
Then, a candidate solution x is feasible if and only if g_i(x) ≤ 0, i = 1, 2, ..., p, and h_j(x) = 0, j = 1, 2, ..., q, hold. Obviously, only a feasible individual can be a solution, i.e., an optimum, for a given optimization problem.
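The feasibility test just defined can be sketched as follows; the helper name and the example constraints are illustrative assumptions, not from the text:

```python
# Sketch of the feasibility test: x is feasible iff every inequality
# constraint satisfies g_i(x) <= 0 and every equality constraint
# satisfies h_j(x) = 0 (within a numerical tolerance).

def is_feasible(x, ineq_constraints, eq_constraints, tol=1e-9):
    """Return True iff all g_i(x) <= 0 and all h_j(x) == 0 (within tol)."""
    return (all(g(x) <= tol for g in ineq_constraints)
            and all(abs(h(x)) <= tol for h in eq_constraints))

# Illustrative problem: g1(x) = x0 + x1 - 1 <= 0 and h1(x) = x0 - x1 = 0
g1 = lambda x: x[0] + x[1] - 1.0
h1 = lambda x: x[0] - x[1]

print(is_feasible([0.4, 0.4], [g1], [h1]))  # both constraints hold -> True
print(is_feasible([0.8, 0.8], [g1], [h1]))  # g1 = 0.6 > 0 -> False
```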
A number of approaches that incorporate constraint-handling techniques into evolutionary algorithms have been proposed to solve constrained optimization problems. Comprehensive surveys of such approaches can be found in [8, 16].
Most of the evolutionary constraint handling methods can be broadly classified
into five categories [ 16 ]: (i) methods based on preserving feasibility of solutions,
(ii) methods based on penalty functions, (iii) methods making distinction between
feasible and infeasible solutions, (iv) methods based on decoders, and (v) hybrid
methods.
The penalty function method is widely regarded as the most popular constraint-handling technique due to its simple principle and ease of implementation. In this approach, the constraints are incorporated into the objective function, so that the original constrained problem is transformed into an unconstrained one by adding (or subtracting) a penalty term to (or from) the objective function for points not lying in the feasible set, i.e., violating some of the constraints. This method typically generates a sequence of infeasible points that approach optimal solutions of the original problem from the outside (exterior) of the feasible set. The general formulation of the exterior penalty approach is:
φ(x) = f(x) ± ( Σ_{i=1}^{p} r_i × ζ_i + Σ_{j=1}^{q} c_j × ϑ_j )        (5.36)
Here, the constraints are combined with the objective function f(x), resulting in a new (expanded) objective function φ(x), which is then actually optimized. ζ_i and ϑ_j are functions of the constraints g_i(x) and h_j(x), respectively, and r_i and c_j are positive constants called "penalty factors".
The general formulation of ζ_i and ϑ_j is:

ζ_i = [max{0, g_i(x)}]^α,    ϑ_j = |h_j(x)|^β        (5.37)
where α and β are normally 1 or 2.
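The exterior penalty of Eqs. (5.36)–(5.37) can be sketched as follows; the penalty factors and the example problem are illustrative assumptions:

```python
# Sketch of the exterior penalty of Eqs. (5.36)-(5.37):
#   phi(x) = f(x) + sum_i r_i * max(0, g_i(x))**alpha
#                 + sum_j c_j * abs(h_j(x))**beta
# The penalty factors r_i, c_j and the example problem are illustrative.

def penalized(f, ineq, eq, r, c, alpha=2, beta=2):
    """Build phi(x) from objective f, inequality list ineq, equality list eq."""
    def phi(x):
        zeta = sum(ri * max(0.0, g(x)) ** alpha for ri, g in zip(r, ineq))
        theta = sum(cj * abs(h(x)) ** beta for cj, h in zip(c, eq))
        return f(x) + zeta + theta
    return phi

# Illustrative problem: minimize f(x) = x^2 subject to g(x) = 1 - x <= 0
f = lambda x: x * x
g = lambda x: 1.0 - x
phi = penalized(f, [g], [], r=[100.0], c=[])

print(phi(1.0))  # feasible point: penalty is 0, phi = f = 1.0
print(phi(0.5))  # infeasible: 0.25 + 100 * 0.5**2 = 25.25
```

Note how the large penalty factor makes the infeasible point far worse than the feasible one, which is exactly how the method steers the search back into the feasible region.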
If an inequality constraint is satisfied, then g_i(x) ≤ 0 and max{0, g_i(x)} = 0, so that constraint contributes nothing to the function φ. If a constraint is violated, i.e., g_i(x) > 0 or h_j(x) ≠ 0, a large term gets added to φ, such that the solution is pushed back towards the feasible region.
Another approach for handling constraints would be to consider the objective
function and the constraints separately. The constraint handling method described
by Deb [ 9 ] proposes using a binary tournament selection operator and applies the
following rules to compare two individuals:
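Deb's comparison rules are commonly cited as: a feasible individual beats an infeasible one; two feasible individuals are compared by objective value; two infeasible individuals are compared by total constraint violation. A minimal sketch of such a tournament comparison (the violation measure and function names here are one usual choice, not necessarily Deb's exact formulation):

```python
# Sketch of a binary-tournament comparison following Deb's feasibility
# rules as commonly cited; the violation measure is one usual choice.

def violation(x, ineq, eq):
    """Total constraint violation: sum of max(0, g_i(x)) and |h_j(x)|."""
    return (sum(max(0.0, g(x)) for g in ineq)
            + sum(abs(h(x)) for h in eq))

def tournament_winner(x1, x2, f, ineq, eq):
    """Return the winner of a binary tournament between x1 and x2."""
    v1, v2 = violation(x1, ineq, eq), violation(x2, ineq, eq)
    if v1 == 0.0 and v2 == 0.0:      # both feasible: better objective wins
        return x1 if f(x1) <= f(x2) else x2
    if v1 == 0.0 or v2 == 0.0:       # feasible beats infeasible
        return x1 if v1 == 0.0 else x2
    return x1 if v1 <= v2 else x2    # both infeasible: smaller violation wins

# Illustrative problem: minimize f(x) = x^2 subject to g(x) = 1 - x <= 0
f = lambda x: x * x
g = lambda x: 1.0 - x
print(tournament_winner(1.2, 0.5, f, [g], []))  # 1.2 is feasible -> wins
```

Note that, unlike the penalty approach, this comparison never mixes objective and constraint values into one number, so no penalty factors need to be tuned.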