where $\varepsilon$ is a small enough tolerance parameter. Therefore, we here only consider the inequality constraint functions $g_i(\vec{x}) \le 0$, $i = 1, 2, \ldots, I$.
The constrained optimization problems are generally difficult to deal with, because the constraint functions can divide the whole search space into disjoint islands. Numerous constraint-handling techniques have been proposed and investigated during the past decades [11]. One popular solution is to define a new fitness function $F(\vec{x})$ to be optimized [11]. $F(\vec{x})$ is the combination of the objective function $f(\vec{x})$ and weighted penalty terms $P_i(\vec{x})$, $i = 1, 2, \ldots, I$, which reflect the violation of the constraint functions:
$$F(\vec{x}) = f(\vec{x}) + \sum_{i=1}^{I} w_i P_i(\vec{x}), \qquad (5)$$
where $w_i$ ($i = 1, 2, \ldots, I$) are the preset weights. The overall optimization performance depends on the penalty terms and their weights, and may significantly deteriorate with inappropriately chosen ones. In this section, we propose a modified HS method for the direct handling of these constraints.
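As a minimal sketch of the penalty approach in Eq. (5), the fitness of a candidate can be computed as the objective value plus weighted constraint violations. The quadratic penalty form $P_i(\vec{x}) = \max(0, g_i(\vec{x}))^2$, the helper name `penalty_fitness`, and the toy problem below are illustrative assumptions, not the exact setup of [11]:

```python
# Sketch of the penalty-based fitness of Eq. (5):
#   F(x) = f(x) + sum_i w_i * P_i(x)
# Here P_i(x) = max(0, g_i(x))^2 is an assumed (common) penalty form.

def penalty_fitness(f, constraints, weights, x):
    """Return f(x) plus weighted quadratic penalties for violated g_i(x) <= 0."""
    penalty = sum(w * max(0.0, g(x)) ** 2
                  for g, w in zip(constraints, weights))
    return f(x) + penalty

# Toy example: minimize f(x) = x0^2 + x1^2 subject to g(x) = 1 - x0 - x1 <= 0.
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: 1.0 - x[0] - x[1]

print(penalty_fitness(f, [g], [10.0], [0.6, 0.6]))  # feasible: 0.72 (no penalty)
print(penalty_fitness(f, [g], [10.0], [0.2, 0.2]))  # infeasible: 3.68 (penalized)
```

An inappropriately small weight lets infeasible points outrank feasible ones, while an overly large weight flattens the objective inside the feasible region; this sensitivity is exactly what motivates the direct constraint handling proposed below.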
4.2 Modified HS Method for Constrained Optimization
As aforementioned, the new HM members are generated either from the existing HM
members or in a random way. Nevertheless, they are not guaranteed to always meet
all the constraints. Figure 3 shows that new HM members satisfying all the constraints can be acquired by trial and error. This trial-and-error search can be very time-consuming, especially for complex constraint functions.
Fig. 3. Generation of new HM members by trial and error method
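The trial-and-error generation of Fig. 3 amounts to rejection sampling: candidates are drawn repeatedly and discarded until one satisfies every constraint. The sketch below illustrates this under assumed names (`bounds`, `constraints`, `max_trials`); the candidate here is drawn uniformly at random, standing in for the HS improvisation step:

```python
import random

def generate_feasible(bounds, constraints, max_trials=10_000, rng=random):
    """Draw random candidates until all g_i(x) <= 0 hold, or give up.

    bounds: list of (lo, hi) per decision variable.
    constraints: list of functions g_i with feasibility g_i(x) <= 0.
    """
    for _ in range(max_trials):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if all(g(x) <= 0.0 for g in constraints):
            return x
    return None  # tight or complex constraints may reject almost every trial

# Example: feasible region x0 + x1 >= 1 inside the unit square.
x = generate_feasible([(0.0, 1.0), (0.0, 1.0)],
                      [lambda v: 1.0 - v[0] - v[1]])
```

When the feasible region is a small or disconnected fraction of the search space, nearly all trials are rejected, which is why the modified HS method below avoids discarding infeasible members altogether.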
In our modified HS method, we make full use even of those HM members that do not meet the constraints. The key issue is how to rank the HM members according to
their objective as well as constraint function values. Here, the values of the constraint
functions of the HM members are stored together with their objective function values in
the HM. The HM members are divided into two parts: feasible members and infeasible