Table 4.8. Experimental results of the BMO variants on the function rosenbrock with
noise. The BMO and the DMO are robust against noise.

                 ES        BMO       cBMO        DMO
  best        0.449      0.008      0.223      1.004
  median      5.236      0.027      2.864      2.876
  worst     172.983      7.693    593.256     81.778
  mean       20.890      0.378     42.305      7.298
  dev        45.245      1.535    137.020     16.511
The median of the cBMO is smaller than the median of the ES, and the Wilcoxon rank-sum
test reveals statistical superiority. The bad mean and worst values were produced by an outlier.
Wilcoxon Rank-Sum Test

                  BMO         cBMO          DMO
  ES        1.841E-08     0.001462     0.000445
  BMO                    4.006E-08    2.059E-08
  cBMO                                   0.4728
In order to assess the statistical significance of the test data, we again perform
the Wilcoxon rank-sum test. The BMO and the DMO are better than the standard ES:
their medians and means are below the values of the ES, and all p_w-values are
smaller than p_l = 0.05. Statistical superiority also holds for the cBMO in spite
of its worse mean value, since its median is better than the median of the ES;
the worse mean was obviously produced by outliers. Not surprisingly, the BMO
is significantly better than the cBMO and the DMO, while cBMO and DMO are
not significantly distinguishable from each other.
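
As an illustration of this procedure, the following minimal sketch (in Python, with
hypothetical placeholder samples rather than the actual run data behind Table 4.8)
computes the reported summary statistics and the pairwise two-sided Wilcoxon rank-sum
p-values with scipy.stats.ranksums, comparing each p-value against the significance
level of 0.05:

```python
import itertools
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)

# Placeholder best-fitness samples for 25 hypothetical runs per strategy;
# these are NOT the experimental values of Table 4.8.
runs = {
    "ES":   rng.lognormal(mean=1.5,  sigma=1.0, size=25),
    "BMO":  rng.lognormal(mean=-2.0, sigma=1.0, size=25),
    "cBMO": rng.lognormal(mean=0.5,  sigma=1.5, size=25),
    "DMO":  rng.lognormal(mean=0.8,  sigma=1.0, size=25),
}

# Summary statistics in the style of the result tables (best, median, worst, mean, dev).
for name, values in runs.items():
    print(f"{name:5s} best={values.min():.3f} median={np.median(values):.3f} "
          f"worst={values.max():.3f} mean={values.mean():.3f} dev={values.std(ddof=1):.3f}")

# Pairwise two-sided Wilcoxon rank-sum tests; a p-value below 0.05 is treated
# as statistically significant, matching the level p_l used in the text.
for a, b in itertools.combinations(runs, 2):
    _, p = ranksums(runs[a], runs[b])
    print(f"{a} vs {b}: p = {p:.3g} ({'significant' if p < 0.05 else 'not significant'})")
```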
We summarize the successful results obtained with biased mutation on the
unconstrained real-parameter optimization problems. Biased mutation achieves
considerable improvements on the function rosenbrock; the same holds for the
function rosenbrock with noise. No significant improvement or deterioration
could be observed on the tested unimodal functions.
4.5.2 Climbing Ridges with Biased Mutation
In this section we test the biased mutation on two real-parameter optimization
problems with ridges. The ridge function class comprises functions of the
following form:

    f_{\mathrm{ridge}}(\mathbf{x}) := x_1 + d \left( \sum_{i=2}^{N} x_i^2 \right)^{\alpha/2}        (4.50)
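
For completeness, a minimal sketch of this function class, following the reconstruction
of (4.50) above; the function name, default parameters, and the convention that alpha = 1
gives the sharp ridge and alpha = 2 the parabolic ridge are illustrative assumptions:

```python
import numpy as np

def f_ridge(x, d=1.0, alpha=2.0):
    """Ridge function class of (4.50): x_1 plus a penalty for leaving the ridge axis.

    Under the assumed convention, alpha = 1 yields the sharp ridge and
    alpha = 2 the parabolic ridge; d scales the penalty term.
    """
    x = np.asarray(x, dtype=float)
    return x[0] + d * np.sum(x[1:] ** 2) ** (alpha / 2.0)

# On the ridge axis (x_2 = ... = x_N = 0) the value reduces to x_1, so an ES
# can only make progress by travelling along the axis direction.
print(f_ridge([-3.0, 0.0, 0.0], d=1.0, alpha=2.0))   # -3.0
```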
We assume that a bias along the ridges increases the speed of the optimization
process and examine the following test functions (d = 1):