We obtain -4 as the minimum (can you do better?). We now try dimension 7. We will use a larger search space and more iterations. Indeed, our option settings were determined by trial and error. Later we will say more about how this might be done systematically.
Timing[{min, mat} = detMin[7, .1, 80, 80]]
{54.6874, {-576., {{1, 1, -1, -1, 1, -1, 1}, {1, 1, -1, 1, -1, -1, -1},
  {1, -1, 1, 1, 1, -1, -1}, {-1, -1, -1, 1, 1, 1, 1}, {1, 1, -1, -1, 1, 1, -1},
  {1, 1, 1, 1, 1, 1, 1}, {1, 1, 1, 1, 1, 1, 1}}}}
Readers familiar with the Hadamard bound for absolute values of matrix determinants will recognize that the minimum must be no smaller than the ceiling of -7^(7/2), or -907. (In brief, this bound is the product of the lengths of the rows of a matrix; for our family, the maximal length of each row is √7. That this product bounds the absolute value of the determinant can be seen from the fact that this absolute value is the volume of the parallelepiped spanned by the row vectors of the matrix. This volume can be no larger than the product of their lengths; it achieves that value precisely when the rows are pairwise orthogonal.)
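The arithmetic behind this bound is easy to check. A minimal sketch (in Python rather than Mathematica, purely for illustration):

```python
import math

def hadamard_bound(n):
    """Hadamard bound for an n x n matrix with entries in {-1, 1}:
    each row has Euclidean length sqrt(n), so |det| <= n^(n/2)."""
    return n ** (n / 2)

bound = hadamard_bound(7)          # 7^(7/2), roughly 907.49
# The determinant of an integer matrix is an integer, so the
# minimum over this family is at least the ceiling of -bound.
min_possible = math.ceil(-bound)   # -907
```

So -576 is well above the theoretical floor of -907 for dimension 7, and the bound by itself does not settle whether -576 is optimal.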
We can ask how good our result is. Here is one basis for comparison: a random search that took approximately twice as long as the code above found nothing smaller than -288. Offhand I do not know whether -576 is the true minimum, though I suspect that it is.
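A baseline of that sort can be sketched as follows. This is a hypothetical reimplementation in Python, not the author's Mathematica code; the function name and trial count are illustrative assumptions.

```python
import numpy as np

def random_det_search(n, trials, seed=0):
    """Baseline: sample random {-1, 1} matrices uniformly and
    track the smallest determinant seen."""
    rng = np.random.default_rng(seed)  # seeded for reproducibility
    best = np.inf
    for _ in range(trials):
        m = rng.choice([-1, 1], size=(n, n))
        d = round(np.linalg.det(m))    # determinant is an integer
        best = min(best, d)
    return int(best)

best = random_det_search(7, 2000)
```

Such blind sampling has no mechanism for refining good candidates, which is why Differential Evolution, with comparable effort, reaches markedly smaller determinants.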
It is interesting to see what happens when we try this with dimension increased to
eight.
Timing[{min, mat} = detMin[8, 1/50, 100, 200]]
{222.618, {-4096., {{1, 1, 1, 1, 1, 1, 1, 1}, {-1, 1, 1, 1, 1, 1, 1, 1},
  {-1, 1, 1, 1, 1, 1, 1, 1}, {1, 1, 1, 1, 1, 1, 1, 1}, {-1, 1, 1, 1, 1, 1, 1, 1},
  {1, 1, 1, 1, 1, 1, 1, 1}, {1, 1, 1, 1, 1, 1, 1, 1}, {1, 1, 1, 1, 1, 1, 1, 1}}}}
In this case we actually attained the Hadamard bound: one can check that the rows (and likewise the columns) are all pairwise orthogonal, as must be the case in order to attain it. Indeed, when the dimension is a power of two, one can always attain this bound. The motivated reader might try to work out a recursive (or other) construction that produces such pairwise orthogonal sets.
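One such recursive scheme is the classical Sylvester construction, which doubles the dimension at each step. A sketch, again in Python for illustration:

```python
import numpy as np

def sylvester(k):
    """Return a 2^k x 2^k matrix with entries in {-1, 1}
    whose rows are pairwise orthogonal (a Hadamard matrix)."""
    h = np.array([[1]])
    for _ in range(k):
        # Doubling step: if the rows of h are orthogonal, so are
        # the rows of the blocked matrix [[h, h], [h, -h]].
        h = np.block([[h, h], [h, -h]])
    return h

h = sylvester(3)                    # an 8 x 8 example
gram = h @ h.T                      # 8 times the identity: rows orthogonal
det = round(abs(np.linalg.det(h)))  # attains the Hadamard bound 8^4 = 4096
```

This confirms the claim above for dimension 8, and iterating the doubling step yields the bound at every power of two.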
4.4
Partitioning a Set
The last sections were a warmup to the main focus of this chapter. We introduced a bit of Mathematica coding, and in particular the use of Differential Evolution, in the context of discrete optimization. We now turn to a serious discussion of combinatorial optimization problems and techniques.