to the word “handling” in this context. First, the proposed methodologies can solve
the problems concretely and in a satisfactory manner: the Design Exploration tools
employed within the project can support all phases of the process, since they provide
appropriate means to define the problem correctly, they include algorithms capable
of optimizing the selected metrics, and they offer many post-processing resources.
The second meaning refers to the validation path, which builds the reliability
necessary to exploit the research results in an industrial context.
The definition of the problem is extremely flexible, but at the same time the
fixed XML vocabulary (described in Chap. 1) is universal in the sense that all the
components (tools and simulators) needed to work on the problem speak the same
language. Simulators at different levels of abstraction can be connected to the same
optimization work-flow, and the optimization tools can work with all the simulators
without additional modifications.
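As an illustration of this interoperability (a sketch only; the class and function names below are hypothetical and not the project's actual API), any simulator can be hidden behind a common evaluation contract that the optimization work-flow consumes:

from abc import ABC, abstractmethod

class SimulatorWrapper(ABC):
    """Hypothetical common interface: every simulator, whatever its
    level of abstraction, exposes the same evaluate() contract."""

    @abstractmethod
    def evaluate(self, design_point: dict) -> dict:
        """Map a design point (parameter -> value) to metric values."""
        ...

def run_optimization(simulator: SimulatorWrapper, candidates: list) -> list:
    # The work-flow depends only on the shared interface, so swapping
    # simulators requires no modification on the optimization side.
    return [(point, simulator.evaluate(point)) for point in candidates]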
The algorithms presented are clearly the central part of the process. Although
they follow different approaches, they all try to exploit a priori knowledge of the
problem structure in order to investigate the unknown shape of the objective space
more effectively. The presence of categorical variables is a first obstacle to
overcome, and indeed many of the proposed algorithms implement dedicated strategies
for handling this kind of variable. In this direction there is still room for
improvement: is it possible, for example, to design a categorical crossover
operator? Such an operator would mix information between the parent designs while
trying to preserve structures or good combinations among their categorical variables.
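A minimal sketch of what such an operator might look like is given below (the parameter names and the grouping of variables are hypothetical; the point is only that linked categorical variables are inherited together rather than mixed gene by gene):

import random

def categorical_crossover(parent_a: dict, parent_b: dict, groups: list) -> dict:
    """Sketch of a categorical crossover: each *group* of linked
    categorical variables is inherited intact from one parent, so that
    combinations which proved good together are not broken apart."""
    child = {}
    for group in groups:
        donor = random.choice((parent_a, parent_b))
        for var in group:
            child[var] = donor[var]
    return child

# Hypothetical design points with linked cache parameters:
a = {"cache_policy": "LRU", "cache_assoc": "4-way", "issue_width": "2"}
b = {"cache_policy": "FIFO", "cache_assoc": "2-way", "issue_width": "4"}
child = categorical_crossover(a, b, [["cache_policy", "cache_assoc"], ["issue_width"]])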
The computational cost of the simulations is another important element to analyze.
The steady-state evolution implemented by some of the algorithms is a first answer.
However, the final number of evaluations required to achieve an accurate and uniform
sample of the Pareto front is the key issue. Since all the MULTICUBE algorithms
seemed to perform equally well, the results obtained by MFGA on the complete
benchmark problem can be taken as a guarantee that the other algorithms, too, can
save many simulations.
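For readers unfamiliar with the mechanism, the following sketch shows why steady-state evolution economizes on simulations (a simplified single-objective version for illustration; the multi-objective algorithms discussed here would remove a dominated point rather than the scalar worst):

import random

def steady_state_ga(init_pop, evaluate, crossover, mutate, budget):
    """Sketch of steady-state evolution: one offspring is generated,
    simulated and inserted per iteration, so the expensive simulator is
    called exactly once per step and each result is reused immediately,
    instead of waiting for a whole generation to be evaluated."""
    pop = [(ind, evaluate(ind)) for ind in init_pop]
    for _ in range(budget - len(pop)):
        p1, p2 = random.sample(pop, 2)
        child = mutate(crossover(p1[0], p2[0]))
        pop.append((child, evaluate(child)))      # one simulation per step
        pop.remove(max(pop, key=lambda t: t[1]))  # drop the worst (minimizing)
    return pop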
Other qualities of the solution set have been considered besides accuracy:
uniformity and extent are treated as complementary objectives. This opens up a
completely new field of research: if the result of the optimization stage is a very
detailed sample of a large Pareto set, which points on the front should be selected
for the prototyping stage? How can the so-called Decision Maker be helped? This
stage has long been treated as a separate step; however, recent research in
optimization tries to combine the two. The objective is an algorithm which returns
a user-defined number of points taken from the Pareto set, selecting them for their
diversity.
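One plain way to realize such a selection (a sketch using a greedy maximin rule, not a specific published algorithm) is to repeatedly pick the point farthest, in objective space, from those already chosen:

import math

def select_diverse(pareto_points: list, k: int) -> list:
    """Greedy maximin selection: returns k points from a densely sampled
    Pareto front, each chosen to maximize its distance from the points
    selected so far, so the final subset is spread across the front."""
    chosen = [pareto_points[0]]  # seed, e.g. one extreme of the front
    while len(chosen) < k:
        candidate = max((p for p in pareto_points if p not in chosen),
                        key=lambda p: min(math.dist(p, c) for c in chosen))
        chosen.append(candidate)
    return chosen

# e.g. select_diverse([(1.0, 9.0), (2.0, 7.0), (3.5, 4.0), (8.0, 1.0)], 2)
# -> [(1.0, 9.0), (8.0, 1.0)]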
The proposed validation strategy has a twofold merit. On the one hand, the simple
fact that a validation strategy has been addressed at all is relevant from the
applicative point of view, since this is the only way of building confidence in the
proposed optimization strategy. On the other hand, the validation procedure
described in Sect. 3.4 contains some elements that can constitute a paradigm for
evaluating optimization algorithms. A first element is to define a large set of
indicators for the quality of the solution sets: a single metric can hide more than
it shows, while a deep insight in the