The actual value of w0 is determined during the training phase by exploiting a set of
observations y(x) known as the training set. The final value w0 is such that the estimation
error

ε = ρ(x, w0) − y(x)    (4.3)

is minimized for both the known and future observations y(x).
In more sophisticated cases, the structure of the function ρ(x) is not defined a priori
but is built either by using neural-like processing elements or by composing
elementary functions. In both cases, the training phase does not involve (only) the
selection of the parameters w0 but also the navigation through a function space to identify the
optimal ρ(x) that minimizes the error ε. Of course, while prolonging the overall training
process, these more sophisticated model selection algorithms lead to better approximating
functions ρ(x).
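As a minimal sketch of the training phase, assume a simple linear model ρ(x, w0) = w0[0] + w0[1]·x and a hypothetical training set (the data values below are illustrative, not taken from the text): the parameters w0 are selected so that the squared estimation error of Eq. 4.3 over the training set is minimized.

```python
import numpy as np

# Hypothetical training set: configurations x and observed responses y(x).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Assume a linear model rho(x, w0) = w0[0] + w0[1] * x.
# Training selects w0 minimizing the sum of squared estimation errors.
A = np.vstack([np.ones_like(x), x]).T
w0, *_ = np.linalg.lstsq(A, y, rcond=None)

eps = A @ w0 - y  # per-observation estimation error (Eq. 4.3)
print(w0, np.abs(eps).max())
```

Here the least-squares solution plays the role of the training algorithm; richer model structures would additionally search over the function space, as described above.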
4.2.1
RSM Categories
Response surface models are surrogate models which fit, within reasonable approximation
limits, the response curve of a system with respect to the configuration x.
Being a curve-fitting tool, RSMs can be categorized as follows:
• Interpolation-based RSMs. This category of curve-fitting expressions is built
with the constraint that the curve ρ(x) is equal to f(x) for all the design points
x belonging to the training set T:

ρ(x) = f(x),  ∀ x ∈ T    (4.4)

while, for the remaining design points of the design space, the coefficients w0 are
calibrated such that the estimated error ε is minimal.
• Regression-based RSMs. This category of curve-fitting expressions is such that
the constraint in Eq. 4.4 does not hold; instead, the coefficients w0 are chosen such
that a general measure of the error on the known training set T and on future observations
is minimized.
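The interpolation constraint of Eq. 4.4 can be illustrated with a small sketch (the cache sizes and response values below are hypothetical): a polynomial of degree n−1 fitted through n design points is an interpolation-based RSM, since by construction it matches f(x) exactly at every point of T.

```python
import numpy as np

# Hypothetical training set T of five design points and their responses f(x).
xs = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # e.g. cache sizes
fs = np.array([5.0, 3.2, 2.1, 2.6, 4.0])    # e.g. observed responses

# A degree-(n-1) polynomial through n points satisfies Eq. 4.4:
# rho(x) = f(x) for every x in T.
coeffs = np.polyfit(xs, fs, deg=len(xs) - 1)
rho = np.poly1d(coeffs)

assert np.allclose(rho(xs), fs)  # zero error on the training set
```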
Figure 4.3 shows a comparison between the two approaches when fitting a set of five
observations of the Energy-Delay Product while varying the system cache size. As
can be seen, the interpolation line (a spline function) passes through the observations
while the regression curve (a second-order polynomial) does not. Interpolation zeroes
the error on the training observations, but it might present an over-fitting effect
that consists of decreased prediction accuracy on unknown observations. On
the other hand, regression techniques, albeit with some errors on the known observations,
may offer greater generalization power. Nevertheless, in the following we will
analyze both techniques in the domain of design space exploration.
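The contrast just described can be reproduced in a few lines (using hypothetical cache sizes and EDP values, not the data of Fig. 4.3): a degree-4 interpolating polynomial passes through all five training observations, while a second-order regression polynomial generally does not.

```python
import numpy as np

# Hypothetical cache sizes and EDP observations (not the data of Fig. 4.3).
xs = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
fs = np.array([5.0, 3.2, 2.1, 2.6, 4.0])

interp = np.poly1d(np.polyfit(xs, fs, deg=len(xs) - 1))  # interpolation
regr = np.poly1d(np.polyfit(xs, fs, deg=2))              # regression

print(np.abs(interp(xs) - fs).max())  # ~0: passes through the observations
print(np.abs(regr(xs) - fs).max())    # nonzero: traded for generalization
```

The nonzero training residuals of the regression curve are the price paid for a smoother model that may predict unknown observations better.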