13.5 AXIOM 2 IN SOFTWARE DFSS
13.5.1 Axiom 2: The Information Axiom

13.5.1.1 Minimize the Information Content in a Design. The second axiom
of axiomatic design stated previously provides a selection metric based on design
information content. Information content is defined as a measure of complexity, and it
is related to the probability of certain events occurring when information is supplied.
Per axiom 2, the independent design that minimizes the information content is the
best. However, the exact deployment of design axioms might not be feasible because
of technological and/or cost limitations. Under these circumstances, different degrees
of conceptual vulnerabilities are established in the measures (criteria) related to the
unsatisfied axioms. For example, a degree of design complexity may exist as a result of
an axiom 2 violation. Such a vulnerable design entity may have questionable quality
and reliability performance even after thorough operational optimization. Quality
and reliability improvements of weak conceptual software entities usually produce
marginal results. Before these efforts, conceptual vulnerability should be reduced, if
not eliminated. Indeed, the presence of functional coupling and complexity
vulnerabilities aggravates the symptomatic behavior of the software entities.
13.5.2 Axiom 2 in Hardware DFSS: Measures of Complexity
In hardware design, the selection problem between alternative design solution entities
(concepts) of the same design variable (project) will occur in many situations. Even
in the ideal case, a pool of uncoupled design alternatives, the design team still needs
to select the best solution. The selection process is criteria based, hence axiom 2. The
information axiom states that the design that results in the highest probability of FRs
success (Prob(FR1), Prob(FR2), ..., Prob(FRm)) is the best design. Information and
probability are tied together via entropy, H. Entropy H may be defined as

H = −log_ν (Prob)    (13.18)

Note that probability "Prob" in (13.18) takes the Shannon (1948) entropy form
of a discrete random variable supplying the information, the source. Note also that
the logarithm is to the base ν, a real nonnegative number. If ν = 2 (e),^6 then H is
measured in bits (nats).
The expression of information and, hence, design complexity in terms of prob-
ability hints at the fact that FRs are random variables themselves, and they have
to be met within some tolerance accepted by the customer. The array {FR} are also
functions of (the physical mapping) random variables, the array {DP}, which,
in turn, are functions (the process mapping) of another vector of random variables,
the array {PV}. The PVs downstream variation can be induced by several sources
6. e is the natural logarithm base.