one of the values x_1, x_2, ..., x_n with respective probabilities p_1, p_2, ..., p_n. Then, the expected degree of uncertainty (randomness) in the system that is dependent upon X is

    H(X) = -∑_i p_i log(p_i).    (10.4)
This is the information entropy [68] of the random variable X, that is, the average amount of uncertainty associated with X.
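The definition in Equation 10.4 can be illustrated with a short sketch (a minimal example, using log base 2 so that entropy is measured in bits; the function name is ours, not from [68]):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i) of a discrete distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 contribute nothing (lim p log p = 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ~0.469
```

As expected, the entropy is largest for the uniform distribution and shrinks as one outcome becomes dominant.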
It is also worth mentioning that Equation 10.3 relates logic and thermal energy, hence providing a means of measuring the effect of thermal perturbation. The thermal energy kT (k is the Boltzmann constant and T is the temperature in Kelvin) is normalized to the logic or clique energy; for example, kT = 0.1 can be interpreted as the unit logic energy being 10 times the thermal energy. The logic margin of a node in a Boolean network, defined here as the difference between the probabilities of occurrence of a logic low and a logic high, decreases at higher values of kT and increases at lower values. Higher logic margins mean lower entropy, i.e., less uncertainty in computation, and hence more reliable computation. High thermal perturbations are therefore likely to degrade the reliability of computation, whereas keeping a system far from such temperatures is likely to improve it. The model of computation in [65] thus considers such thermal perturbations, along with discrete errors and continuous signal noise, as sources of errors.
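The effect of kT on the logic margin can be sketched under a simple Gibbs-distribution assumption (not the exact model of [65]): each logic state occurs with probability proportional to exp(-E/kT), where E is its normalized clique energy. With unit energy separating the two logic states, the margin shrinks as kT grows:

```python
import math

def logic_margin(kT, e_low=1.0, e_high=0.0):
    """Gibbs-distribution sketch: a node's two logic states occur with
    probability proportional to exp(-E/kT); the logic margin is the
    difference between the two state probabilities."""
    w_low = math.exp(-e_low / kT)
    w_high = math.exp(-e_high / kT)
    z = w_low + w_high                      # partition function
    return abs(w_high / z - w_low / z)

# Margin shrinks as thermal energy grows relative to the unit logic energy.
for kT in (0.1, 0.5, 1.0):
    print(kT, logic_margin(kT))
```

At kT = 0.1 the margin is nearly 1 (the logic value is almost deterministic), while at kT = 1.0 the thermal energy equals the logic energy and the margin collapses, matching the qualitative behavior described above.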
10.4.5. Scalability Problem
Evaluating the reliability of large circuits is important since realistic electronic
systems are significantly large, having millions to billions of devices. The
aforementioned reliability analysis methodologies do not scale well and, hence,
are not efficient at evaluating sufficiently large circuits. For a circuit with m inputs and n outputs, the PMC-based methodology has a worst-case space and run-time complexity of O(2^(m+n)), because it involves complete enumeration of all possible input and output combinations.
Although multiterminal binary decision diagrams (MTBDD) [42] are used to
compress the state space representation, there is a high demand on memory and
run-time for analyzing large circuits. The space and time complexity of a circuit PTM is also of order O(2^(m+n)). The PTM method uses algebraic decision diagrams (ADDs) [64], which are equivalent to MTBDDs [63, 64], to compress the PTM representation. As with the PMC-based methodology, this compression does not reduce the space and time complexity for many typical large circuit PTMs. The
PGM-based methodology has also been shown to have run-time issues for circuits
with large m [32].
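The O(2^(m+n)) growth is easy to see by constructing PTMs directly. A minimal sketch (our own illustration, not the ADD-compressed implementation of [64]): a 2-input gate with failure probability eps has a 2^2 x 2^1 PTM, and composing gates in parallel takes a Kronecker product, so the matrix dimensions multiply:

```python
def nand2_ptm(eps):
    """PTM of a 2-input NAND: rows index the 4 input combinations,
    columns give [P(out = 0), P(out = 1)].
    The gate flips its ideal output independently with probability eps."""
    ptm = []
    for a in (0, 1):
        for b in (0, 1):
            ideal = 1 - (a & b)            # fault-free NAND output
            row = [0.0, 0.0]
            row[ideal] = 1.0 - eps         # correct with probability 1 - eps
            row[1 - ideal] = eps           # flipped with probability eps
            ptm.append(row)
    return ptm                             # 4 x 2 matrix

def kron(A, B):
    """Kronecker product: the PTM of two independent gates in parallel."""
    return [[a * b for a in ra for b in rb] for ra in A for rb in B]

g = nand2_ptm(0.001)
two = kron(g, g)                           # two NANDs side by side
print(len(g), len(g[0]))                   # 4 x 2
print(len(two), len(two[0]))               # 16 x 4
```

Each parallel composition doubles both exponents, so a circuit with m inputs and n outputs ends up with a 2^m x 2^n matrix, i.e., O(2^(m+n)) entries before any decision-diagram compression.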
Figure 10.13 shows a circuit with inputs A0, B0, C0 and D0, each having a
probability of 0.9 of being logic high. The NAND and NOR gates can fail
independently with a probability of 0.001. There are four intermediate outputs o1,
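Figure 10.13 itself is not reproduced here, but the gate-level probability propagation it illustrates can be sketched as follows (a minimal sketch assuming independent inputs and a symmetric gate-failure model; the function name is ours):

```python
def p_nand(pa, pb, eps):
    """P(output = 1) of a NAND gate whose inputs are independent and
    logic high with probabilities pa and pb, and which flips its ideal
    output independently with probability eps."""
    ideal = 1.0 - pa * pb                  # fault-free P(out = 1)
    return (1.0 - eps) * ideal + eps * (1.0 - ideal)

# e.g. a NAND fed by two inputs that are each high with probability 0.9,
# with gate failure probability 0.001:
print(p_nand(0.9, 0.9, 0.001))             # 0.19062
```

Propagating such probabilities gate by gate gives the signal probability at each intermediate output, from which the circuit's overall reliability can be estimated.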