4.1 Applications to design
The difficulty of designing and debugging such systems calls for practical simulation
tools and simplified metrics such as the entropy measures described here. Both entropy
metrics are potentially useful during system development as analysis-of-alternatives or
design-space-exploration tools. If several concurrent options are available for a design,
either metric can provide a criterion for selecting the least complex one. In that sense, the
metric serves as a simple utility function for detecting unwanted or creeping complexity,
much as duty-cycle utilization measures processor contention.
4.2 Applications to test and verification
A close connection exists between the entropy metrics and usage modelling (Whittaker,
1994) for program verification. Most non-exhaustive testing requires a mix of tests taken
under nominal conditions along with tests sampled according to potentially rare
conditions. This consideration accounts for both the number of test vectors and the path
coverage achieved. Any characterization at this stage provides useful input for
generating a stochastic measure of reliability. Such a measure can incorporate stochastic
usage models; a log-likelihood metric is often used to compare two state-space probability
distributions, as sketched below. Together, an entropic measure and a usage model cover
the temporal and path dimensions of a program's execution as well as its programmatic
complexity.
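The log-likelihood comparison mentioned above can be illustrated with a small sketch. The states, counts, and candidate usage models below are purely illustrative assumptions, not data from this study: given observed state-occupancy counts, the usage model that assigns the higher log-likelihood to the observations is the better fit.

import numpy as np

observed_counts = np.array([120, 45, 30, 5])   # visits to states A, B, C, D (hypothetical)
q1 = np.array([0.60, 0.25, 0.12, 0.03])        # candidate usage model 1
q2 = np.array([0.40, 0.30, 0.20, 0.10])        # candidate usage model 2

def log_likelihood(q):
    # Multinomial log-likelihood of the observed counts, up to a constant term.
    return np.sum(observed_counts * np.log(q))

print(log_likelihood(q1), log_likelihood(q2))  # the larger value marks the better-fitting model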
4.3 Applications to diagnostics
As a diagnostic tool, the context-switching metric can also detect emerging complexity
during execution. Since the FFT can easily be computed in real time for typical sample sizes
N , running the entropy algorithm in parallel with the context-switching data collection can
reveal deviations from expected operation. For example, if an execution profile shows highly
regular, frequent context switches during some interval and then transitions to a more
irregular sequence of switches with the same overall density, the entropy measure will
increase. In that sense, the entropy metric measures an intrinsic property of the signal:
strictly speaking, density fluctuations such as expected increases in the rate of context
switches do not influence the measure. In other words, density alone does not affect the
complexity, as the sketch below illustrates.
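A minimal sketch of this diagnostic use, assuming the context-switch events have been binned into a sampled signal c(t); the bin width, trace length, and switch density are illustrative and this is not the authors' implementation. A perfectly regular switch train concentrates its spectral energy in a few lines and yields low entropy, while an irregular train with the same overall density spreads its energy across frequencies and yields higher entropy.

import numpy as np

def spectral_entropy(c):
    """Shannon entropy of the normalized power spectrum of c(t)."""
    power = np.abs(np.fft.rfft(c)) ** 2
    power = power[1:]              # drop the DC term (mean switch rate)
    p = power / power.sum()        # normalize to a probability density p(f)
    p = p[p > 0]                   # avoid log(0)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
n = 1024

# Regular schedule: one context switch every 8 samples.
regular = np.zeros(n)
regular[::8] = 1

# Irregular schedule: the same number of switches (same density) at random positions.
irregular = np.zeros(n)
irregular[rng.choice(n, size=n // 8, replace=False)] = 1

print(spectral_entropy(regular))    # low entropy: energy in a few spectral lines
print(spectral_entropy(irregular))  # higher entropy: energy spread across frequencies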
By the same token, the multi-scale metric is well suited to detecting long-term
complexity changes or short-term bursts buried in a nominally sampled signal. The idea of
using frequency-domain entropy for diagnostics of complex machinery is explored further
in (Shen, 2000).
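One plausible realization of such a multi-scale check, offered here as an assumption rather than the authors' code, is to recompute the spectral entropy over sliding windows of several lengths, so that slow drifts and short bursts appear at different scales.

import numpy as np

def windowed_entropy(c, window, hop):
    """Spectral entropy of c(t) over sliding windows of the given length."""
    out = []
    for start in range(0, len(c) - window + 1, hop):
        power = np.abs(np.fft.rfft(c[start:start + window])) ** 2
        power = power[1:]                  # drop the DC term
        total = power.sum()
        if total == 0:
            out.append(0.0)                # window with no switches: nothing to measure
            continue
        p = power / total
        p = p[p > 0]
        out.append(float(-np.sum(p * np.log2(p))))
    return np.array(out)

# Example (window and hop sizes are illustrative):
# short_scale = windowed_entropy(c, window=256, hop=64)     # catches brief bursts
# long_scale  = windowed_entropy(c, window=4096, hop=1024)  # tracks slow complexity drift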
5. Conclusion
We described a complexity metric for concurrent software-controlled systems or concurrent
realizations of behaviour. The novel aspect of the approach is that the complexity metric for
context switching is derived from the switching frequency spectrum. We take a Fourier
transform of the temporally distributed context switching events ( c(t) ) and treat the
normalized power spectrum as a probability density function p(f) in frequency space. The
entropy ( S ) of p(f) then yields a simple complexity measure.
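Concretely, writing C(f_k) for the discrete Fourier transform of c(t), the metric amounts to (the base of the logarithm is a matter of convention):

p(f_k) = \frac{|C(f_k)|^2}{\sum_j |C(f_j)|^2}, \qquad S = -\sum_k p(f_k) \log p(f_k)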
The context-switching metric can be used during system development as an
analysis-of-alternatives utility function. If several design options or algorithms are
available, the context-switching entropy provides a criterion for identifying the least
complex one.