New events can also be handled gracefully, either by adding a new thread, task, or interrupt that watches for their occurrence, or by adding an event handler to a queue.
Unfortunately, these benefits of scalability and flexibility come at the cost of being able to reason about the execution semantics of a program, introducing an element of non-determinism.
This suggests an obvious opportunity to introduce stochastic ideas in order to understand, qualitatively and quantitatively, how complex a design has become. If we can attach a simple and fundamental measure to this complexity, the resulting metric can be used to compare software designs or to provide a continuous benchmark of growing complexity. That leads to the notion of an entropy-based measure to characterize the behaviour of a concurrent software system.
2. Approach - The context-switching entropy measure
The premise is to evaluate the complexity of a concurrently executing program. A novel
approach of creating such a complexity metric involves the analysis of the context-switching
frequency spectrum. We take a Fourier transform (
FT
) of the temporally distributed context
switching events,
c(t),
and treat that as a probability density function (Feller, 1957) in
frequency space (i.e. a normalized power spectrum).
()=()=
()∙
(1)
Then the Shannon information entropy (S) of p(f) (Leon-Garcia, 2008) generates a simple complexity measure:

$$S = -\int p(f) \cdot \ln p(f)\, df \qquad (2)$$
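A minimal sketch of this measure, assuming the context-switch record has already been binned into a uniformly sampled indicator signal (1 in time bins where a switch occurred, 0 elsewhere); the function name `spectral_entropy` and the toy signals are ours, not from the original text:

```python
import numpy as np

def spectral_entropy(c):
    """Shannon entropy of the normalized power spectrum of a signal c(t),
    i.e. eqs. (1)-(2) computed with a discrete FFT."""
    C = np.fft.rfft(c)             # Fourier transform of c(t)
    power = np.abs(C) ** 2         # power spectrum |C(f)|^2
    p = power / power.sum()        # normalize to a probability density p(f)
    p = p[p > 0]                   # drop empty bins (0 * ln 0 := 0)
    return -np.sum(p * np.log(p))  # S = -sum p(f) ln p(f)

rng = np.random.default_rng(0)
regular = np.tile([1, 0, 0, 0], 256).astype(float)   # strictly periodic switching
random_ = (rng.random(1024) < 0.25).astype(float)    # same rate, random timing
print(spectral_entropy(regular), spectral_entropy(random_))
```

The periodic pattern concentrates p(f) in a few spectral lines and yields a low entropy, while random switching at the same average rate spreads power across the spectrum and yields a much higher value, which is exactly the behaviour the metric is intended to capture.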
The entropy of the spectrum is meaningful because the power spectrum is itself the Fourier transform of the autocorrelation of the signal in the time domain. The autocorrelation is nothing more than the probability of a given time shift occurring between points of interest. Thus, in transforming the time-domain information to the frequency domain we do not lose the essential information about order and disorder in the signal, particularly when comparing one waveform against another; the information describing the essential order is largely retained across the transformation.
$$R(\tau) \approx p(\tau) = \int_{-\infty}^{\infty} c(t) \cdot c(t+\tau)\, dt \qquad (3)$$
So the Fourier transform represents the same probabilities as the autocorrelation, but computed in the frequency domain. For any captured sequence of events, the Fourier transform of the autocorrelation is practically obtained as the squared magnitude of the Fourier transform of the waveform:

$$P(f) = \mathcal{F}\{R(\tau)\} = |\mathcal{F}(c(t))|^2 \qquad (4)$$
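The equivalence in (4) (the Wiener-Khinchin relation) can be verified numerically on a discrete event record; the circular autocorrelation and the test signal below are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
c = (rng.random(256) < 0.3).astype(float)  # a random context-switch record

# Left side of (4): Fourier transform of the (circular) autocorrelation R(tau)
R = np.array([np.sum(c * np.roll(c, -k)) for k in range(len(c))])
lhs = np.fft.fft(R).real

# Right side of (4): squared magnitude of the Fourier transform of c(t)
rhs = np.abs(np.fft.fft(c)) ** 2

print(np.allclose(lhs, rhs))  # True
```

This is also why the power spectrum is computed directly from the FFT of the waveform in practice: it avoids ever forming the autocorrelation explicitly.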
For a given mean frequency <f>, the value of S is greatest when the probability spread of the context-switching frequency components is maximized. In this case complexity tracks entropy, and the maximum entropy principle (Jaynes, 2003) can be applied to understand the trend toward greater complexity. By Parseval's theorem, the total signal energy is preserved when we take the Fourier transform.
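Parseval's theorem can be checked in one line for the discrete case; note that NumPy's unnormalized FFT convention requires a 1/N factor on the frequency side (the example signal is ours):

```python
import numpy as np

c = np.array([1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0])
time_energy = np.sum(np.abs(c) ** 2)
freq_energy = np.sum(np.abs(np.fft.fft(c)) ** 2) / len(c)  # 1/N for NumPy's convention
print(time_energy, freq_energy)
```

Because the energy is identical in both domains, normalizing the power spectrum into p(f) discards no aggregate signal content, only redistributes it over frequency.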
This general approach is also known as Frequency Domain Entropy (Bao, 2004).