The multi-scale aspect senses the different temporal frequencies in the underlying signal, comparable to what the context-switching metric does, but instead pairs or groups the data points to measure a different coarse-graining effect. This works out fairly straightforwardly in terms of an autocorrelation. The essential algorithm groups adjacent time samples together in a window of length Scale as the coarse-graining or moving-average measure. It then counts the number of times, n, that the amplitude changes from one coarse-grained time step to the next.
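A minimal sketch of this coarse-graining and counting step, assuming a NumPy environment; the function names and the optional tolerance parameter are illustrative, not taken from the source:

    import numpy as np

    def coarse_grain(signal, scale):
        """Average adjacent samples in non-overlapping windows of length
        `scale` (the moving-average / coarse-graining measure)."""
        n_windows = len(signal) // scale
        trimmed = np.asarray(signal[:n_windows * scale], dtype=float)
        return trimmed.reshape(n_windows, scale).mean(axis=1)

    def count_amplitude_changes(coarse, tolerance=0.0):
        """Count the steps, n, where the coarse-grained amplitude changes
        by more than `tolerance` from one time step to the next."""
        diffs = np.abs(np.diff(coarse))
        return int(np.sum(diffs > tolerance))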
If the amplitude does not change for a given coarse-grained scale, then the signal is predictable and the entropy will be low. To calculate the sample entropy they compute
$S_E = -\log\left(\frac{n_{m+1}}{n_m}\right)$    (6)
over each of the scale factors Scale = 1 .. maxScaleFactor .
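The sketch below puts Eq. (6) and the scale-factor loop together, reusing the coarse_grain helper from above. The template length m = 2 and the tolerance r = 0.15 of the signal's standard deviation are commonly used defaults assumed here, not values specified in the text:

    def sample_entropy(x, m, r):
        """Eq. (6): minus the log of the ratio of (m+1)-length template
        matches to m-length template matches, within tolerance r."""
        x = np.asarray(x, dtype=float)

        def count_matches(length):
            templates = np.array([x[i:i + length]
                                  for i in range(len(x) - length + 1)])
            count = 0
            for i in range(len(templates) - 1):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += int(np.sum(dist <= r))
            return count

        n_m, n_m1 = count_matches(m), count_matches(m + 1)
        if n_m == 0 or n_m1 == 0:
            return float('inf')        # undefined when no templates match
        return -np.log(n_m1 / n_m)

    def multiscale_entropy(signal, max_scale=20, m=2, r=None):
        """Sample entropy of the coarse-grained signal for each
        Scale = 1 .. max_scale."""
        signal = np.asarray(signal, dtype=float)
        if r is None:
            r = 0.15 * np.std(signal)  # tolerance fixed from the raw signal
        return [sample_entropy(coarse_grain(signal, s), m, r)
                for s in range(1, max_scale + 1)]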
3.1 Usage domains
A graph of the multi-scale entropy will appear flat if the underlying behaviour is "1/f" noise (van der Ziel, 1950), the so-called pink noise. Pink noise shows a predictable, constant change of amplitude density per scale factor; in other words, it has constant energy per frequency doubling, while white noise shows constant energy per frequency interval (Montroll & Shlesinger, 2002).
In comparison to this structure-less noise, if structure does exist in the signal, you will see observable changes in the entropy from one scale factor to the next. For example, a superimposed sine wave produces a downward spike in sample entropy when the scale factor crosses one of its harmonics.
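As an illustration of that behaviour, the snippet below feeds white noise and FFT-shaped pink noise (a simple, assumed generation method) through the multiscale_entropy sketch above; per the text, the pink-noise curve should stay roughly flat across scales while the white-noise curve falls:

    rng = np.random.default_rng(0)
    n = 4096

    # White noise: constant energy per frequency interval.
    white = rng.standard_normal(n)

    # Pink noise via spectral shaping: divide amplitudes by sqrt(f) so power
    # falls off as 1/f (constant energy per frequency doubling).
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                  # avoid dividing by zero at DC
    pink = np.fft.irfft(spectrum / np.sqrt(freqs), n)

    print(multiscale_entropy(white, max_scale=10))  # expected to fall with scale
    print(multiscale_entropy(pink, max_scale=10))   # expected to stay roughly flat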
A simple interpretation suggests that we scale the measured results relative to the 1/f-noise part of the signal. 1/f noise includes the greatest variety of frequencies of any known behaviour and therefore the highest entropy (Milotti, 2002), so a visualization or graph that plots the 1/f asymptotic value lets us immediately gauge the complexity of a signal. Costa et al. discuss the difficulty of distinguishing between randomness and increasing complexity, which has importance in the realm of event-driven systems:
"In fact, entropy-based metrics are maximized for
random sequences, although it is generally
accepted that both perfectly ordered and
maximally disordered systems possess no
complex structures. A meaningful physiologic
complexity measure, therefore, should vanish for
these two extreme states."
This is a key insight, and one echoed by researchers in complexity theory (Gell-Mann, 1994): the most interesting and challenging complexity measures occupy the middle range of the complexity scale. In other words, the most ordered signals can be described by a few harmonic periods, while at the other extreme the complexity reduces to simple stochastic measures akin to statistical mechanics (Reif, 1965). In between these extremes, we require a different level of sophistication.