[Fig. 4: six panels showing approximations at max_error = E × 2^i (i = 1 to 6), ranging from too fine an approximation through a "correct" approximation to too coarse an approximation.]
Fig. 4. We are most interested in comparing the segmentation algorithms at the setting of the user-defined threshold max_error that produces an intuitively correct level of approximation. Since this setting is subjective, we chose a value for E such that max_error = E × 2^i (i = 1 to 6) brackets the range of reasonable approximations.
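The threshold schedule described above can be sketched in a few lines. The value of E here is a hypothetical placeholder; the text chooses E subjectively per dataset.

```python
# Sketch of the Fig. 4 threshold schedule: max_error = E * 2**i for
# i = 1..6 brackets the range from too-fine to too-coarse approximations.
E = 0.1  # assumed base value, for illustration only

thresholds = [E * 2**i for i in range(1, 7)]
# Doubling the threshold at each step coarsens the approximation,
# spanning the subjective "correct" setting somewhere in the middle.
```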
Since we are only interested in the relative performance of the algorithms, for each setting of max_error on each data set we normalized the performance of the three algorithms by dividing by the error of the worst-performing approach.
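The normalization step above can be sketched as follows. The algorithm names match the text, but the error values are hypothetical, for illustration only.

```python
# Normalize per-setting errors so the worst performer scores 1.0,
# leaving only the relative performance of the three algorithms.
def normalize_errors(errors):
    """Scale a dict of {algorithm: error} by the largest error."""
    worst = max(errors.values())
    return {alg: err / worst for alg, err in errors.items()}

# Hypothetical errors for one (dataset, max_error) setting:
errors = {"Sliding Windows": 12.4, "Top-Down": 6.1, "Bottom-Up": 5.8}
relative = normalize_errors(errors)
# The worst approach (here Sliding Windows) maps to 1.0; the others
# are expressed as fractions of its error.
```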
3.2. Experimental Results
The experimental results are summarized in Figure 5. The most obvious result is the generally poor quality of the Sliding Windows algorithm. With a few exceptions, it is the worst-performing algorithm, usually by a large margin.
Comparing the results for Sine cubed and Noisy Sine supports our conjecture that the noisier a dataset, the less difference one can expect between algorithms. This suggests that one should exercise caution in attempting to generalize the performance of an algorithm that has only been demonstrated on a single noisy dataset [Qu et al. (1998), Wang and Wang (2000)].
Top-Down does occasionally beat Bottom-Up, but only by a small amount. On the other hand, Bottom-Up often significantly outperforms Top-Down, especially on the ECG, Manufacturing, and Water Level data sets.
4. A New Approach
Given the noted shortcomings of the major segmentation algorithms, we investigated alternative techniques. The main problem with the Sliding Windows algorithm is its inability to look ahead, lacking the global view of its offline (batch) counterparts. The Bottom-Up and the Top-Down