existence of the background. Entropy is a measure of the expected gain in information, or equivalently the expected loss of ignorance, associated with a probability distribution. Thus, H(O|B) can also be viewed as the average loss of ignorance about the object when we are told about the background. A similar interpretation applies to H(B|O). Hence, maximizing H_TC is expected to result in a good threshold. H_TC can also be viewed as a measure of contrast.
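To make the criterion concrete, the following sketch uses the common conditional-entropy formulation: build the gray-level co-occurrence matrix, treat levels up to a candidate threshold s as background and the rest as object, compute H(O|B) and H(B|O) from the two across-boundary quadrants (each normalized to a probability mass function within its quadrant), and pick the s that maximizes H_TC = H(O|B) + H(B|O). The function names and the use of NumPy are assumptions for illustration, not the text's own implementation.

```python
import numpy as np

def cooccurrence_matrix(img, levels=256):
    """Count horizontal and vertical nearest-neighbour gray-level transitions.

    img is expected to be a 2-D array of integer gray levels in [0, levels).
    """
    t = np.zeros((levels, levels), dtype=np.float64)
    # transitions between each pixel and its right neighbour
    np.add.at(t, (img[:, :-1].ravel(), img[:, 1:].ravel()), 1)
    # transitions between each pixel and the pixel below it
    np.add.at(t, (img[:-1, :].ravel(), img[1:, :].ravel()), 1)
    return t

def quadrant_entropy(block):
    """Entropy of one quadrant, normalised so the quadrant sums to 1."""
    total = block.sum()
    if total == 0:
        return 0.0
    p = block[block > 0] / total
    return -np.sum(p * np.log2(p))

def conditional_entropy_threshold(img, levels=256):
    """Return the threshold s that maximises H_TC(s) = H(O|B) + H(B|O)."""
    t = cooccurrence_matrix(img, levels)
    best_s, best_htc = 0, -np.inf
    for s in range(levels - 1):
        # levels 0..s are treated as background, s+1..levels-1 as object
        h_o_given_b = quadrant_entropy(t[:s + 1, s + 1:])  # background -> object quadrant
        h_b_given_o = quadrant_entropy(t[s + 1:, :s + 1])  # object -> background quadrant
        htc = h_o_given_b + h_b_given_o
        if htc > best_htc:
            best_s, best_htc = s, htc
    return best_s
```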
Let th be the correct threshold for an object/background segmentation. Now if th is used to partition the co-occurrence matrix, entries in quadrants two and four of Figure 2.1 will have low frequencies, but they are expected to be more or less uniformly distributed. Similarly, frequencies in the first and third quadrants will also be uniformly distributed, but with high values, because within a region the frequency of transition from one level to another is high. However, as far as the two-dimensional probability distribution is concerned, all cells will have a more or less uniform probability mass function. Now suppose the assumed threshold s is less than th. The second quadrant will then contain some high frequencies that are actually transitions within the object. In addition, it will also contain the genuine low-frequency transitions from object to background (i.e., across the boundary). Thus, the second quadrant will have a highly skewed probability distribution, resulting in a drastic lowering of H_TC.
Fig. 2.1. Partitioning of the co-occurrence matrix for thresholding.
The uniformity of quadrant one will be maintained, but that of quadrants three and four will be affected, causing a lowering of their entropies. Similarly, if the assumed threshold is greater than th, H_TC will be lowered in an analogous way.
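The argument above predicts that H_TC peaks at the correct threshold and falls off as the assumed threshold s drifts away from th. A small check of that behaviour, reusing the helper functions sketched earlier, might look like the following; the synthetic image, its gray levels, and the noise level are arbitrary choices made here for illustration.

```python
import numpy as np

# Synthetic two-region image: dark background (mean 60), bright square object
# (mean 180), both perturbed by Gaussian noise, so th lies somewhere near 120.
rng = np.random.default_rng(0)
img = np.full((128, 128), 60.0)
img[32:96, 32:96] = 180.0
img = np.clip(img + rng.normal(0, 10, img.shape), 0, 255).astype(np.uint8)

# Print H_TC for a few candidate thresholds around the expected one.
t = cooccurrence_matrix(img)
for s in (60, 90, 120, 150, 180):
    htc = (quadrant_entropy(t[:s + 1, s + 1:]) +
           quadrant_entropy(t[s + 1:, :s + 1]))
    print(f"s = {s:3d}  H_TC = {htc:.3f}")

print("chosen threshold:", conditional_entropy_threshold(img))
```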