Fig. 16.7. The function relating the dissimilarity to the central coefficient of the butterfly filter
We still need to define the filters for different orientations θ. This is done by rotating the mask defined for the horizontal direction θ = 0 and redistributing the weights to match the grid of the image. The filters are computed for a fixed number of directions (eight here) and are stored in memory as a lookup table.
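The rotation and weight redistribution can be made concrete with a short sketch. The following Python fragment is an illustration only, not the book's implementation: the function names, the bilinear redistribution scheme, and the example half-mask weights are assumptions; the text fixes only the idea of rotating the θ = 0 mask and storing one filter per direction in a lookup table.

import numpy as np

def rotate_mask(mask, theta):
    """Rotate a mask defined for theta = 0 by the angle theta and
    redistribute each weight bilinearly onto the integer grid
    (bilinear redistribution is an assumption of this sketch)."""
    n = mask.shape[0]
    c = (n - 1) / 2.0                        # center of the mask
    out = np.zeros_like(mask, dtype=float)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    for i in range(n):
        for j in range(n):
            w = mask[i, j]
            if w == 0.0:
                continue
            y, x = i - c, j - c              # coordinates relative to center
            xr = cos_t * x - sin_t * y + c   # rotated position on the grid
            yr = sin_t * x + cos_t * y + c
            x0, y0 = int(np.floor(xr)), int(np.floor(yr))
            fx, fy = xr - x0, yr - y0
            # split the weight over the four surrounding grid points
            for yy, xx, frac in ((y0, x0, (1 - fy) * (1 - fx)),
                                 (y0, x0 + 1, (1 - fy) * fx),
                                 (y0 + 1, x0, fy * (1 - fx)),
                                 (y0 + 1, x0 + 1, fy * fx)):
                if 0 <= yy < n and 0 <= xx < n:
                    out[yy, xx] += w * frac
    return out

# Hypothetical 3 x 3 half-mask (one butterfly wing) for theta = 0; the
# actual weights follow the dissimilarity profile of Fig. 16.7.
half = np.array([[0.0, 0.0, 0.0],
                 [1.0, 1.0, 0.0],
                 [0.0, 0.0, 0.0]])

# Lookup table with one rotated mask per direction (eight, as in the text).
butterfly_lut = [rotate_mask(half, k * np.pi / 8) for k in range(8)]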
Starting from the coarsest level l = L, the boundary refinement procedure can be enumerated as follows:
1. Project down the labels, i.e., Γ(r_k, l − 1) = Γ(r_k/2, l), and define the boundary region β at level l − 1. Compute the mean μ^n and variance (σ^n)^2 for all classes.
2. For each boundary pixel at level l, compute the dominant direction θ using Eqs. (16.10) and (16.11). Determine the two classes A and B on both sides of the boundary. Choose the corresponding butterfly filter and propagate its identity to the corresponding pixels at level l − 1 (steps 1 and 2 are sketched in code after this list).
3. For each pixel r_k ∈ β, apply the filter corresponding to the current position to (all components of) the feature vectors f(r_k, l − 1). If a feature vector participating in the averaging lies outside of β, take the vector of its corresponding prototype instead. The left and right halves of the filters are applied separately, which reduces the risk of smoothing across the boundaries. Two responses are obtained, h_A(r_k, l − 1) and h_B(r_k, l − 1). This smoothing is repeated a number of times found empirically (four in the examples) using a small filter size (3 × 3 here). This is equivalent to using a large filter size in a single iteration, but it is computationally faster (see the second sketch after this list).
4. For each pixel r_k, compute the four distances between the two filter responses h_A, h_B and the prototypes μ^A, μ^B corresponding to the classes A and B on each side of the boundary region, i.e., ‖μ^A − h_A‖, ‖μ^A − h_B‖, ‖μ^B − h_A‖, and ‖μ^B − h_B‖ (see the third sketch after this list).
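A minimal sketch of steps 1 and 2, under the same caveats as the earlier fragment: the helper names and the 4-neighbor boundary test are assumptions, and the dominant direction θ is taken as an input because Eqs. (16.10) and (16.11) are not part of this excerpt.

import numpy as np

def project_labels(labels_l):
    """Step 1: project the labels down one level, Γ(r_k, l-1) = Γ(r_k/2, l),
    i.e., each coarse pixel is replicated over its four children."""
    return np.kron(labels_l, np.ones((2, 2), dtype=labels_l.dtype))

def boundary_region(labels):
    """Boundary region β: pixels having a 4-neighbor with another label."""
    b = np.zeros(labels.shape, dtype=bool)
    dv = labels[:-1, :] != labels[1:, :]     # vertical label changes
    dh = labels[:, :-1] != labels[:, 1:]     # horizontal label changes
    b[:-1, :] |= dv; b[1:, :] |= dv
    b[:, :-1] |= dh; b[:, 1:] |= dh
    return b

def class_stats(features, labels):
    """Mean μ^n and variance (σ^n)^2 of the feature vectors per class."""
    return {n: (features[labels == n].mean(axis=0),
                features[labels == n].var(axis=0))
            for n in np.unique(labels)}

def filter_index(theta, n_dir=8):
    """Step 2: index of the lookup-table filter whose direction is
    closest to θ (orientations are taken modulo π)."""
    return int(np.round((theta % np.pi) / (np.pi / n_dir))) % n_dir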
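Step 3 can be sketched as follows, again with simplifications flagged as such: a single pair of half-masks is used for every pixel (the text selects a filter per pixel in step 2), and np.roll wraps at the image border where a real implementation would pad.

import numpy as np

def smooth_halves(features, beta, labels, prototypes, left, right, n_iter=4):
    """Step 3: iterate a small (3 x 3) smoothing, applying the two
    butterfly halves separately to obtain the responses h_A and h_B.
    'prototypes' maps a class label to its (mean, variance) pair."""
    hA = features.astype(float)
    hB = features.astype(float)
    for _ in range(n_iter):                  # four iterations in the text
        for resp, mask in ((hA, left), (hB, right)):
            # vectors outside β enter the average as their class prototype
            f = resp.copy()
            for n, (mu, _) in prototypes.items():
                f[(labels == n) & ~beta] = mu
            out = np.zeros_like(f)
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    w = mask[dy + 1, dx + 1]
                    if w:
                        out += w * np.roll(f, (-dy, -dx), axis=(0, 1))
            resp[...] = out / mask.sum()
    return hA, hB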
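Finally, a sketch of the four distances of step 4. The excerpt breaks off after listing them, and the distance metric is not specified here (step 1 computes class variances, so a variance-normalized distance would also be plausible); a plain Euclidean norm is used as a placeholder.

import numpy as np

def four_distances(hA, hB, prototypes, A, B):
    """Step 4: per-pixel distances ||μ^A - h_A||, ||μ^A - h_B||,
    ||μ^B - h_A||, ||μ^B - h_B||. The Euclidean norm is an assumption."""
    muA, muB = prototypes[A][0], prototypes[B][0]
    dist = lambda h, mu: np.linalg.norm(h - mu, axis=-1)
    return dist(hA, muA), dist(hB, muA), dist(hA, muB), dist(hB, muB)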