where κ is an empirically chosen weighting exponent controlling the amount of fuzziness, κ ∈ [1, ∞[. The scalars u_{nk} ∈ [0, 1] are the elements of the matrix U = [u_1, ···, u_k, ···, u_K], which represent a fuzzy partition of the feature vectors such that a column u_k represents the degree of belongingness of the feature vector f_k to each of the N_c different classes. In such a partitioning, the class prototypes are represented by the columns of the matrix V = [v_1, ···, v_n, ···, v_{N_c}]. Thus U, V are updated by the partitioning process, whereas the feature set F = [f_1, ···, f_k, ···, f_K], which is the set to be categorized into N_c classes, is fixed. The elements d_{nk}² = ‖f_k − v_n‖², where ‖·‖ is an inner product induced norm on the M-dimensional Euclidean space, E^M, of the features, represent the squared distances between two vectors in the feature space. Restated, the problem is to find the best pair (U, V) that minimizes the error function e_κ. It can be shown [16] that e(U, V) may be globally minimal for (U, V) only if

u_{nk} = \frac{1}{\sum_{j=1}^{N_c} \left( d_{nk} / d_{jk} \right)^{2/(\kappa - 1)}},    (16.8)

v_n = \sum_{k=1}^{K} (u_{nk})^{\kappa} f_k \Big/ \sum_{k=1}^{K} (u_{nk})^{\kappa}, \qquad \text{for } n = 1, \cdots, N_c.    (16.9)
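To make the coupled update rules concrete, the following is a minimal NumPy sketch, assuming ‖·‖ is the ordinary Euclidean norm, the feature vectors are stored as the rows of an array, and κ > 1; the function names, array layout, and the small guard against zero distances are choices of this sketch, not taken from the text.

import numpy as np

def update_memberships(F, V, kappa):
    """Eq. (16.8): fuzzy partition U computed from the current prototypes V.

    F : (K, M) array, one feature vector f_k per row.
    V : (N_c, M) array, one class prototype v_n per row.
    Returns U with shape (N_c, K); each column sums to 1.
    """
    # d[n, k] = ||f_k - v_n||, the (inner-product induced) Euclidean distance
    d = np.linalg.norm(F[None, :, :] - V[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)                       # guard against a zero distance
    # ratio[n, j, k] = (d_nk / d_jk)^(2/(kappa - 1))
    ratio = (d[:, None, :] / d[None, :, :]) ** (2.0 / (kappa - 1.0))
    return 1.0 / ratio.sum(axis=1)                 # u_nk = 1 / sum_j (d_nk/d_jk)^(2/(kappa-1))

def update_prototypes(F, U, kappa):
    """Eq. (16.9): prototypes v_n as (u_nk)^kappa-weighted means of the features."""
    W = U ** kappa                                 # weights (u_nk)^kappa, shape (N_c, K)
    return (W @ F) / W.sum(axis=1, keepdims=True)  # (N_c, M), one v_n per row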
The fuzzy C-means algorithm, which approximates a solution of the minimization
problem, can be stated as follows:
1. Start: Fix N_c, choose a norm (based on an inner product) ‖·‖ for E^M, fix κ, where 1 ≤ κ ≤ ∞, and initialize U^(0).
2. Update centers: Calculate the N_c fuzzy cluster centers V = [v_1, ···, v_{N_c}] with Eq. (16.9) and the current fuzzy partitions, U.
3. Update partitions: Update U using Eq. (16.8) and the current class centers, V .
4. Exit? If ‖U^(l+1) − U^(l)‖ falls below a chosen small threshold, then stop; otherwise go to step 2 (see the code sketch after this list).
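Putting steps 1-4 together, a sketch of the full iteration could look as follows. It reuses update_memberships and update_prototypes from the sketch after Eq. (16.9); the random initialization of U^(0), the Frobenius norm in the stopping test, and the tolerance tol are illustrative choices, not prescribed by the text.

import numpy as np

def fuzzy_c_means(F, n_classes, kappa=2.0, tol=1e-5, max_iter=300, seed=0):
    """Alternate Eq. (16.9) and Eq. (16.8) until U changes very little."""
    rng = np.random.default_rng(seed)
    # Step 1: random fuzzy partition U^(0), columns normalized to sum to 1
    U = rng.random((n_classes, F.shape[0]))
    U /= U.sum(axis=0, keepdims=True)
    for _ in range(max_iter):
        V = update_prototypes(F, U, kappa)         # step 2, Eq. (16.9)
        U_new = update_memberships(F, V, kappa)    # step 3, Eq. (16.8)
        if np.linalg.norm(U_new - U) < tol:        # step 4: ||U^(l+1) - U^(l)|| small?
            return U_new, V
        U = U_new
    return U, V

A label image Γ can then be formed by giving each pixel position the class of its largest membership, e.g. labels = U.argmax(axis=0), which for the converged prototypes coincides with choosing the class of smallest distance.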
As κ → 1, the fuzzy C-means converges to a "generalized" hard C-means solution (ISODATA) [11], discussed in Example 16.1 and Eq. (16.7). The algorithm always reaches a strict local minimum, though possibly a different one for different initializations of U.
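One way to see this limit directly from Eq. (16.8), assuming each feature vector has a unique nearest prototype: as κ → 1⁺ the exponent 2/(κ − 1) grows without bound, so the term with the smallest distance dominates the sum and

\lim_{\kappa \to 1^{+}} u_{nk} =
\begin{cases}
1, & \text{if } d_{nk} < d_{jk} \text{ for all } j \neq n, \\
0, & \text{otherwise},
\end{cases}

that is, every feature vector is assigned entirely to its nearest class prototype, which is the hard C-means assignment rule.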
16.5 Establishing the Spatial Continuity
First, we consider the spatially isolated points in the resulting image obtained from clustering. Such points are likely to be obtained when the clustering terminates, for the following reason. Let γ_n be the label associated with the class prototype v_n. Then the image position of a feature vector f_k inherits the class label γ_n if d(f_k, v_n) is the smallest among all N_c classes. However, because the clustering process does not "know" from which image location the feature vectors come, the spatial continuity (i.e., in the label image, which is Γ) is not evident, although in practice most pixels will cluster together even spatially (in the image). At the coarsest resolution