compute the corresponding U by computing the distances and, thereafter, applying the
last step of the FCM algorithm.
4.7.3 Gustafson-Kessel Algorithm
In order to detect clusters of different geometrical shapes in one data set, the
standard FCM clustering algorithm is extended by employing an adaptive distance
norm (Gustafson and Kessel, 1979). In this case, each cluster has its own norm-
inducing matrix A_g, which yields the following inner-product norm:
$$ D_{gs\mathbf{A}_g}^2 = (\mathbf{z}_s - \mathbf{v}_g)^T \mathbf{A}_g (\mathbf{z}_s - \mathbf{v}_g), \qquad 1 \le g \le c,\ 1 \le s \le N. \tag{4.24a} $$
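The inner-product norm of Equation (4.24a) can be sketched in a few lines of NumPy; the function name and test values below are illustrative, not from the text:

```python
import numpy as np

def gk_distance_sq(z, v, A):
    """Squared inner-product norm D^2 = (z - v)^T A (z - v), as in Eq. (4.24a)."""
    diff = z - v                    # residual z_s - v_g
    return float(diff @ A @ diff)

z = np.array([2.0, 1.0])
v = np.zeros(2)
A = np.eye(2)                       # with A = I this reduces to the squared
print(gk_distance_sq(z, v, A))      # Euclidean norm used by standard FCM: 5.0
```

Choosing a non-identity A stretches or shrinks the distance along particular directions, which is what lets each cluster match an ellipsoidal shape.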
The matrices A_g are used as optimization variables in the c-means functional, thus
allowing each cluster to adapt the distance norm to the local topological structure
of the data. The objective functional of the Gustafson-Kessel algorithm is defined
by:
$$ J(\mathbf{Z}; \mathbf{U}, \mathbf{V}, \{\mathbf{A}_g\}) = \sum_{g=1}^{c} \sum_{s=1}^{N} (\mu_{gs})^m \, D_{gs\mathbf{A}_g}^2. \tag{4.24b} $$
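A minimal sketch of evaluating the objective functional (4.24b), assuming data Z of shape (N, n), memberships U of shape (c, N), prototypes V of shape (c, n), and one norm matrix per cluster (all names are illustrative):

```python
import numpy as np

def gk_objective(Z, U, V, A_list, m=2.0):
    """Gustafson-Kessel objective: J = sum_g sum_s (mu_gs)^m D^2_{gsA_g}."""
    J = 0.0
    for g in range(U.shape[0]):
        diffs = Z - V[g]                                        # z_s - v_g for all s
        d2 = np.einsum('si,ij,sj->s', diffs, A_list[g], diffs)  # D^2 for all s
        J += np.sum(U[g] ** m * d2)                             # weight by (mu_gs)^m
    return J

Z = np.array([[0.0], [2.0]])
U = np.full((2, 2), 0.5)            # both points belong half to each cluster
V = np.array([[0.0], [2.0]])
A_list = [np.eye(1), np.eye(1)]
print(gk_objective(Z, U, V, A_list))  # 2.0
```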
This objective function cannot be directly minimized with respect to A_g, since it is
linear in A_g. To obtain a feasible solution, A_g must be constrained in some way. The
usual way of accomplishing this is to constrain the determinant of A_g:
$$ \det(\mathbf{A}_g) = \rho_g, \qquad \rho_g > 0, \ \forall g. \tag{4.24c} $$
Allowing the matrix A_g to vary, with its determinant fixed, corresponds to
optimizing the cluster's shape while its volume remains constant. By using the
Lagrange-multiplier method, the following expression for A_g is obtained
(Gustafson and Kessel, 1979):
$$ \mathbf{A}_g = \left[ \rho_g \det(\mathbf{F}_g) \right]^{1/n} \mathbf{F}_g^{-1}, \tag{4.24d} $$
where F_g is the fuzzy covariance matrix of the g-th cluster, given by
$$ \mathbf{F}_g = \frac{\displaystyle\sum_{s=1}^{N} (\mu_{gs})^m (\mathbf{z}_s - \mathbf{v}_g)(\mathbf{z}_s - \mathbf{v}_g)^T}{\displaystyle\sum_{s=1}^{N} (\mu_{gs})^m}, \qquad 1 \le g \le c. \tag{4.24e} $$
Note that the substitution of Equations (4.24d) and (4.24e) into (4.24a) gives a
generalized squared Mahalanobis distance norm, where the covariance is weighted
by the membership degrees in U. The Gustafson-Kessel algorithm is given in the
next section. It is computationally more expensive than