Digital Signal Processing Reference
In-Depth Information
P ∈ F_n(C) represents a minimum of the function J(·, L) only if

$$I_j = \emptyset \;\Rightarrow\; u_{ij} = \frac{u_C(x_j)}{\displaystyle\sum_{k=1}^{n} \frac{d^2(x_j, L_i)}{d^2(x_j, L_k)}}, \quad \forall\, 1 \le i \le n;\ 1 \le j \le p \qquad (7.34)$$
n
d
2
(
x
j
,
L
i
)
d
2
(
x
j
,
L
k
)
k=1
and

$$I_j \neq \emptyset \;\Rightarrow\; u_{ij} = 0, \quad \forall\, i \notin I_j \qquad (7.35)$$

and, arbitrarily, $\sum_{i \in I_j} u_{ij} = u_C(x_j)$.
Theorem 7.2: If L ∈ R^{sn} is a local minimum of the function J(P, ·), then L_i is the cluster center (mean vector) of the fuzzy class A_i for every i = 1, ..., n:

$$L_i = \frac{\displaystyle\sum_{j=1}^{p} u_{ij}^2\, x_j}{\displaystyle\sum_{j=1}^{p} u_{ij}^2} \qquad (7.36)$$
The alternating optimization (AO) technique is based on the Picard
iteration of equations (7.34), (7.35), and (7.36).
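As a concrete illustration, the Picard iteration of the two update rules can be sketched as follows. This is a minimal sketch, not the book's implementation: it assumes u_C(x_j) = 1 for every point (so the memberships of each point sum to one), squared Euclidean distance for d², and prototypes initialized from randomly chosen data points; it also sidesteps the singular case I_j ≠ ∅ of (7.35) by flooring the distances. All names and defaults are ours.

```python
import numpy as np

def fcm_ao(X, n_clusters, n_iter=300, tol=1e-6, seed=0):
    """Alternating-optimization (Picard) iteration for fuzzy c-means, m = 2.

    X : (p, s) array of data points x_j.
    Returns (U, L): U is the (n, p) fuzzy partition, L the (n, s) prototypes.
    """
    rng = np.random.default_rng(seed)
    p, s = X.shape
    # initialize prototypes with distinct random data points
    L = X[rng.choice(p, size=n_clusters, replace=False)]
    for _ in range(n_iter):
        # squared distances d^2(x_j, L_i), shape (n, p)
        d2 = ((X[None, :, :] - L[:, None, :]) ** 2).sum(axis=2)
        d2 = np.maximum(d2, 1e-12)  # avoid division by zero (I_j != empty)
        # membership update, eq. (7.34) with u_C(x_j) = 1:
        #   u_ij = 1 / sum_k [ d2(x_j, L_i) / d2(x_j, L_k) ]
        U = 1.0 / (d2 * (1.0 / d2).sum(axis=0, keepdims=True))
        # prototype update, eq. (7.36): weighted means with weights u_ij^2
        W = U ** 2
        L_new = (W @ X) / W.sum(axis=1, keepdims=True)
        if np.abs(L_new - L).max() < tol:
            L = L_new
            break
        L = L_new
    return U, L
```

The two updates are coupled: each is optimal given the other fixed, which is exactly why the alternating scheme monotonically decreases J until it reaches a fixed point.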
It is worth mentioning that a more general objective function can be
considered:
$$J_m(P, L) = \sum_{i=1}^{n} \sum_{j=1}^{p} u_{ij}^m\, d^2(x_j, L_i) \qquad (7.37)$$

with m > 1 being a weighting exponent, sometimes known as a fuzzifier, and d the norm-induced distance.
Similar to the case m = 2 shown in equation (7.28), we obtain two solutions for the optimization problem, one for the prototypes and one for the fuzzy partition. Since the parameter m can take infinitely many values, an infinite family of fuzzy clustering algorithms is obtained. In the case m → 1, the fuzzy n-means algorithm converges to a hard n-means solution. As m becomes larger, more data with small degrees of membership are neglected, and thus more noise is eliminated.
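The limiting behavior can be seen numerically from the standard general-m membership update, u_ij = 1 / Σ_k (d²(x_j, L_i)/d²(x_j, L_k))^{1/(m−1)}, which reduces to (7.34) for m = 2 with u_C(x_j) = 1. The following sketch (our illustration, with hypothetical names and distances) evaluates it for one point at several values of m:

```python
import numpy as np

def memberships(d2, m):
    """General-m fuzzy membership of one point given squared distances d2
    to the n prototypes, with fuzzifier m > 1:
        u_i = 1 / sum_k (d2_i / d2_k)^(1/(m-1))
    """
    d2 = np.asarray(d2, dtype=float)
    ratios = (d2[:, None] / d2[None, :]) ** (1.0 / (m - 1.0))
    return 1.0 / ratios.sum(axis=1)

# a point closest to prototype 1 (illustrative distances)
d2 = np.array([1.0, 4.0, 9.0])
for m in (1.1, 2.0, 10.0):
    print(m, memberships(d2, m).round(3))
```

As m approaches 1 the membership vector becomes nearly crisp (close to (1, 0, 0) here), while for large m it flattens toward 1/n for every prototype, so each point's weight u_ij^m in J_m shrinks and loosely assigned points contribute less.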