cluster at the presentation of the $n$th training pattern during the $k$th training session. At the beginning, $\mathbf{m}_j$, $\mathbf{S}_j$, and $n_j$ are appropriately initialized. For each pattern presentation $\mathbf{x}(n)$, $n = 1, 2, \dots, M$, during the $k$th session, the winner neuron is found:
$$
\left\| \mathbf{x}(n) - \mathbf{w}_c^{(k)}(n) \right\| \;=\; \min_{j = 1, \dots, N^{(k)}(n)} \left\| \mathbf{x}(n) - \mathbf{w}_j^{(k)}(n) \right\|
\tag{11.94}
$$
In Equation 11.94, $N^{(k)}(n)$ is the number of output neurons when the $n$th training vector is presented to the input of the SOM during the $k$th training session. For the sake of simplicity, let us first describe the test for detecting outliers in the case of the first training session ($k = 1$), i.e., in case (i) of Algorithm 2.$^{70,71}$ More specifically, we decide that $\mathbf{x}(n)$ is not an outlier and, therefore, can be merged with the patterns already represented by the winner neuron if
$$
\frac{z_c^{(k)}(n-1) - p}{p}\, \mathbf{d}^T \left[ \mathbf{S}_c^{(k)}(n-1) \right]^{-1} \mathbf{d} \;\le\; F_{p,\, z_c^{(k)}(n-1) - p;\, 0.05}
\tag{11.95}
$$

where $F_{p,\, z_c^{(k)}(n-1) - p;\, 0.05}$ denotes the upper 5% level of significance for the $F$-distribution with $p$ and $z_c^{(k)}(n-1) - p$ degrees of freedom, and $\mathbf{d}$ is given by
$$
\mathbf{d} = \sqrt{\frac{z_c^{(k)}(n-1)}{z_c^{(k)}(n-1) + 1}}\, \left[ \mathbf{x}(n) - \mathbf{m}_c^{(k)}(n-1) \right].
\tag{11.96}
$$
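The winner search of Equation 11.94 and the merge test of Equations 11.95 and 11.96 can be summarized in code. The sketch below is a minimal NumPy/SciPy illustration of the reconstruction above, not the chapter's own implementation; the function names, the handling of the degenerate case $z_c^{(k)}(n-1) \le p$, and the fixed 5% level are assumptions made for the example.

```python
import numpy as np
from scipy.stats import f as f_dist

def winner_neuron(x, W):
    """Equation 11.94: index of the weight vector closest to pattern x.
    W is an (N, p) array whose rows are the current weight vectors w_j."""
    return int(np.argmin(np.linalg.norm(W - x, axis=1)))

def merge_test(x, m_c, S_c, z_c, level=0.05):
    """Equations 11.95-11.96: decide whether x can be merged with the
    winner's cluster (True) or should be treated as an outlier (False).
    m_c: cluster sample mean, S_c: cluster sample dispersion matrix,
    z_c: number of patterns currently assigned to the cluster."""
    p = x.size
    if z_c <= p:
        # The test cannot be applied when z_c < p (merge unconditionally);
        # z_c == p is also skipped here because the F-distribution would
        # have zero denominator degrees of freedom.
        return True
    d = np.sqrt(z_c / (z_c + 1.0)) * (x - m_c)                  # Equation 11.96
    stat = (z_c - p) / p * float(d @ np.linalg.solve(S_c, d))   # Equation 11.95
    return stat <= f_dist.ppf(1.0 - level, p, z_c - p)          # upper 5% point of F
```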
It should be noted that the test (Equation 11.95) cannot be applied when $z_c^{(k)}(n-1) < p$. Therefore, the input pattern $\mathbf{x}(n)$ is unconditionally merged with the cluster of the training vectors represented by $\mathbf{w}_c^{(k)}(n)$. The same analysis is also applied to case (ii) outlined in Algorithm 2 (i.e., when $c^{(k)}(n) = c^{(k-1)}(n)$, with $c^{(k)}(n)$ denoting the index of the winner neuron at the presentation of the $n$th pattern during the $k$th session, $k \ge 2$).
If Equation 11.95 is satisfied, the winner vector is updated as the SOM suggests:$^{6}$
$$
\mathbf{w}_c^{(k)}(n+1) = \mathbf{w}_c^{(k)}(n) + \alpha(k) \left[ \mathbf{x}(n) - \mathbf{w}_c^{(k)}(n) \right].
\tag{11.97}
$$
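In code, the update of Equation 11.97 is a single line; the sketch below assumes `w_c` and `x` are NumPy vectors and that `alpha_k` holds the learning rate $\alpha(k)$ of the current session (the names are illustrative).

```python
def update_winner(w_c, x, alpha_k):
    """Equation 11.97: move the winner weight vector toward the accepted pattern."""
    return w_c + alpha_k * (x - w_c)
```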
Furthermore, the number of patterns, the sample mean, and the sample dispersion matrix of the cluster associated with either the winner $c^{(k)}(n)$ in case (i) or the previous winner $c^{(k-1)}(n)$ in case (ii) are updated. Next, we consider case (iii) of Algorithm 2. It refers to the remaining patterns of a cluster that has been modified due to the removal of another pattern. Because $c^{(k)}(n) = c^{(k-1)}(n)$, for a moment we exclude the pattern $\mathbf{x}(n)$ from the cluster of patterns represented by the winner, and we verify whether its inclusion in this cluster is still valid by applying a test similar to Equation 11.95. For the remaining neurons, all the corresponding parameters are left intact. In the following, we consider what happens when $\mathbf{x}(n)$ is found to be an outlier. It is reasonable then to examine whether the cluster represented by the winner neuron can be split into two subclusters.
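The cluster statistics mentioned above (pattern count, sample mean, and sample dispersion matrix) can be maintained incrementally as each accepted pattern is merged. The rank-one recursion below is a standard way to do this and is offered only as an illustrative sketch; the chapter's own update equations, given earlier in the section, take precedence if they differ.

```python
import numpy as np

def merge_pattern(x, m, S, z):
    """Incrementally update a cluster after merging pattern x.
    z: current number of patterns, m: sample mean,
    S: dispersion (scatter) matrix, S = sum_i (x_i - m)(x_i - m)^T."""
    z_new = z + 1
    delta = x - m
    m_new = m + delta / z_new                 # updated sample mean
    S_new = S + np.outer(delta, x - m_new)    # exact rank-one update of the scatter matrix
    return m_new, S_new, z_new
```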