That batch algorithm requires that the whole training set $A$ be available. The contribution of a single observation $z_i$ to the update of the parameter $w_c$ is $2K^T(\delta(c,\chi(z_i)))(z_i - w_c^{t-1})$. Alternatively, one may use the stochastic gradient algorithm, which recomputes the reference set at each presentation of an observation $z_i$. That adaptive version is closer to training processes in natural systems; it is the initial version that was suggested by Kohonen. It differs from the batch version presented above in two respects: first, the data flow is used instead of the stored data; second, the allocation function $\chi$ is not the same: Kohonen's algorithm uses the same allocation function as $k$-means, $\chi(z_i) = \arg\min_c \|z_i - w_c\|^2$.
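This $k$-means-style allocation rule can be sketched compactly; the helper below is a hypothetical NumPy illustration (the function name `allocate` is not from the text):

```python
import numpy as np

def allocate(z, W):
    """Allocation function chi(z): index of the reference vector
    closest to observation z in squared Euclidean distance,
    exactly as in k-means."""
    # W holds one reference vector per row; z is a single observation.
    d2 = np.sum((W - z) ** 2, axis=1)
    return int(np.argmin(d2))
```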
Therefore, at each presentation of an observation, the new reference vectors are computed for all the neurons of the map $C$, depending on the selected neuron:
$$w_c^t = w_c^{t-1} - \mu_t K^T(\delta(c,\chi^t(z_i)))(w_c^{t-1} - z_i).$$
Thus, Kohonen's algorithm may be summarized as follows:
Kohonen's Algorithm
1. Initialization:
   - select the structure and size of the map;
   - choose the initial position of the $p$ reference vectors (usually, this choice is random);
   - choose $T_{\max}$, $T_{\min}$ and the maximum number of iterations $N_{iter}$;
   - initialize $t = 0$.
2. Iteration $t$: with the reference vector set $W^{t-1}$, as computed at the previous iteration:
   - take the current observation $z_i$ (or select randomly an observation from the training set);
   - compute the new value of $T$ according to the cooling schedule:
     $$T = T_{\max}\left(\frac{T_{\min}}{T_{\max}}\right)^{t/(N_{iter}-1)}.$$
   For that value of $T$, the following two phases must be performed:
   - Allocation phase: $W^{t-1}$ being known, neuron $\chi^t(z_i)$ is assigned to the current observation $z_i$ by the allocation function $\chi(z) = \arg\min_r \|z - w_r\|^2$;
   - Minimization phase: the new reference set $W^t$ is computed; the reference vectors are updated according to
     $$w_c^t = w_c^{t-1} - \mu_t K^T(\delta(c,\chi^t(z_i)))(w_c^{t-1} - z_i),$$
     depending on their distance to the neuron that was selected during the allocation phase.
3. Iterate with decreasing temperature $T$, until the maximum number of iterations $N_{iter}$ is reached.
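The steps above can be sketched in code. The following is a minimal illustration under assumed choices (a Gaussian neighborhood kernel $K^T(\delta) = \exp(-\delta/T^2)$, a rectangular map, a constant step $\mu_t$, and illustrative default parameters), not a reference implementation:

```python
import numpy as np

def kohonen_som(data, grid_shape=(5, 5), T_max=3.0, T_min=0.5,
                n_iter=1000, mu=0.1, seed=0):
    """Sketch of Kohonen's stochastic algorithm on a rectangular map.
    Parameter names and defaults are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    p = rows * cols
    # Map coordinates of each neuron, used for the map distance delta.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)],
                      dtype=float)
    # Initialization: reference vectors drawn at random from the data.
    W = data[rng.integers(0, len(data), p)].astype(float)
    for t in range(n_iter):
        # Take a randomly selected observation from the training set.
        z = data[rng.integers(0, len(data))]
        # Cooling schedule: T decreases geometrically from T_max to T_min.
        T = T_max * (T_min / T_max) ** (t / (n_iter - 1))
        # Allocation phase: nearest reference vector, as in k-means.
        chi = int(np.argmin(np.sum((W - z) ** 2, axis=1)))
        # Neighborhood kernel K^T on the squared map distance delta(c, chi)
        # (a Gaussian kernel is assumed here).
        delta = np.sum((coords - coords[chi]) ** 2, axis=1)
        K = np.exp(-delta / T ** 2)
        # Minimization phase: move every reference vector toward z,
        # weighted by its neighborhood relation to the selected neuron.
        W -= mu * K[:, None] * (W - z)
    return W
```

Because each update is a convex combination of the old reference vector and the observation (the step `mu * K` never exceeds `mu < 1`), the reference vectors remain within the range of the data.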