• If d_j ≤ H(t), assign v to the j-th cluster, and update the weight vector
according to the reinforced learning rule:

    w_j(t + 1) = w_j(t) + α(t)[v − w_j(t)]          (3.3)

where H(t) is a hierarchy control function, and α(t) is the learning rate.
• Alternatively, if d_j > H(t), spawn a new node from w_j at the position v.
Step 4. Update network parameters
• Decay α(t) with time
• Decay H(t) monotonically, controlling the leaves of the tree
Step 5. Continue from step 2 until either
• There is no significant change in the SOTM
• All nodes are allocated AND there is no significant change in the SOTM
• A maximum number of epochs is reached
Scanning the input data in a random manner is essential for convergence of the
algorithm [70, 71]. Suppose instead that the data were scanned in such a way that
the i-th component of the feature vector v increased monotonically; then the i-th
component of the weight vector w_j would also increase monotonically (according
to step 3), and all of the nodes would end up with a monotonically increasing
component in the i-th position of the weight vector.
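The competitive step and node-spawning behavior described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name `sotm_step`, the multiplicative decay rates, and the toy Gaussian data are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sotm_step(v, weights, parents, H_t, alpha_t):
    """One SOTM iteration (steps 2-3): find the winning node for input v,
    then either update its weight (d_j <= H(t)) or spawn a child node at v."""
    d = np.linalg.norm(weights - v, axis=1)   # distance from v to every node
    j = int(np.argmin(d))                     # winning (closest) node j
    if d[j] <= H_t:
        # reinforced learning rule, Eq. (3.3): move the winner toward v
        weights[j] += alpha_t * (v - weights[j])
    else:
        # no significant similarity: spawn a new leaf node at position v
        weights = np.vstack([weights, v])
        parents.append(j)                     # new node is a child of node j
    return weights, parents

# Toy run: 2-D inputs scanned in RANDOM order (essential for convergence).
data = rng.normal(size=(200, 2))
weights = data[:1].copy()                     # root node seeded from the data
parents = [-1]                                # root has no parent
H_t, alpha_t = 1.0, 0.1                       # illustrative initial values
for v in rng.permutation(data):
    weights, parents = sotm_step(v, weights, parents, H_t, alpha_t)
    H_t *= 0.99                               # decay H(t) monotonically (step 4)
    alpha_t *= 0.999                          # decay the learning rate
```

Because H(t) shrinks over time, early iterations spawn coarse top-level nodes while later iterations can only refine or attach leaves, which is what produces the tree structure.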
3.2.2.1 Hierarchical Control Function and Learning Parameters
In Step 3, if there is no significant similarity (i.e., d_j > H(t)), the network
determines that it needs to allocate a new node to the network topology. This node
then becomes a child of the node to which it was found to be closest. The hierarchical
control function decays, allowing nodes to be allocated as leaf nodes of their
closest nodes from previous states of the network. Thus the SOTM forms a flexible
tree structure that spreads and twists across the feature space. The decay can be
implemented by linear and exponential functions:
    H(t) = H(0) − (H(0)/ξ) · t          (3.4)

    H(t) = H(0) e^(−t/τ_H)          (3.5)
where the time constant τ_H is bound to the size of the input data T; H(0) is
the initial value; t is the number of iterations; and ξ is the number of iterations over
which the linear version of H(t) would decay to the same level as the exponential
version of H(t).
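The two decay schedules in Eqs. (3.4) and (3.5) can be compared directly. The concrete values of H(0), τ_H, and ξ below are illustrative assumptions, not values from the text:

```python
import math

H0 = 1.0        # H(0): initial hierarchy level (illustrative assumption)
tau_H = 50.0    # time constant tau_H, tied to the input data size (assumption)
xi = 100.0      # iterations for the linear form to decay to zero (assumption)

def H_linear(t):
    """Eq. (3.4): linear decay of the hierarchy control function."""
    return H0 - (H0 / xi) * t

def H_exponential(t):
    """Eq. (3.5): exponential decay with time constant tau_H."""
    return H0 * math.exp(-t / tau_H)

# Both schedules start at H(0) and shrink monotonically with t.
for t in (0, 25, 50, 100):
    print(t, round(H_linear(t), 3), round(H_exponential(t), 3))
```

The linear schedule reaches zero after exactly ξ iterations, whereas the exponential schedule only approaches zero asymptotically, tightening the spawning threshold ever more gradually.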
 