[Figure: two panels over 200 training epochs — left panel: training error (%); right panel: value of h.]
Fig. 6.7 Instability of the MEE back-propagation algorithm, when updating h, in an experiment with the PB12 dataset.
6.1.1.2 Adaptive Learning Rate
Several authors, following different strategies, have shown that, by adapting the value of the learning rate, η, along the learning process, one can get better and faster convergence of a neural network using the MSE risk functional [20, 232, 36, 109, 186, 210, 72, 104, 145, 146].
We now show how η can be adjusted as a function of the error entropy, in a way similar to the MSE case, using the following simple but effective rule: if the error entropy decreases between two consecutive epochs of the training process, the algorithm increases the learning rate parameter; if, on the other hand, the error entropy increases between two consecutive epochs, the algorithm decreases the learning rate parameter and, furthermore, restarts the updating step, i.e., it recovers the previous "good" values of all neural network weights. This simple rule for learning rate updating can be written as [205]:
\[
\eta(m) =
\begin{cases}
u\,\eta(m-1) & \text{if } H(m) < H(m-1) \\
d\,\eta(m-1)\ \wedge\ \text{restart} & \text{if } H(m) \ge H(m-1)
\end{cases},
\quad u > 1,\ 0 < d < 1,
\tag{6.9}
\]
where η(m) and H(m) are, respectively, the learning rate and the error entropy at the mth iteration, and u and d are the increasing and decreasing updating factors.
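Rule (6.9) can be sketched in code as follows. This is a minimal illustration, not the authors' implementation: `entropy_fn` and `grad_fn` are hypothetical stand-ins for the error-entropy evaluation and the MEE back-propagation gradient, and the weights are represented as a flat list of floats.

```python
import copy

def train_with_vlr(weights, entropy_fn, grad_fn, epochs=200,
                   eta=0.01, u=1.2, d=0.2):
    """Sketch of the variable learning rate (VLR) rule (6.9).

    entropy_fn(weights) -> error entropy H for the given weights
    grad_fn(weights)    -> gradient of H w.r.t. the weights
    Both are hypothetical placeholders for the MEE training machinery.
    """
    H_prev = entropy_fn(weights)
    saved = copy.deepcopy(weights)            # last "good" weights
    history = [H_prev]
    for _ in range(epochs):
        # Tentative gradient-descent step with the current learning rate.
        trial = [w - eta * g for w, g in zip(weights, grad_fn(weights))]
        H = entropy_fn(trial)
        if H < H_prev:
            # Entropy decreased: accept the step and grow eta by factor u.
            saved = copy.deepcopy(trial)
            weights = trial
            eta *= u
            H_prev = H
        else:
            # Entropy did not decrease: shrink eta by factor d and
            # restart, recovering the previous "good" weights.
            eta *= d
            weights = copy.deepcopy(saved)
        history.append(H_prev)
    return weights, eta, history
```

With the proposed u = 1.2 and d = 0.2, eta grows geometrically while progress continues and collapses quickly after a failed step, so the recorded entropy sequence never increases.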
A large number of experiments, reported in [205], were performed in order to find adequate values for u and d; based on the respective results, discussed in detail in [205], the following values for u and d were proposed: u = 1.2 and d = 0.2. Figure 6.8 shows the results of one such experiment with a real-world dataset, namely the evolution of the training error and of the error entropy using both the fixed learning rate (FLR) and the variable learning rate (VLR) rule (6.9) with the proposed values of u and d. Clearly, the smallest classification error