if $\partial E(t-1)/\partial \Re(w) \times \partial E(t)/\partial \Re(w) < 0$ and $E(t) > E(t-1)$,
then $\Re(w(t+1)) = \Re(w(t)) - \Delta \Re(w(t-1))$ (3.30)
if $\partial E(t-1)/\partial \Im(w) \times \partial E(t)/\partial \Im(w) < 0$ and $E(t) > E(t-1)$,
then $\Im(w(t+1)) = \Im(w(t)) - \Delta \Im(w(t-1))$ (3.31)
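As a purely illustrative instance of this rule (the numbers are assumed here, not taken from the text), suppose that for the real part of a single weight $\partial E(t-1)/\partial \Re(w) = +0.8$ and $\partial E(t)/\partial \Re(w) = -0.3$, so the product of the two derivatives is negative, and that the error has risen from $E(t-1) = 0.42$ to $E(t) = 0.45$. Condition (3.30) is then satisfied and the previous real-part update is undone, $\Re(w(t+1)) = \Re(w(t)) - \Delta \Re(w(t-1))$; had the error not increased, the weight would have been left unchanged for this epoch.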
The initialization parameters are the same as in the conventional C-RPROP; the only change is in the case where the partial derivative changes its sign. The resulting algorithm shows better performance than C-RPROP. Let $w_{lm}$ be the weight from the $l$-th neuron in one layer to the $m$-th neuron in the next layer of a neural network. Let $t = 1$ and $0 < \mu^{-} < 1 < \mu^{+} < 2$. The pseudo code for this C-iRPROP algorithm is given below:
$\forall\, l, m:\ \Delta_{lm}^{\Re}(t) = \Delta_{lm}^{\Im}(t) = \Delta_0$,
$\partial E(t-1)/\partial \Re(w_{lm}) = \partial E(t-1)/\partial \Im(w_{lm}) = 0$, and
$\Delta^{\Re}(\max) = \Delta^{\Im}(\max) = \Delta_{\max}$ and $\Delta^{\Re}(\min) = \Delta^{\Im}(\min) = \Delta_{\min}$
Repeat {
calculate $\partial E(t)/\partial \Re(w_{lm})$ and $\partial E(t)/\partial \Im(w_{lm})$ for all weights and biases
For real part of weight:
if $\partial E(t-1)/\partial \Re(w_{lm}) \times \partial E(t)/\partial \Re(w_{lm}) > 0$ then
{ $\Delta_{lm}^{\Re}(t) = \min(\Delta_{lm}^{\Re}(t-1) \times \mu^{+},\ \Delta^{\Re}(\max))$
  $\Delta \Re(w_{lm}(t)) = -\mathrm{sign}(\partial E(t)/\partial \Re(w_{lm})) \times \Delta_{lm}^{\Re}(t)$
  $\Re(w_{lm}(t+1)) = \Re(w_{lm}(t)) + \Delta \Re(w_{lm}(t))$ }
if $\partial E(t-1)/\partial \Re(w_{lm}) \times \partial E(t)/\partial \Re(w_{lm}) < 0$ then
{ $\Delta_{lm}^{\Re}(t) = \max(\Delta_{lm}^{\Re}(t-1) \times \mu^{-},\ \Delta^{\Re}(\min))$
  if $E(t) > E(t-1)$ then $\Re(w_{lm}(t+1)) = \Re(w_{lm}(t)) - \Delta \Re(w_{lm}(t-1))$
  and $\partial E(t)/\partial \Re(w_{lm}) = 0$ }
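A minimal NumPy sketch of this update is given below. It is an illustration, not the book's implementation: the class name CiRPROP and its method names are hypothetical, the numeric defaults ($\mu^{+}=1.2$, $\mu^{-}=0.5$, $\Delta_0=0.1$, $\Delta_{\max}=50$, $\Delta_{\min}=10^{-6}$) are typical RPROP choices rather than values given in the text, and the imaginary part is assumed to be handled symmetrically to the real part, as the paired rules (3.30) and (3.31) suggest. The error $E(t)$ and the partial derivatives with respect to $\Re(w_{lm})$ and $\Im(w_{lm})$ are assumed to come from an external backpropagation routine.

import numpy as np

class CiRPROP:
    """Sketch of the C-iRPROP update for one complex weight array (assumed names)."""

    def __init__(self, shape, delta_0=0.1, mu_plus=1.2, mu_minus=0.5,
                 delta_max=50.0, delta_min=1e-6):
        # mu_minus and mu_plus obey 0 < mu_minus < 1 < mu_plus, as stated in the text;
        # the numeric defaults here are common RPROP choices, not values from the book.
        self.mu_plus, self.mu_minus = mu_plus, mu_minus
        self.delta_max, self.delta_min = delta_max, delta_min
        # Separate step sizes, previous gradients and previous updates for the
        # real and imaginary parts of every weight w_lm.
        self.state = {part: {"delta": np.full(shape, delta_0),
                             "grad_prev": np.zeros(shape),
                             "step_prev": np.zeros(shape)}
                      for part in ("re", "im")}
        self.err_prev = np.inf

    def _update_part(self, w_part, grad, st, err):
        grad = grad.copy()
        change = st["grad_prev"] * grad   # sign of the product of successive gradients
        step = np.zeros_like(w_part)

        # Same sign: grow the step size (up to delta_max).
        pos = change > 0
        st["delta"][pos] = np.minimum(st["delta"][pos] * self.mu_plus, self.delta_max)

        # Sign change: shrink the step size (down to delta_min), revert the previous
        # update only if the error increased, and zero the stored gradient so the
        # next epoch falls into the ordinary branch.
        neg = change < 0
        st["delta"][neg] = np.maximum(st["delta"][neg] * self.mu_minus, self.delta_min)
        if err > self.err_prev:
            step[neg] = -st["step_prev"][neg]
        grad[neg] = 0.0

        # Same sign or zero product: ordinary step against the gradient direction.
        ordinary = ~neg
        step[ordinary] = -np.sign(grad[ordinary]) * st["delta"][ordinary]

        st["grad_prev"], st["step_prev"] = grad, step
        return w_part + step

    def step(self, W, grad_re, grad_im, err):
        """One epoch: W is complex; grad_re, grad_im are dE/dRe(w_lm), dE/dIm(w_lm)."""
        new_re = self._update_part(W.real, grad_re, self.state["re"], err)
        new_im = self._update_part(W.imag, grad_im, self.state["im"], err)
        self.err_prev = err
        return new_re + 1j * new_im

A hypothetical training loop would create opt = CiRPROP(W.shape) once and then call W = opt.step(W, grad_re, grad_im, err) once per epoch, after the error and its real- and imaginary-part derivatives have been computed for all weights and biases.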