The update equations for the parameters in the output layer are as follows:
$$\Delta w_{mn} = \eta\,\overline{Y}_{m}\left( \Re(e_n)\,f'(\Re(V_n)) + j\,\Im(e_n)\,f'(\Im(V_n)) \right) \qquad (4.17)$$

$$\Delta w_{0n} = \eta\,\overline{z}_{0}\left( \Re(e_n)\,f'(\Re(V_n)) + j\,\Im(e_n)\,f'(\Im(V_n)) \right) \qquad (4.18)$$
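As a concrete illustration of the output-layer updates (4.17)–(4.18), the following is a minimal NumPy sketch. It assumes the split activation convention, i.e. the real sigmoid f applied separately to the real and imaginary parts of the net potential; the function names and the vectorised shapes are our own illustration, not the book's notation.

```python
import numpy as np

def sigmoid(x):
    """Real-valued logistic function, applied channel-wise."""
    return 1.0 / (1.0 + np.exp(-x))

def dsigmoid(x):
    """Derivative f'(x) of the logistic function."""
    s = sigmoid(x)
    return s * (1.0 - s)

def output_layer_update(Y, V, e, eta):
    """Weight change Delta w[m, n] as in eq. (4.17): the real and
    imaginary error channels are passed through f' separately and
    then recombined into one complex delta per output neuron."""
    # delta_n = Re(e_n) f'(Re(V_n)) + j Im(e_n) f'(Im(V_n))
    delta = e.real * dsigmoid(V.real) + 1j * e.imag * dsigmoid(V.imag)
    # outer product with the conjugated hidden outputs Y_m
    return eta * np.outer(np.conj(Y), delta)
```

The bias update (4.18) has the same form with the constant bias input z_0 in place of Y_m.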
The update equations for the parameters between the input and hidden layers are as follows:
$$\Delta w_{lm} = \eta\,\overline{\lambda}_{m}\,\overline{z}_{l}\left( \zeta_m + j\,\psi_m \right) \qquad (4.19)$$

$$\Delta w^{RB}_{lm} = \eta\left( \zeta_m\,\Re(\gamma_m) + \psi_m\,\Im(\gamma_m) \right) 2\left( z_l - w^{RB}_{lm} \right) \exp\!\left( -\left\| Z - W^{RB}_{m} \right\|^{2} \right) \qquad (4.20)$$

$$\Delta \lambda_m = \eta\,\overline{W_m^{T} Z}\left( \zeta_m + j\,\psi_m \right) \qquad (4.21)$$

$$\Delta \gamma_m = \eta \exp\!\left( -\left\| Z - W^{RB}_{m} \right\|^{2} \right) \left( \zeta_m + j\,\psi_m \right) \qquad (4.22)$$

$$\Delta w_{0m} = \eta\,\overline{z}_{0}\left( \zeta_m + j\,\psi_m \right) \qquad (4.23)$$

where

$$\zeta_m = f'(\Re(V_m)) \sum_{n=1}^{N} \left\{ \Re(e_n)\,f'(\Re(V_n))\,\Re(w_{mn}) + \Im(e_n)\,f'(\Im(V_n))\,\Im(w_{mn}) \right\}$$

and

$$\psi_m = f'(\Im(V_m)) \sum_{n=1}^{N} \left\{ \Im(e_n)\,f'(\Im(V_n))\,\Re(w_{mn}) - \Re(e_n)\,f'(\Re(V_n))\,\Im(w_{mn}) \right\}$$
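The two backpropagated hidden-layer error terms defined above (written here as `zeta` and `psi`) can be sketched as follows. This is our own vectorised NumPy rendering under the split-sigmoid assumption; the variable names are illustrative, and `w[m, n]` denotes the weight from hidden neuron m to output neuron n.

```python
import numpy as np

def sigmoid(x):
    """Real-valued logistic function, applied channel-wise."""
    return 1.0 / (1.0 + np.exp(-x))

def dsigmoid(x):
    """Derivative f'(x) of the logistic function."""
    s = sigmoid(x)
    return s * (1.0 - s)

def hidden_deltas(V_hidden, V_out, e, w):
    """Backpropagated error terms for each hidden neuron m.
    The matrix products over axis n realise the sum_{n=1}^{N}
    in the two defining equations."""
    # per-output error channels, as in eqs. (4.17)-(4.18)
    dR = e.real * dsigmoid(V_out.real)   # Re(e_n) f'(Re(V_n))
    dI = e.imag * dsigmoid(V_out.imag)   # Im(e_n) f'(Im(V_n))
    zeta = dsigmoid(V_hidden.real) * (w.real @ dR + w.imag @ dI)
    psi  = dsigmoid(V_hidden.imag) * (w.real @ dI - w.imag @ dR)
    return zeta, psi
```

The sign difference between the two expressions reflects the conjugation of the outgoing weight w_mn when the complex error is propagated backwards.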
4.3.6 Model-3
Over the years, a substantial body of evidence has grown to support the presence of nonlinear aggregation of synaptic inputs in neuron cells [ 1 , 2 ]. This section exploits a novel fuzzy-oriented averaging concept, which appeared in [ 35 ], to design a very general nonlinear aggregation function whose special cases include various existing neural aggregation functions. In general, an aggregation operation can be viewed as an averaging operation [ 38 ]. A brief survey of the history of averaging operations brings out the fact that in 1930 Kolmogoroff [ 39 ] and Nagumo [ 40 ] identified the family of quasi-arithmetic means as the most general family of averaging operations. This family is defined as follows:
$$M_{\varphi}(x_1, x_2, \ldots, x_n;\ \omega_1, \omega_2, \ldots, \omega_n) = \varphi^{-1}\!\left( \sum_{k=1}^{n} \omega_k\,\varphi(x_k) \right) \qquad (4.24)$$

where $\varphi$ is a continuous, strictly monotonic generator function and the weights $\omega_k$ sum to one.
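To make the family (4.24) concrete, the short sketch below (our own illustration; the function names are not from the text) shows how different choices of the generator function recover familiar means as special cases.

```python
import math

def quasi_arithmetic_mean(xs, weights, phi, phi_inv):
    """Weighted quasi-arithmetic (Kolmogorov-Nagumo) mean, eq. (4.24):
    M = phi^{-1}( sum_k w_k * phi(x_k) ), with weights summing to 1."""
    return phi_inv(sum(w * phi(x) for w, x in zip(weights, xs)))

xs = [2.0, 8.0]
w = [0.5, 0.5]

# phi(x) = x     -> arithmetic mean
am = quasi_arithmetic_mean(xs, w, lambda x: x, lambda y: y)        # 5.0
# phi(x) = ln x  -> geometric mean
gm = quasi_arithmetic_mean(xs, w, math.log, math.exp)              # 4.0
# phi(x) = 1/x   -> harmonic mean
hm = quasi_arithmetic_mean(xs, w, lambda x: 1 / x, lambda y: 1 / y)  # 3.2
```

Each choice of generator yields a different aggregation behaviour, which is what makes the family attractive as a general neural aggregation function.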
 
 