The weight $w_{ij}$ connecting the output of the perceptron $i$ and the input of the perceptron $j$ will increase by an amount $\Delta w_{ij} = \eta\, x_j y_i$, where $x_j$ is the output of the perceptron $j$, $y_i$ the output of the perceptron $i$, and $\eta$ is a measure controlling the learning step size (Figure 3.14). Accordingly, the Hebbian update of the weights, or the Hebbian learning rule, can be expressed as

$$w_{ij}(t+1) = w_{ij}(t) + \eta\, x_j(t)\, y_i(t).$$
Figure 3.14. Interconnected perceptrons (perceptron $i$ with output $y_i$, weight $w_{ij}$, perceptron $j$ with output $x_j$)
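The scalar update rule above can be sketched in a few lines of Python; the function name `hebbian_update` and the sample values are illustrative, not from the source.

```python
def hebbian_update(w_ij, x_j, y_i, eta=0.1):
    """One Hebbian step for a single connection:
    w_ij(t+1) = w_ij(t) + eta * x_j * y_i."""
    return w_ij + eta * x_j * y_i

# The weight grows when the two connected perceptrons are active together.
w = 0.5
w = hebbian_update(w, x_j=1.0, y_i=0.8, eta=0.1)  # w is now 0.58
```

Note that the update is purely local: it uses only the activities of the two perceptrons the weight connects, with $\eta$ scaling the step size.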
Figure 3.15. Multiple interconnected perceptron (inputs $x_1, x_2, \ldots, x_n$, weights $w_1, w_2, \ldots, w_n$, output $y$)
The rule can be generalized and applied to a multiple-input perceptron as

$$\mathbf{w}(t+1) = \mathbf{w}(t) + \eta\, \mathbf{x}\, \mathbf{w}^T\mathbf{x},$$

where the relation

$$y = \sum_{j=1}^{n} w_j x_j = \mathbf{w}^T\mathbf{x} = \mathbf{x}^T\mathbf{w}$$

is taken into account (Figure 3.15).
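A minimal NumPy sketch of the generalized rule, assuming the vector form $\mathbf{w}(t+1) = \mathbf{w}(t) + \eta\, y\, \mathbf{x}$ with $y = \mathbf{w}^T\mathbf{x}$; the function name `hebbian_step` and the sample vectors are illustrative.

```python
import numpy as np

def hebbian_step(w, x, eta=0.1):
    """One Hebbian update for a multiple-input perceptron:
    compute y = w^T x, then return w + eta * y * x along with y."""
    y = float(np.dot(w, x))   # perceptron output y = w^T x
    return w + eta * y * x, y

w = np.array([0.2, -0.1, 0.4])
x = np.array([1.0, 0.5, -1.0])
w_new, y = hebbian_step(w, x)   # y = -0.25, w_new = [0.175, -0.1125, 0.425]
```

Applying this step repeatedly to a fixed input multiplies $\mathbf{w}^T\mathbf{x}$ by $(1 + \eta\, \mathbf{x}^T\mathbf{x})$ each time, so the output and weights grow geometrically.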
Nevertheless, the direct application of the Hebbian rule bears the risk of an
endless increase of weight values, which could saturate the output neurons. As a