solving for the equilibrium weight value, show that the weights converge in the asymptotic case to the conditional probability of equation 4.11.

First, we need to re-express the relevant variables in probabilistic terms. Thus, the activations of the sending and receiving units will be assumed to represent probabilities that the corresponding units are active, which is consistent with the analysis of the point neuron function in terms of Bayesian hypothesis testing presented in chapter 2. We use the expression P(y_j | t) to represent the probability that the receiving unit y_j is active given that some particular input pattern t was presented, and P(x_i | t) represents the corresponding thing for the sending unit x_i. Thus, substituting these into equation 4.12, the total weight update computed over all the possible patterns t (and multiplying by the probability that each pattern occurs, P(t)) is:

Δw_ij = ε Σ_t [ P(y_j | t) P(x_i | t) P(t) − P(y_j | t) P(t) w_ij ]    (4.13)

As before, we set Δw_ij to zero to find the equilibrium weight value, and then we solve the resulting equation for the value of w_ij. This results in the following:

w_ij = Σ_t P(y_j | t) P(x_i | t) P(t) / Σ_t P(y_j | t) P(t)    (4.14)

Now, the interesting thing to note here is that the numerator Σ_t P(y_j | t) P(x_i | t) P(t) is actually the definition of the joint probability of the sending and receiving units both being active together across all the patterns t, which is just P(y_j, x_i). Similarly, the denominator Σ_t P(y_j | t) P(t) gives the probability of the receiving unit being active over all the patterns, or P(y_j). Thus, we can rewrite equation 4.14 as:

w_ij = P(y_j, x_i) / P(y_j)    (4.15)

at which point it becomes clear that this fraction of the joint probability over the probability of the receiver is just the definition of the conditional probability of the sender given the receiver, P(x_i | y_j). This is just equation 4.11, which is right where we wanted to end up.

4.5.3 Biological Implementation of CPCA Hebbian Learning

At the beginning of this chapter, we described the biological mechanisms thought to underlie weight changes in the cortex, and showed how these generally support a Hebbian or associative type of learning. However, the CPCA learning rule is slightly more complex than a simple product of sending and receiving unit activations, and thus requires a little further explanation. We will see that we can account for the general characteristics of weight changes as prescribed by the CPCA learning rule (equation 4.12) using the same basic NMDA-mediated LTP/D mechanisms that were described previously. For reference, the CPCA equation is:

Δw_ij = ε y_j (x_i − w_ij)    (4.16)

For our first pass at seeing how the biology could implement this equation, let's assume that the weight is at some middling value (e.g., around .5); we will consider the effects of different weight values in a moment. With this assumption, there are three general categories of weight changes produced by CPCA:

1. When the sending and receiving units are both strongly active (and thus x_i > w_ij), the weight should increase (LTP). We can easily account for this case by the associative nature of NMDA-mediated LTP.

2. When the receiving unit is active, but the sending unit is not (i.e., x_i < w_ij), then LTD will occur. This case can be explained by the NMDA channels being open (as a function of postsynaptic activity unblocking the Mg+), and the small amount of presynaptic activity causing a small but above-zero level of calcium influx, which produces LTD.

3. When the receiving unit is not active, y_j = 0 and equation 4.16 produces no weight change at all. Biologically, without postsynaptic activity the Mg+ block of the NMDA channels remains in place, so neither LTP nor LTD occurs.
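The convergence result derived above can also be checked numerically. Below is a minimal simulation sketch; the two-pattern environment and its probabilities are hypothetical values chosen for illustration, not taken from the text. It runs the CPCA rule of equation 4.16 on stochastic binary activations and compares the learned weight with the equilibrium value predicted by equation 4.14.

```python
import random

def cpca_converges():
    """Simulate CPCA Hebbian learning, dw = eps * y * (x - w), and show the
    weight approaches the equilibrium of equation 4.14, i.e. P(x_i | y_j)."""
    random.seed(0)
    # Hypothetical environment: two input patterns t, each with a probability
    # of occurring P(t), and conditional firing probabilities for the sending
    # unit x_i and receiving unit y_j. (Values are made up for illustration.)
    patterns = [
        {"P": 0.5, "p_x": 0.9, "p_y": 0.8},  # sender and receiver co-occur often
        {"P": 0.5, "p_x": 0.2, "p_y": 0.4},  # they rarely co-occur
    ]
    eps = 0.01   # learning rate (epsilon in equation 4.16)
    w = 0.5      # start at a middling weight value
    for _ in range(200_000):
        # Sample a pattern, then binary activations given that pattern.
        pat = patterns[0] if random.random() < patterns[0]["P"] else patterns[1]
        x = 1.0 if random.random() < pat["p_x"] else 0.0
        y = 1.0 if random.random() < pat["p_y"] else 0.0
        w += eps * y * (x - w)  # CPCA rule, equation 4.16
    # Analytic equilibrium from equation 4.14:
    # sum_t P(y|t) P(x|t) P(t)  /  sum_t P(y|t) P(t)
    num = sum(p["P"] * p["p_y"] * p["p_x"] for p in patterns)
    den = sum(p["P"] * p["p_y"] for p in patterns)
    return w, num / den
```

With these hypothetical probabilities the predicted equilibrium is 0.4/0.6 ≈ 0.67, and the simulated weight fluctuates around that value: the learned weight is the conditional probability that the sender is active given that the receiver is, exactly as the derivation claims.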