system? Explain this pattern of results. (e) Explain
why both kinds of inhibition are useful for producing a
system that responds in a rapid but controlled way to
excitatory inputs.
Effects of Learning

One of the important things that inhibition must do is to compensate adequately for the changes in weight values that accompany learning. Typically, as units learn, they develop greater levels of variance in the amount of excitatory input received from the input patterns, with some patterns providing strong excitation to a given unit and others producing less. This is a natural result of the specialization of units for representing (detecting) some things and not others. We can test whether the current inhibitory mechanism adequately handles these changes by simulating the effects of learning, giving units excitatory weight values with a higher level of variance.
First, press Defaults to return to the default parameters. Run this case to get a baseline for comparison.
In this case, the network's weights are produced by
generating random numbers with a mean of .25, and a
uniform variance around that mean of .2.
Next set the wt_type field in the control panel to TRAINED instead of the default UNTRAINED.
The weights are then initialized with the same mean but a variance of .7, using Gaussian (normally) distributed values. This produces a much higher variance of excitatory net inputs for units in the hidden layer. There is also an increase in the total overall weight strength with the increase in variance, because there is more room for larger weights above the .25 mean, but not much more below it.
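To make the effect concrete, here is a small NumPy sketch (not the simulator's code) comparing the two weight distributions and the net inputs they produce. The exact parameterization is an assumption: the UNTRAINED case is treated as uniform with a half-range of .2 around the .25 mean, the TRAINED case as a Gaussian whose .7 "variance" parameter is treated as its standard deviation, clipped to a [0, 1] weight range, and the layer sizes and input activity level are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    n_inputs, n_hidden = 100, 20

    def net_input_stats(weights, activity_prob=0.25):
        """Mean and variance of the excitatory net input over random binary input patterns."""
        patterns = (rng.random((1000, n_inputs)) < activity_prob).astype(float)
        net = patterns @ weights            # shape: (patterns, hidden units)
        return float(net.mean()), float(net.var())

    # UNTRAINED: uniform values centered on .25 with a spread of .2
    w_untrained = rng.uniform(0.25 - 0.2, 0.25 + 0.2, size=(n_inputs, n_hidden))

    # TRAINED: Gaussian values centered on .25 with a much larger spread (.7),
    # clipped to the [0, 1] weight range.  The clipping is asymmetric (more room
    # above .25 than below it), which is why the total weight strength also rises.
    w_trained = np.clip(rng.normal(0.25, 0.7, size=(n_inputs, n_hidden)), 0.0, 1.0)

    print("UNTRAINED net input (mean, var):", net_input_stats(w_untrained))
    print("TRAINED   net input (mean, var):", net_input_stats(w_trained))

The TRAINED weights yield both a larger variance and a larger mean of the net input, which is exactly the change that the inhibition must absorb.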
Press Run to see what difference this makes for the
overall excitatory level.
You should observe a greater level of excitation using
the TRAINED weights compared to the UNTRAINED
weights.
You can verify that the system can compensate for
this change by increasing the g_bar_i.hidden to 8.
Time Constants and Feedforward Anticipation

We just saw that feedforward inhibition is important for anticipating and offsetting the excitation coming from the inputs to the hidden layer. In addition to this feedforward inhibitory connectivity, the anticipatory effect depends on a difference between excitatory and inhibitory neurons in their rate of updating, which is controlled by the vm_dt parameters dt.hidden and dt.inhib in the control panel (cf. section 2.4.5, equation 2.7). As you can see, the excitatory neurons are updated at .04 (slower), while the inhibitory neurons are updated at .15 (faster). The faster updating of the inhibitory neurons allows them to become activated more quickly by the feedforward input, and to send anticipatory inhibition to the excitatory hidden units before those units actually get activated. The faster time constant also enables inhibition to adapt more rapidly to changes in the overall excitation level. There is ample evidence that cortical inhibitory neurons respond faster to inputs than pyramidal neurons (e.g., Douglas & Martin, 1990).

To verify this, click on Defaults, set dt.inhib to .04, and then Run.

One other practical point about these update rate constants will prove to be an important advantage of the simplified inhibitory functions described in the next section: the rate constants must be set to be relatively slow to prevent oscillatory behavior.

To see this, press Defaults, and then set dt.inhib to .2 and dt.hidden to .1, and Run.

These oscillations are largely prevented with finer time scale updating (i.e., the slower default rate constants), because the excitatory neurons update their activity in smaller steps, to which the inhibitory neurons are better able to react smoothly.
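The anticipation effect can be illustrated with a toy two-unit loop. The sketch below is not the simulator's code: the weights and the simple linear rate dynamics are assumptions chosen for clarity, and only the rate constants (.04 and .15, versus .04 for both as in the exercise above) come from the text. Each unit moves a fraction dt of the way toward its current target on every step; with the faster dt.inhib, inhibition catches up with the feedforward input sooner, and the excitatory unit peaks at a lower level.

    # Toy sketch of the anticipatory effect of a faster inhibitory rate constant.
    # The weights (w_ff, w_fb, w_ie) and the linear dynamics are illustrative
    # assumptions; only the dt values are taken from the exercise.

    def run(dt_hidden, dt_inhib, inp=1.0, steps=400):
        w_ff, w_fb, w_ie = 0.5, 0.5, 1.0   # feedforward, feedback, and inhibitory weights
        e = i = 0.0                        # excitatory and inhibitory activations
        peak = 0.0
        for _ in range(steps):
            e_target = max(0.0, inp - w_ie * i)          # excitation opposed by inhibition
            i_target = max(0.0, w_ff * inp + w_fb * e)   # feedforward plus feedback drive
            e += dt_hidden * (e_target - e)              # slow excitatory update
            i += dt_inhib * (i_target - i)               # (normally faster) inhibitory update
            peak = max(peak, e)
        return round(peak, 3), round(e, 3)

    # Default: inhibition updates faster and anticipates the excitation.
    print("dt.inhib = .15 (peak, final):", run(dt_hidden=0.04, dt_inhib=0.15))
    # Exercise: equal rate constants; inhibition lags and excitation overshoots more.
    print("dt.inhib = .04 (peak, final):", run(dt_hidden=0.04, dt_inhib=0.04))

Both runs settle to the same final activation; the difference is in how far the excitatory unit overshoots before the inhibition catches up.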
Bidirectional Excitation
To make things simpler at the outset, we have so far been exploring a relatively easy case for inhibition, where the network does not have the bidirectional excitatory connectivity that overwhelmed the constant leak counterweight in section 3.4.3. Now, let's try running a network with two bidirectionally connected hidden layers (figure 3.22).
First, select Defaults to get back the default parameters, do a Run for comparison, and then set network to BIDIR_EXCITE instead of FF_EXCITE.
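For intuition about why bidirectional excitation is harder to control, the toy sketch below (hypothetical parameters, not the simulator's dynamics) treats each hidden layer as a single scalar activation. With only a constant leak opposing it, the mutual excitation drives both layers to their ceiling; with feedback inhibition that grows with the layers' own activity, the same loop settles at a moderate level.

    # Toy sketch: two mutually excitatory "layers" summarized as scalar activations.
    # All parameter values here are illustrative assumptions, not simulator values.

    def run(feedback_inhibition, steps=300, dt=0.1):
        a = b = 0.0
        inp, w, leak, k = 0.5, 1.0, 0.3, 1.0   # input drive, mutual weight, leak, inhibitory gain
        for _ in range(steps):
            gi_a = k * a if feedback_inhibition else 0.0   # inhibition tracks activity
            gi_b = k * b if feedback_inhibition else 0.0
            # each layer is driven by the input plus the other layer's activity,
            # opposed by the constant leak (and, optionally, feedback inhibition)
            a += dt * (inp + w * b - leak - gi_a - a)
            b += dt * (inp + w * a - leak - gi_b - b)
            a = min(max(a, 0.0), 1.0)   # keep activations in the [0, 1] range
            b = min(max(b, 0.0), 1.0)
        return round(a, 3)

    print("constant leak only: ", run(feedback_inhibition=False))  # pegged at the 1.0 ceiling
    print("feedback inhibition:", run(feedback_inhibition=True))   # settles near 0.2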