Thus, we can use the noisy XX1 activation function to
simulate the average or expected effects of a spiking
neuron with noise added to its membrane potential, or
to represent the collective effects of a population of
spiking neurons.
Figure 2.15 also shows that the noisy XX1 function
has the key property of a sigmoidal activation function
(e.g., box 2.1), namely the saturating nonlinearity dis-
cussed previously. The saturation is due in the spiking
case to the increasingly limiting effects of the refractory
period, where the potential must recover to threshold af-
ter being reset following a spike. Similar kinds of satu-
rating nonlinearities have been suggested by other anal-
yses of the neural spiking mechanism and other synap-
tic effects (Ermentrout, 1994; Abbott, Varela, Sen, &
Nelson, 1997). Another aspect of the noisy XX1 func-
tion is that it emphasizes small differences in the mem-
brane potential in the vicinity of the threshold, at the
expense of representing differences well above or be-
low the threshold. The gain parameter γ can shrink or
expand this sensitive region around the threshold. We
will see the importance of these aspects of the activation
function when we put neurons together in networks in
the next chapter.
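To illustrate why the reset-and-recover cycle produces a saturating spike rate, here is a minimal discrete-time sketch of a threshold-and-reset point unit. All parameter values (reversal potentials, leak conductance, threshold, step size) are illustrative assumptions, not the simulator's standard settings:

```python
def spike_rate(g_e, n_steps=5000):
    """Average spikes per time step of a simple threshold-and-reset
    point unit driven by a constant excitatory conductance g_e.

    Assumed illustrative parameters: excitatory/leak reversal
    potentials 1.0 / 0.15, leak conductance 0.1, firing threshold
    0.25, integration step 0.1.
    """
    e_e, e_l, g_l, theta, dt = 1.0, 0.15, 0.1, 0.25, 0.1
    v, n_spikes = e_l, 0
    for _ in range(n_steps):
        # Integrate excitatory and leak currents toward equilibrium.
        v += dt * (g_e * (e_e - v) + g_l * (e_l - v))
        if v > theta:        # threshold crossed: emit a spike...
            n_spikes += 1
            v = e_l          # ...and reset, forcing a recovery period
    return n_spikes / n_steps

# Rate rises quickly for inputs near threshold but saturates for
# strong input, because the potential must recover from the reset
# before the next spike can occur.
rates = [spike_rate(g) for g in (0.0, 0.1, 0.2, 5.0, 10.0)]
```

With these assumed parameters, increasing the excitatory conductance from 0.1 to 0.2 raises the rate much more than increasing it from 5.0 to 10.0, which is the saturating nonlinearity described above.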
[Figure 2.15 plot: spike rate (left axis, 0–50) and the noisy x/x+1 function (right axis, 0–1), both plotted against V_m − Q (−0.005 to 0.015).]
Figure 2.15: Average spiking rate as a function of equilib-
rium membrane potential above threshold (threshold written
as Q instead of Θ) with constant excitatory and inhibitory
conductances, compared with the noisy x/x+1 function for the
same conditions (equation 2.20). Note that due to aliasing ef-
fects resulting from the discretization of time, parameters had
to be altered from their standard values (e.g., as shown in fig-
ure 2.14). Nevertheless, the general form of the function is
obviously well captured by the noisy x/x+1 function.
The Gaussian noise probabilities are multiplied by the
points surrounding (and including) each point in the
activation function, and these multiplied neighbors are
added together to give the new value at that point.
Thus, these new values
reflect the probabilities with which neighboring points
could jump (up or down) to a given point when noise
is added. This is the same process used in the “blur-
ring” or “smoothing” operations in computerized im-
age manipulation programs. The result of this operation
is shown in figure 2.14. We call this new function the
noisy-X-over-X-plus-1 or noisy XX1 function.
As we expected, the noisy XX1 function has a softer
threshold, which gradually curves up from zero instead
of starting sharply at the threshold point as in the orig-
inal function. This is important for giving the neurons
a graded overall activation function, imparting all the
advantages of gradedness as discussed in section 1.6.2.
Note that this also means that there is some activity as-
sociated with subthreshold membrane potentials (i.e.,
due to noise occasionally sending it above threshold).
Another effect is that noise somewhat reduces the gain
(sharpness) of the activation function.
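The blurring operation just described can be sketched as a discrete convolution of the sharp x/x+1 function with a normalized Gaussian kernel. The gain, threshold, and noise width below are illustrative assumptions, not the standard parameter values:

```python
import numpy as np

GAIN, THETA = 600.0, 0.25   # assumed gain (gamma) and threshold

def xx1(v):
    """Sharp-threshold x/(x+1) function of membrane potential."""
    x = GAIN * np.maximum(v - THETA, 0.0)
    return x / (x + 1.0)

def noisy_xx1(v_grid, noise_sd=0.005):
    """Convolve xx1 with a normalized Gaussian kernel: each output
    value is the probability-weighted sum of its neighbors, the same
    'blurring' operation used in image manipulation programs.

    Note: values within half a kernel width of the grid edges are
    biased by the implicit zero padding of mode="same".
    """
    dv = v_grid[1] - v_grid[0]
    k_v = np.arange(-4 * noise_sd, 4 * noise_sd + dv / 2, dv)
    kernel = np.exp(-0.5 * (k_v / noise_sd) ** 2)
    kernel /= kernel.sum()      # probabilities sum to 1
    return np.convolve(xx1(v_grid), kernel, mode="same")

v = np.linspace(THETA - 0.05, THETA + 0.05, 1001)
sharp, soft = xx1(v), noisy_xx1(v)
```

The blurred curve rises gradually from below the threshold (giving some subthreshold activity) instead of starting sharply at it, and its slope near threshold is reduced, matching the gain-reduction effect noted above.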
Figure 2.15 shows that the noisy XX1 function pro-
vides a good overall fit to the rate of discrete spiking
in a unit with noise added to the membrane potential.
2.5.5 Summary
We have now covered all the major components of com-
putation at the level of the individual neuron, includ-
ing the computation of excitatory inputs as a weighted
function of sending unit activity, the integration of ex-
citatory, inhibitory, and leak forces (conductances), and
the thresholded, saturating activation output. We refer
to the collection of these equations as the point neuron
activation function or just the Leabra activation func-
tion. The major steps in this function are summarized
in box 2.2.
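A compact sketch of those steps, with all numerical constants chosen as illustrative placeholders rather than the actual Leabra defaults:

```python
import numpy as np

# Illustrative constants (assumed, not Leabra's published defaults).
E_E, E_I, E_L = 1.0, 0.15, 0.15   # excitatory/inhibitory/leak reversals
G_L, THETA, GAIN, DT = 0.1, 0.25, 600.0, 0.2

def point_neuron(acts, weights, g_i, n_steps=200):
    """Point neuron activation function in three summarized steps:
    (1) excitatory input as a weighted function of sending activity,
    (2) integration of excitatory, inhibitory, and leak conductances,
    (3) a thresholded, saturating x/(x+1) output.
    """
    acts = np.asarray(acts, dtype=float)
    weights = np.asarray(weights, dtype=float)
    g_e = float(acts @ weights) / len(weights)        # step 1: net input
    v_m = E_L                                         # start at rest
    for _ in range(n_steps):                          # step 2: integrate
        i_net = (g_e * (E_E - v_m) + g_i * (E_I - v_m)
                 + G_L * (E_L - v_m))
        v_m += DT * i_net
    x = GAIN * max(v_m - THETA, 0.0)                  # step 3: output
    return v_m, x / (x + 1.0)
```

With strong excitation the output saturates near 1; with no input the potential settles at the leak reversal potential and the output stays at zero; sufficient inhibition holds the potential below threshold.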
2.6 Explorations of the Individual Neuron
Now we will use the simulator to explore the proper-
ties of individual neurons as implemented by the point
neuron activation function just described. Before you
begin, there are two important prerequisites: First, the
software must be properly installed, and second, it will