so that an expected activity level α of .5 will be the same as standard CPCA, but smaller values will produce relatively larger weights.

Finally, to establish a continuum between the basic CPCA learning rule without renormalization and the version with it, we introduce a parameter q_m (savg_cor in the simulator, which stands for "sending average (activation) correction"), which is between 0 and 1 and specifies how close to the actual α we should go (starting at .5) in compensating for the expected activity level. Thus, we compute the effective activity level that we will use in equation 4.19 as follows:

    α_m = .5 - q_m (.5 - α)    (4.20)

and then compute m as m = .5 / α_m. So, when q_m = 0, no renormalization occurs, and when q_m = 1, the weights are maximally renormalized to fit the 0-1 range.

4.7.2 Contrast Enhancement

Whereas renormalization addresses the dynamic range issue, contrast enhancement addresses the selectivity problem with the basic CPCA algorithm. The idea is to enhance the contrast between the stronger and weaker correlations, such that the weights predominantly reflect the contribution of the stronger correlations, while the weaker correlations are ignored or minimized. This further accentuates the existing tendency of CPCA to represent the principal component of the correlations. Thus, contrast enhancement can be seen as accentuating the extent to which the transformations learned by the hidden units emphasize some aspects of the input pattern and deemphasize others, which we discussed in chapter 3 as one of the most important functions that hidden units perform.

Contrast enhancement constitutes an important source of bias in favor of a simpler (more parsimonious) representation of the strongest, most basic structure of the environment over its weaker and more subtle aspects. Clearly, an appropriate balance between simplicity and fidelity needs to be struck (as we will discuss at greater length), but the basic CPCA equation tends to be too heavily weighted toward fidelity, so there is a benefit to imposing an additional simplicity bias.

Contrast enhancement can also be understood in the context of the effects of soft weight bounding, which is a property of the CPCA algorithm. Because the weights are constrained to live within the 0-1 range, and they approach these extremes exponentially slowly, it is difficult for units to develop highly selective representations that produce strong activation for some input patterns and weak activation for others. Contrast enhancement counteracts this limitation by expanding the rate of change of weight values around the intermediate range, while still retaining the advantages of soft weight bounding.

We implement contrast enhancement using a sigmoidal function, which provides an obvious mechanism for contrast enhancement as a function of the gain parameter that determines how sharp or gradual the function is. This function can enhance the contrast between weight values by transforming the linear relationship between weight values (which in turn are conditional probabilities computed by the CPCA algorithm) into a sigmoidal nonlinear relationship mediated by a gain parameter. Biologically, this would amount to a differential sensitivity to weight changes for weight values in the middle range as opposed to the extremes, which is plausible but not established.

Note that this kind of contrast enhancement on the weights is not equivalent to the effects of the more standard gain parameter on the activation values. Changing the activation gain can make the unit more or less sensitive to differences in the total net input (the sending activations times the weights). In contrast, changing the contrast enhancement of the weights affects each weight value separately, and allows the unit to be more sensitive at the level of each individual input instead of at the level of the total input. Put another way, weight contrast enhancement gives a unit a more sensitive filter or template for detecting patterns over its inputs, whereas activation contrast enhancement just makes the net response more sensitive around the threshold, but does not increase the contrast of the signal coming into the unit.

In the simulator, we implement weight contrast enhancement by introducing an effective weight ŵ_ij, which is computed from the underlying linear weight w_ij using the sigmoidal function just described.
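To make the two mechanisms concrete, here is a minimal Python sketch. The renormalization functions follow equation 4.20 and the factor m = .5/α_m directly from the text; the sigmoidal weight function is an assumed form (the specific expression, the offset parameter `off`, and the default gain value are illustrative choices, not taken from this section):

```python
def effective_savg(alpha, savg_cor):
    """Effective sending activity level, equation 4.20:
    alpha_m = .5 - q_m * (.5 - alpha), where savg_cor plays the role of q_m."""
    return 0.5 - savg_cor * (0.5 - alpha)

def renorm_factor(alpha, savg_cor):
    """Renormalization multiplier m = .5 / alpha_m.
    savg_cor = 0 -> m = 1 (no renormalization);
    savg_cor = 1 -> m = .5 / alpha (maximal renormalization)."""
    return 0.5 / effective_savg(alpha, savg_cor)

def contrast_enhance(w, gain=6.0, off=1.0):
    """Sigmoidal effective weight (an assumed form):
    w_hat = 1 / (1 + (off * (1 - w) / w) ** gain).
    Larger gain sharpens the contrast around w = .5; with off = 1,
    w = .5 maps to .5 while values above/below it are pushed apart."""
    w = min(max(w, 1e-6), 1.0 - 1e-6)  # keep the linear weight inside (0, 1)
    return 1.0 / (1.0 + (off * (1.0 - w) / w) ** gain)
```

For example, with a sparse sending layer (alpha = .25) and full correction (savg_cor = 1), weight increases are scaled by m = 2, restoring the dynamic range; and a moderately strong linear weight of .7 is enhanced to roughly .99 while a weak weight of .3 is suppressed toward 0, illustrating how each individual weight, not just the total net input, becomes more selective.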