Do View, TEST_LOG, which brings up the
Trial_1_TextLog. Press the full-forward (>|) VCR
button at the top to see the previous trial.
The translation of the actual output produced is
shown in the output column, and the identity of the
input pattern is shown in the Event column.
The event is coded by an initial number representing
the index (0 to 388) of the verb, followed by its target
pronunciation, followed by a code indicating inflection
type (1-5 in the order of table 10.10) and the regular-
ity status of the verb (1 = irregular, 2 = regular). Thus,
the 11 here represents the base inflection of an irreg-
ular verb. You can also see that there were no errors
( sum_se is 0), and that the network took 30 cycles to
produce the output.
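As a sanity check on the coding scheme just described, the two-digit inflection/regularity code can be decoded mechanically. The sketch below is illustrative only: the label names and the assumption that the code arrives as a two-character string are inferences for illustration (table 10.10 is not reproduced here); only the structure, an inflection digit 1-5 followed by a regularity digit 1-2, comes from the text.

```python
# Decode the event code described above. The label names are assumptions;
# only the digit structure (inflection 1-5, regularity 1-2) is from the text.
INFLECTIONS = {1: "base", 2: "past", 3: "third singular",
               4: "progressive", 5: "past participle"}
REGULARITY = {1: "irregular", 2: "regular"}

def decode_event_code(code):
    """Split a code like '11' into (inflection, regularity) labels."""
    inflection_digit, regularity_digit = int(code[0]), int(code[1])
    return INFLECTIONS[inflection_digit], REGULARITY[regularity_digit]

print(decode_event_code("11"))  # -> ('base', 'irregular')
```

Under this reading, the 11 in the log decodes directly to the base inflection of an irregular verb, matching the example in the text.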
Looking back at the network, notice the 4 active units
next to each other in the second to last row of the seman-
tic input. These 4 units indicate that the base inflection
is to be produced. When you step to the next word,
you will see that the 4 units adjacent to the previous 4
are now activated, indicating the past inflection is to be
produced (i.e., “was” in this case).
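The layout just described, adjacent groups of 4 semantic units with one group per inflection, can be sketched as follows. The group size of 4 comes from the text; the left-to-right ordering of the five inflection groups is an assumption for illustration.

```python
# Sketch of the inflection portion of the semantic input: five adjacent
# groups of 4 units, one group per inflection type (ordering assumed).
N_PER_GROUP = 4
INFLECTIONS = ["base", "past", "third singular", "progressive", "past participle"]

def inflection_units(name):
    """Return a 20-element 0/1 list with the named group's 4 units active."""
    vec = [0] * (N_PER_GROUP * len(INFLECTIONS))
    start = INFLECTIONS.index(name) * N_PER_GROUP
    for i in range(start, start + N_PER_GROUP):
        vec[i] = 1
    return vec

print(inflection_units("past"))  # units 4-7 active, all others off
```

Stepping from the base inflection to the past inflection thus amounts to shifting which group of 4 adjacent units is active.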
Click on the 3 adjacent past-tense units to the right
of the previous one.
You should observe that indeed the sending weight
patterns are quite consistent. You can also check that
the other inflectional semantics units consistently tend
to activate a different set of hidden units (except that the
past-participle units seem to activate a subset of the same
units activated by the past-tense units).
Click on all the other inflectional semantics units and
observe the sending weight patterns to the hidden layer.
Now, let's go back to the first past-tense inflectional
unit and mark the most strongly connected hidden units
for subsequent probing. We do this by selecting units
according to the strength of the weights.
Press Selections/Select Units in the network
window (right-hand-side menu). In the popup window,
type in s.wt for the variable, select > for the relation-
ship (rel), and .5 for the comparison value (cmp_val).
You should see the three “brightest” hidden units are
now selected. We will try to interpret the weights for
one of these units.
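In effect, Select Units applies a simple threshold test to the sending weights. A minimal stand-in for that operation, with made-up weight values, looks like this:

```python
# Made-up sending weights, one value per hidden unit; the .5 threshold
# mirrors the cmp_val entered in the Select Units dialog above.
s_wt = [0.10, 0.72, 0.05, 0.64, 0.30, 0.91]
selected = [i for i, w in enumerate(s_wt) if w > 0.5]
print(selected)  # -> [1, 3, 5]
```

The selected indices correspond to the "brightest" units in the display, i.e., those receiving the strongest weights from the chosen semantic unit.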
StepTest to “was,” and then continue to StepTest
through the remaining inflections of this verb (“is,”
“being,” and “been”).
Now, let's skip to a regular word, “care” (/kAr/),
and see all five of its inflections (i.e., “care,” “cared,”
“cares,” “caring,” and “cared”).
Click on the second from the left of the three se-
lected hidden units.
You can now see the sending weights for this unit
— you should see that this unit doesn't favor any par-
ticular onset phonemes (all are uniformly weighted),
but it has a clear pattern of weighting for particular
coda phonemes. Let's try to interpret the pattern of
weighting for the last non-inflection coda slot (second
from the last slot, where each slot is 2 columns wide),
which is where the regular past-tense inflection (“-ed,”
pronounced with either a /t/ or /d/ phoneme) is ex-
pressed. We can compare the consonant patterns for
these phonemes with the weight pattern.
Press the GoTo button and enter 600. Then,
StepTest through the 5 inflections.
Now that we can see that the network has learned the
task, let's try to analyze some of the connectivity to de-
termine how it is working. Because we are most inter-
ested in the past-tense mapping, we will focus on that
first. To find out which hidden units are most selective
for the past-tense inflectional semantics, we will look
at the sending weights from the past-tense inflectional
semantics units to the hidden layer.
Press the View button on the pt_ctrl control panel,
and select CONSONANTS. Click on t and then, with
the middle button (or shift and left button), on d and n.
You should see that the pattern of weights is consis-
tent with the idea that this past-tense hidden unit pro-
duces the regular “-ed” inflection. It is not immedi-
ately clear why the /n/ phoneme is also supported by
the weights, but it could just be because it is very sim-
ilar overall to the /t/ and /d/ phonemes. Next, we can
look at the final inflectional phoneme slot.
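The suggestion that /n/ is supported because of its overall similarity to /t/ and /d/ can be illustrated with a toy similarity measure. The feature sets below are simplified phonological features chosen for illustration; they are not the consonant feature encoding the model actually uses.

```python
# Toy articulatory feature sets (simplified; not the model's actual encoding).
PHON = {
    "t": {"coronal", "stop", "voiceless"},
    "d": {"coronal", "stop", "voiced"},
    "n": {"coronal", "nasal", "voiced"},
    "k": {"dorsal", "stop", "voiceless"},
}

def jaccard(a, b):
    """Feature overlap: |A & B| / |A | B|."""
    return len(a & b) / len(a | b)

# /n/ overlaps substantially with /d/ (and somewhat with /t/),
# but not at all with a non-coronal stop like /k/.
print(jaccard(PHON["n"], PHON["d"]))  # 2 shared of 4 total -> 0.5
print(jaccard(PHON["n"], PHON["t"]))  # 1 shared of 5 total -> 0.2
print(jaccard(PHON["n"], PHON["k"]))  # -> 0.0
```

Even on this crude measure, /n/ sits close to /d/ and /t/ in feature space, so weights supporting the /t/ and /d/ patterns would partially support /n/ as well.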
Click on s.wt in the network (sending weights), and
then click on the first past-tense inflectional semantics
unit (fifth unit from the left in the second-to-last row).
One interesting question is whether there is consis-
tency in these weights across all 4 past-tense inflection
units — we would expect this from Hebbian learning,
but not necessarily in a purely error-driven network,
which does not produce consistent weight patterns
(chapter 6).