The next test sentence evaluates the online updating of information in a case where subsequent information further constrains an initially vague word. In this case, the sentence starts with the word child, and the network vacillates back and forth about which child it reports in answer to the agent question. When the network receives the adverb daintiness, this uniquely identifies the schoolgirl, which it then reports as the agent of the sentence (even though it does not appear to fully encode the daintiness input, producing pleasure instead).
Step through this next sentence.

To verify that daintiness is having an effect on this result, we can run the next control condition, where the pitcher is specified as the agent of the sentence: the network clearly switches from saying pitcher to saying schoolgirl after receiving the daintiness input.

Step through the sentence.
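To make the idea of online updating concrete, the following toy sketch shows a belief over possible agents being sharpened word by word. The word sequence, the alternative child referent, and all of the numbers are invented for illustration; the actual network learns such constraints implicitly from its training corpus rather than from explicit probabilities.

import numpy as np

# Toy sketch of online updating: each incoming word multiplies in a
# likelihood over the possible agents, so an initially vague "child"
# gets pinned down once "daintiness" arrives. All numbers are invented.
agents = ["schoolgirl", "other child"]   # second referent is a stand-in
belief = np.array([0.5, 0.5])            # "child" alone is ambiguous

# Invented per-word likelihoods P(word | agent).
likelihoods = {
    "child":      np.array([0.5, 0.5]),
    "ate":        np.array([0.5, 0.5]),
    "soup":       np.array([0.6, 0.4]),
    "daintiness": np.array([0.95, 0.05]),  # strongly implies the schoolgirl
}

for word in ["child", "ate", "soup", "daintiness"]:
    belief = belief * likelihoods[word]
    belief = belief / belief.sum()
    print(f"after '{word}':", dict(zip(agents, np.round(belief, 2))))

The only point is that a single strongly diagnostic word (here daintiness) can resolve an ambiguity that the earlier words left open, which is what the network's vacillation followed by a sudden commitment to schoolgirl reflects.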
The next two sentences test the network's ability to resolve ambiguous words, in this case throw and ball, based on the surrounding semantic context. During training, the network learns that busdrivers throw baseballs, whereas teachers throw parties. Thus, the network should produce the appropriate interpretation of these ambiguous sentences.
Step through these next two sentences to verify that
this is the case.
Note that the network makes a mistake here by replacing teacher with the other agent that also throws parties, the schoolgirl. Thus, the network's context memory is not perfect, but it tends to make semantically appropriate errors, just as people do.
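The disambiguation itself rests on nothing more exotic than the co-occurrence statistics of the training corpus. The sketch below illustrates the idea with a hand-built event list and simple counts; the events are invented stand-ins, and the real network encodes the same statistics implicitly in its weights rather than in a lookup table.

from collections import Counter

# Invented mini-corpus of (agent, verb, patient-sense) events.
training_events = [
    ("busdriver", "throw", "baseball"),
    ("busdriver", "throw", "baseball"),
    ("teacher", "throw", "party"),
    ("teacher", "throw", "party"),
    ("schoolgirl", "throw", "party"),
]
counts = Counter(training_events)

def interpret_ball(agent, verb="throw"):
    """Pick the sense of the ambiguous word most often seen with
    this agent and verb during training."""
    senses = ("baseball", "party")
    return max(senses, key=lambda s: counts[(agent, verb, s)])

print(interpret_ball("busdriver"))  # -> baseball
print(interpret_ball("teacher"))    # -> party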
The next test sentence probes the ability of the network to instantiate an ambiguous term (e.g., someone) with a more concrete concept. Because the teacher only kisses males (the pitcher or the busdriver), the network should be able to instantiate the ambiguous someone with either of these two males.
As you Step through this sentence, observe that someone is instantiated with pitcher.
The final test sentence illustrates how the network
deals with conflicting information. In this case, the
training environment always specifies that iced tea is
drunk in the living room, but the input sentence says
it was drunk in the kitchen.
Step through this sentence.
Notice that in the middle of the sentence, the network swaps stirred for drank, only to correct this error (without further input of drank) at the end. The main point is that when kitchen is input, the network responds with living room, consistent with its prior knowledge. This may provide a useful demonstration of how prior knowledge biases sentence comprehension, as has been shown in the classic “war of the ghosts” experiment (Bartlett, 1932) and many others.
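To make the logic of this prior-knowledge bias concrete, here is a toy sketch of how a strong learned prior over locations can outweigh weak evidence from the input word kitchen. The probabilities are invented; the model itself acquires the bias from the statistics of its training corpus rather than from explicit numbers.

import numpy as np

# Toy illustration of prior knowledge overriding the stated input.
locations = ["living room", "kitchen"]

# Strong learned prior: iced tea is (almost) always drunk in the living room.
prior = np.array([0.95, 0.05])

# Weak, noisy evidence contributed by the word "kitchen" in the input.
input_evidence = np.array([0.35, 0.65])

posterior = prior * input_evidence
posterior /= posterior.sum()

for loc, p in zip(locations, posterior):
    print(f"{loc}: {p:.2f}")  # the prior wins: living room comes out on top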
A similar phenomenon can be found in the role elaboration test questions. Here, the network is able to answer questions about aspects of an event that were not actually stated in the input. For example, the network can infer that the schoolgirl would eat crackers with her fingers.

Step through the next sentence.
You should see that the very last question regarding
the instrument role is answered correctly with fingers,
even though fingers was never presented in the input.
The next sentence takes this one step further and has the
network infer what the schoolgirl tends to eat (soup).
Go ahead and Step through this one.

Question 10.13 In chapter 9, we discussed a mechanism for using partial cues to retrieve an original stored memory. (a) Explain the network's role elaboration performance in terms of this mechanism. (b) Based on what you know about the rates of learning of different brain areas, speculate about where in the brain role elaboration might take place, depending on how familiar the information in question is.
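As a reminder of the kind of mechanism the question refers to, the sketch below shows pattern completion from a partial cue: the roles that were stated retrieve the best-matching stored event, which then supplies the roles that were not stated. The stored events are invented examples, not the model's actual training corpus.

stored_events = [
    {"agent": "schoolgirl", "action": "eat", "patient": "crackers",
     "instrument": "fingers"},
    {"agent": "busdriver", "action": "eat", "patient": "steak",
     "instrument": "knife"},
    {"agent": "teacher", "action": "drink", "patient": "iced tea",
     "instrument": "glass"},
]

def complete(cue):
    """Return the stored event that matches the cue on the most roles."""
    def overlap(event):
        return sum(event.get(role) == filler for role, filler in cue.items())
    return max(stored_events, key=overlap)

cue = {"agent": "schoolgirl", "action": "eat", "patient": "crackers"}
print(complete(cue)["instrument"])  # -> fingers, never given in the cue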
Nature of Representations
Having seen that the network behaves reasonably (if not perfectly), we can explore the nature of its internal representations to get a sense of how it works. First, we can probe the way that the Encode layer encodes the localist input representation of words into more useful distributed representations.
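Schematically, this localist-to-distributed mapping amounts to selecting one row of a weight matrix and passing it through the unit activation function. The sketch below uses random weights as stand-ins for the learned ones, and the vocabulary and layer sizes are made up.

import numpy as np

vocab = ["busdriver", "teacher", "schoolgirl", "pitcher", "throw", "ball"]
n_words, n_encode = len(vocab), 8

rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(n_words, n_encode))  # input-to-Encode weights

def encode(word):
    """Map a localist (one-hot) word input to a distributed Encode pattern."""
    one_hot = np.zeros(n_words)
    one_hot[vocab.index(word)] = 1.0
    return 1.0 / (1.0 + np.exp(-(one_hot @ W)))  # sigmoid unit activations

print(np.round(encode("busdriver"), 2))  # a graded, distributed pattern

With learning, words that are used similarly should come to have similar Encode patterns, which is what the cluster plot described next is meant to reveal.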
Press WordClust on the overall control panel.
There will be a short delay as all the unambiguous
verbs are presented and the corresponding activations
over the encoding layer are recorded, followed by a cluster plot of these activations.
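For readers following along without the simulator, roughly the same kind of plot can be produced from any set of recorded activation vectors with standard hierarchical clustering. The verbs and activation vectors below are stand-ins rather than the actual set of unambiguous verbs and recorded Encode activations.

import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage

rng = np.random.default_rng(1)
verbs = ["ate", "drank", "stirred", "threw", "kissed", "gave"]
activations = rng.random((len(verbs), 8))  # one recorded vector per verb

# Agglomerative clustering of the activation vectors, as in a cluster plot.
Z = linkage(activations, method="average", metric="euclidean")
tree = dendrogram(Z, labels=verbs, no_plot=True)
print(tree["ivl"])  # verbs listed in cluster (dendrogram leaf) order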