To determine whether it is the interaction between "attention" and "binding" producing this increase, we need to test with "binding" alone.
Do an ActProbe with “binding” as the first word set,
and “invariant object recognition” again as the second.
The similarity drops back to .288. Thus, there is something special about the combination of "attention" and "binding" together that is not present when using each of them alone. Now if we instead probe with "attention competition" as compared to "invariant object recognition," we should activate a different sense of attention, and get a smaller cosine.
Do an ActProbe with “attention competition” as the
first word set, and “invariant object recognition” again as
the second.
The similarity does now decrease, with a cosine of only around .114. Thus, we can see that the network's activation dynamics can be influenced to emphasize different senses of a word. This is therefore a potentially very powerful and flexible form of semantic representation, combining rich, overlapping distributed representations with activation dynamics that can magnify or diminish the similarities of different word combinations.
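In code terms, each of these probe comparisons reduces to the cosine between two hidden activation vectors. The following is a minimal sketch of that computation, not the simulator's own code; probe_activation here is a hypothetical stand-in for presenting a word set to the network, letting it settle, and reading out the hidden-layer pattern as a vector:

import numpy as np

def cosine(a, b):
    # Cosine similarity between two activation vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def compare_probes(probe_activation, words_a, words_b):
    # Cosine between the hidden patterns evoked by two word sets,
    # e.g. "attention binding" vs. "invariant object recognition".
    return cosine(probe_activation(words_a), probe_activation(words_b))

Run once per probe pair, this reproduces the kind of comparison reported above (.288 vs. .114).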
Question 10.12 Think of another example of a word that has different senses (that is well represented in this textbook), and perform an experiment similar to the one we just performed to manipulate these different senses. Document and discuss your results.

0. neural activation function
   A spiking rate code membrane potential point
   B interactive bidirectional feedforward
   C language generalization nonwords
1. transformation
   A emphasizing distinctions collapsing differences
   B error driven hebbian task model based
   C spiking rate code membrane potential point
2. bidirectional connectivity
   A amplification pattern completion
   B competition inhibition selection binding
   C language generalization nonwords
3. cortex learning
   A error driven task based hebbian model
   B error driven task based
   C gradual feature conjunction spatial invariance
4. object recognition
   A gradual feature conjunction spatial invariance
   B error driven task based hebbian model
   C amplification pattern completion
5. attention
   A competition inhibition selection binding
   B gradual feature conjunction spatial invariance
   C spiking rate code membrane potential point
6. weight based priming
   A long term changes learning
   B active maintenance short term residual
   C fast arbitrary details conjunctive
7. hippocampus learning
   A fast arbitrary details conjunctive
   B slow integration general structure
   C error driven hebbian task model based
8. dyslexia
   A surface deep phonological reading problem damage
   B speech output hearing language nonwords
   C competition inhibition selection binding
9. past tense
   A overregularization U shaped curve
   B speech output hearing language nonwords
   C fast arbitrary details conjunctive

Table 10.13: Multiple-choice quiz given to the network based on this text. The network compares the activation pattern for each answer with that of the question and selects the closest fitting one.
A Multiple-Choice Quiz
Finally, we can run an automated multiple-choice quiz on the network. We created ten multiple-choice questions, shown in table 10.13. Note the telegraphic form of the quiz, as it contains only the content words that the network was actually trained on. The best answer is always A, and B was designed to be a plausible foil, while C is obviously unrelated (unlike people, the network can't pick up on these regularities across test items).
The quiz is presented to the network by first presenting the "question," recording the resulting hidden activation pattern, and then presenting each possible answer and computing the cosine of the resulting hidden activation with that of the question. The answer that has the closest cosine is chosen as the network's answer.
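Concretely, this selection rule amounts to an argmax over cosines, as in the following minimal sketch. This is not the simulator's implementation; hidden_for is a hypothetical stand-in for presenting a word string to the network and recording its hidden activation pattern:

import numpy as np

def cosine(a, b):
    # Cosine similarity between two activation vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def answer_question(hidden_for, question, options):
    # Record the hidden pattern for the question, then for each option,
    # and choose the option whose pattern is closest to the question's.
    q = hidden_for(question)
    sims = [cosine(q, hidden_for(opt)) for opt in options]
    return "ABC"[int(np.argmax(sims))], sims

# Question 0 of table 10.13, for example:
# answer_question(hidden_for, "neural activation function",
#                 ["spiking rate code membrane potential point",
#                  "interactive bidirectional feedforward",
#                  "language generalization nonwords"])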
To run the quiz, first open up a log by doing View, QUIZ_LOG. Then, open up a process control panel by doing View, QUIZ_PROCESS_CTRL. The NEpoch_2 process control panel will appear. Do a ReInit and a Step.
This presents the first question to the network ("neural activation function").
Step 3 more times, once for each of the possible answers. After the last step, you should see the Epoch_2_TextLog update, with a record of the cosine distances for each of the answers compared to the question, and the answer that the network came up with.