tual filtered text itself is in eccn_lg_f5.cln, providing
a sense of the "telegraphic" nature of the input
the network receives.
In addition, the network has no sense of word order or
any other syntactic information, so it is really operating
at the level of the “gist.” That the network is operating
on such minimal input makes its performance all the
more impressive. Nevertheless, achieving substantially
better levels of comprehension will likely require a sub-
stantially more complex network that processes more of
the information available in real texts.
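The reduction described above, where each paragraph is stripped to an unordered collection of content words, can be sketched as a simple bag-of-words transformation. The stopword list and filtering rules below are illustrative assumptions, not the actual filter used to produce the simulation's input.

```python
import re

# Illustrative stopword list (an assumption, not the real filter).
STOPWORDS = {"the", "of", "a", "is", "in", "on", "and", "to", "it", "was"}

def paragraph_to_gist(paragraph):
    """Reduce a paragraph to its unordered "gist": the sorted set of
    content words, discarding word order, duplicates, and syntax."""
    words = re.findall(r"[a-z]+", paragraph.lower())
    return sorted(set(w for w in words if w not in STOPWORDS))

gist = paragraph_to_gist("The activity of the unit is reflected in the net input.")
print(gist)  # word order and duplicates are gone; only the "gist" remains
```

Note that after this transformation the network cannot distinguish "the unit's activity" from "the activity's unit"; it sees only which words were present.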
Table 10.11 word list (for the lower-leftmost hidden unit; caption below):
act activation activations algorithm allow already approaches arbitrary areas
argue artificial aspect assess atoms balance based basic behavior bias
biological biologically biology bit body calcium called capable categories cell
channel channels charge chl closed cognition combination combine
complicated components computational compute computed concentrations
conductance confused consider consistent constantly contain continuous
correlations corresponding cpca critical cross current currents demands derivative
described detailed detector detectors determined difference different diffusion
discussed divide division electrical electricity emphasize encoded enter
environment equal equation etc exceeds excitatory explain extent family fast favor
fires firing flow follow forces form function functioning generally generec
hand hebbian help hypothesis identify imagine implement implementing
important include including inconsistent indicated individual influence
information inhibitory input inputs integrates integrating integration interactivity
interested ion ions kind labor language largely last later leak learning leaving let
level likelihood liquid magnitude major manner matches mathematically
matter mechanisms membrane memories minus model modeling models
modification movement name need negative net network networks neural neuron
neurons non notice now number numbers occur open opened opposite
oriented parallel pass pattern phase phonology picture plus positive possibly
potassium potential practical prevents principles processing properties
purposes put rapid rate reasons recall referred reflected reflects relevant remains
research responding result rule same saw say see sends separate showed shown
sign simple slow soft special specific specifically states strongest suggested
summarized summary survival synaptic textbook things think threshold time
times towards tradeoff turn type types typically understanding updated
variable versa version via vice voltage vowel weights work world writing
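A word list like the one in Table 10.11 can be computed by thresholding one hidden unit's receiving weights, which is what the GetWordRF button described below does. The function name, words, and weight values in this sketch are invented for illustration; only the thresholding idea comes from the text.

```python
# Sketch of a GetWordRF-style computation: given one hidden unit's
# receiving weights (one weight per input word), return the alphabetized
# words whose weight exceeds rf_thresh. Words and weights are made up.
def get_word_rf(weights, words, rf_thresh=0.5):
    """Alphabetized list of words with weight > rf_thresh."""
    return sorted(w for w, wt in zip(words, weights) if wt > rf_thresh)

words = ["act", "activation", "aardvark", "zebra", "weights"]
weights = [0.91, 0.88, 0.02, 0.10, 0.77]
print(get_word_rf(weights, words))  # → ['act', 'activation', 'weights']
```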
10.6.2 Exploring the Model
[Note: this simulation requires a minimum of 128Mb of
RAM to run.]
Open the project sem.proj.gz in chapter_10 to
begin.
As before, the network is a skeleton, and must be
built and the trained weights loaded.
Table 10.11: List of words with weights > .5 for the lower-
leftmost hidden unit, as produced by the GetWordRF button.
Do LoadNet on the sem_ctrl control panel.
Individual Unit Representations
To start, let's examine the weights of individual units in
the network.

Select r.wt, and then select various hidden units at
random to view.

You should observe sparse patterns of weights, with
different units picking up on different patterns of words
in the input. However, because the input units are too
small to be labeled, you can't really tell which words a
given unit is activated by. The GetWordRF button (get
receptive field in terms of words) on the sem_ctrl
control panel provides a work-around to this problem.

View the weights for the lower-leftmost hidden unit,
and then hit the GetWordRF button.

A String_Array window will pop up, containing
an alphabetized list of the words that this hidden unit
receives from with a weight above the rf_thresh of
.5 (table 10.11). You can resize this window to see more
of the words at one time, and you can use the middle
mouse button to scroll the text within one field.

One of the most interesting things to notice here is
that the unit represents multiple roughly synonymous
terms. For example, you should see the words "act,"
"activation," and "activations" in the fields numbered
0-2 in the window. By scrolling through this list you
should see many more examples of this.

Question 10.10 List some other examples of roughly
synonymous terms represented by this unit.

This property of the representation is interesting for
two reasons. First, it indicates that the representations
are doing something sensible, in that semantically re-
lated words are represented by the same unit. Second,
these synonyms probably do not occur together in the
same paragraph very often. Typically, only one version
of a given word is used in a given context. For example,
"The activity of the unit is..." may appear in one para-
graph, while "The unit's activation was..." may appear
in another. Thus, for such representations to develop,
they must be based on the similarity of the general contexts
in which similar words appear (e.g., the co-occurrence
of "activity" and "activation" with "unit" in the previous
example). This generalization of the semantic similar-
ity structure across paragraphs is essential to enable the