not really permitted in a content-addressable memory), forgetting is functionally acceptable if the items deleted do not result in any (or result in only a few) dead ends in the associative data network. As a solution, Anderson (1983) proposed a frequency-based approach, namely to bleach from memory those data which are never or only very rarely activated by the cognitive operations of the agent. 21 The procedure resembles garbage collection in programming languages such as LISP, Java, or Oberon.
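The frequency-based approach can be sketched in a few lines. The following is a minimal illustration only, assuming a Word Bank modeled as a dict mapping core values to token lines (lists of proplet dicts) and a hypothetical per-proplet `activations` counter maintained by the agent's cognitive operations; neither assumption is part of the original design.

```python
# Sketch of frequency-based bleaching (Anderson 1983 style): delete those
# proplets that are never or only very rarely activated.
# `word_bank`, token lines, and the `activations` counter are assumptions
# made for illustration.

def bleach(word_bank, threshold=1):
    """Delete proplets activated fewer than `threshold` times."""
    for core, token_line in word_bank.items():
        word_bank[core] = [p for p in token_line
                           if p["activations"] >= threshold]

wb = {"garlic": [{"prn": 12, "activations": 0},   # never activated: bleached
                 {"prn": 30, "activations": 5}]}  # often activated: kept
bleach(wb)
# wb["garlic"] now holds only the proplet with prn 30
```

Like a garbage collector, the routine frees storage without any intervention by the agent; the threshold controls how aggressively rarely used content is forgotten.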
In a Word Bank, the many dispersed little spaces created by bleaching may be made reusable by pushing the proplets in a token line to the left, like pearls on a string (defragmentation). In this way, the intrinsic ordering of the proplets' storage positions is maintained, and the contiguous spaces reclaimed at the right-hand end of the token lines are suitable for new storage at the now front. Furthermore, the declarative addresses coding inter-proplet relations (Sect. 4.4) provide a straightforward solution to the recomputation of the pointers to new physical storage locations, which becomes necessary after bleaching and defragmentation. The periods in which bleaching, defragmentation, and recomputation are performed in artificial agents may be likened to the periods of sleep in natural agents.
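The two steps, compaction and pointer recomputation, can be sketched as follows. This is an illustrative model only: it assumes that bleached slots are marked `None` and that addresses are resolved through a per-line index from prn value to storage position, neither of which is specified in the text.

```python
# Sketch of token-line defragmentation with pointer recomputation.
# None marks a slot emptied by bleaching (an assumption for illustration).

def defragment(token_line):
    """Push surviving proplets to the left, like pearls on a string."""
    survivors = [p for p in token_line if p is not None]
    freed = len(token_line) - len(survivors)  # contiguous space at the right-hand end
    return survivors, freed

def rebuild_index(token_line):
    """Recompute prn -> physical-position pointers after defragmentation."""
    return {p["prn"]: pos for pos, p in enumerate(token_line)}

line = [{"prn": 3}, None, {"prn": 7}, None, {"prn": 9}]
compacted, freed = defragment(line)
index = rebuild_index(compacted)
# compacted == [{"prn": 3}, {"prn": 7}, {"prn": 9}]; freed == 2
# index == {3: 0, 7: 1, 9: 2}
```

Because compaction preserves the relative order of the proplets, rebuilding the index is a single left-to-right pass; no search or sorting is required.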
The handling of the data stream inspired by natural phenomena such as sleep and forgetting may be complemented by purely technical means. Just as an aircraft may be designed to maximize payload, range, speed, and profit with little or no resemblance to the natural prototypes (e.g., birds), there may be a software routine which is applied whenever the amount of memory used in the Word Bank reaches a certain limit; the procedure compresses the currently oldest content segment (containing, for example, all proplets with prn values between 30 000 and 32 000) and moves it into secondary storage without any deletion (and thus without any need to choose). 22
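Such a routine can be sketched as follows, again as an illustration under stated assumptions: the dict-based Word Bank, the `secondary` store keyed by prn interval, and the JSON-plus-zlib encoding are all choices made for the example, not part of the original proposal.

```python
import json
import zlib

# Sketch of the purely technical alternative: once primary storage reaches a
# limit, compress the oldest content segment (a prn interval) and move it to
# secondary storage. Nothing is deleted, so nothing must be chosen.

def archive_segment(word_bank, secondary, prn_lo, prn_hi):
    """Move all proplets with prn_lo <= prn < prn_hi out of primary storage,
    compressed, into the `secondary` store (an assumed dict)."""
    segment = []
    for core, token_line in word_bank.items():
        segment += [p for p in token_line if prn_lo <= p["prn"] < prn_hi]
        word_bank[core] = [p for p in token_line
                           if not (prn_lo <= p["prn"] < prn_hi)]
    secondary[(prn_lo, prn_hi)] = zlib.compress(json.dumps(segment).encode())

def activate_archived(secondary, key):
    """Initial activation of archived data: slightly slower, since the
    segment must be decompressed first."""
    return json.loads(zlib.decompress(secondary[key]))

wb = {"speak": [{"prn": 30500}, {"prn": 40000}]}
sec = {}
archive_segment(wb, sec, 30000, 32000)
# wb["speak"] keeps only the proplet with prn 40000; the prn 30500 proplet
# is recoverable via activate_archived(sec, (30000, 32000))
```

Retrieval works as before, only with an extra decompression step, which is exactly the slight slowdown in initial activation discussed next.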
The only downside to data in secondary storage would be a slight slowdown
in initial activation. Depending on the application, either an artificial agent's
overall memory space may be sized from the beginning to accommodate the
amount of data expected for an average lifetime (for example, in a robot on a
space mission), or the necessary hardware may be expanded incrementally as
needed.
21 For example, we tend to forget minor errors like “I didn't put the garlic on the lamb!” However, the
intensity of a memory in its historical context must also be taken into account in order to handle
examples like Proust's madeleine episode.