7.5 Testable Predictions of the Model
We expect IDyOT's behaviour to produce simulations of several aspects of human
cognition. Methodologically, this is very useful, since it is possible to give an ordered
list of goals, starting with perceptual mechanisms and ending with demonstrable
spontaneous creativity, that can be tested more or less in sequence. Furthermore,
many of the items on this empirical shopping list can be tested against established
data using established paradigms; it is only later on that one encounters the harder-
to-evaluate creative behaviours. Thus, we can develop confidence in the model as a
whole before we attempt the more challenging evaluations. It is worth mentioning
that the effects introduced below are not in themselves novel solutions to problems:
the novelty is in the one central mechanism that gives rise to all these (and more)
phenomena at once.
7.5.1 Segmentation of (at Least) Music and Language
As already discussed, it is well-established in the literature that sequences of musical
[ 36 ] and linguistic [ 6 , 42 , 54 ] symbols can be segmented in ways that correspond with
perceptual chunking [ 18 ] using methods based on information content or entropy,
and it is easy to suggest evolutionary reasons why this should be so: compression on
the basis of information-theoretic structure is likely to yield efficient representations.
The extant methods, however, do not attempt to fit this idea into a larger cognitive
architecture, and use descriptive rules to detect the boundaries given the statistics of
the data; we propose that our approach, in which sequences are buffered until their
consequent predictions have high entropy in comparison with the predictions of other
generators, not only yields this effect as an emergent property, but also accounts for
handling of boundary ambiguity in language. Thus, IDyOT should, without extra
mechanism, be able to model the “garden path” effect, where an utterance such as
The horse raced past the barn fell.
is initially parsed as complete after the word “barn” has been encountered.
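The boundary-detection idea described above, in which symbols are buffered until the model's next-symbol predictions become high-entropy, can be illustrated with a minimal sketch. The bigram model, the fixed entropy threshold, and all function names below are assumptions made for the illustration; they are not IDyOT's actual mechanism, which predicts at multiple levels of abstraction rather than from surface bigrams alone.

```python
import math
from collections import defaultdict

def train_bigrams(sequence):
    """Count successor frequencies to form a maximum-likelihood
    next-symbol model (a stand-in for a richer statistical generator)."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    return counts

def next_entropy(counts, symbol):
    """Shannon entropy (bits) of the predicted next-symbol distribution."""
    successors = counts.get(symbol)
    if not successors:
        return float("inf")  # unseen context: maximally uncertain
    total = sum(successors.values())
    return -sum((c / total) * math.log2(c / total)
                for c in successors.values())

def segment(sequence, counts, threshold):
    """Buffer symbols, placing a chunk boundary wherever the model's
    predictions about what comes next have high entropy."""
    chunks, current = [], []
    for sym in sequence:
        current.append(sym)
        if next_entropy(counts, sym) > threshold:
            chunks.append(current)  # prediction is uncertain: close the chunk
            current = []
    if current:
        chunks.append(current)
    return chunks
```

On a toy stream built from repeated "words" such as `abc`, `xyz` and `pqr`, within-word transitions are near-deterministic (low entropy) while transitions at word ends are unpredictable (high entropy), so the boundaries emerge at the word edges, mirroring the perceptual-chunking result cited above.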
IDyOT is able to predict simultaneously and coherently at multiple levels of
abstraction. This is somewhat like processing multiple parses of a sentence in par-
allel, but with a probabilistic optimisation that omits relatively unlikely parses (thus
avoiding combinatorial explosion), while also parsing at abstract levels.
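The parallel maintenance of probable parses with pruning of unlikely ones resembles a beam search, sketched below under stated assumptions: the `beam_parse` function, the toy `LEXICON`, and its tag probabilities are hypothetical illustrations, not part of the published model.

```python
def beam_parse(words, extend, beam_width=3):
    """Extend every surviving partial parse with each incoming word, then
    prune to the beam_width most probable. Pruning is what prevents the
    combinatorial explosion of fully parallel parsing."""
    beam = [((), 1.0)]  # one empty parse with probability 1
    for word in words:
        candidates = [
            (new_parse, prob * p)
            for parse, prob in beam
            for new_parse, p in extend(parse, word)
        ]
        candidates.sort(key=lambda c: c[1], reverse=True)
        beam = candidates[:beam_width]  # discard relatively unlikely parses
    return beam

# Hypothetical toy lexicon: each word maps to candidate tags with
# probabilities, e.g. "raced" as a main verb versus a reduced-relative
# participle, the ambiguity behind the garden-path effect.
LEXICON = {
    "horse": {"N": 0.9, "V": 0.1},
    "raced": {"V": 0.7, "Part": 0.3},
}

def tag_extend(parse, word):
    """Extend a partial parse with every tag the lexicon allows for word."""
    return [(parse + (tag,), p)
            for tag, p in LEXICON.get(word, {"X": 1.0}).items()]
```

In this sketch the misleading main-verb reading of "raced" dominates the beam, matching the account in the text: the less likely reduced-relative reading survives in parallel but only takes precedence once the sentence's actual ending makes the dominant reading's predictions high-entropy.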
Thus, both readings of the garden path sentence, above, are parsed in parallel, but the
correct one is relatively unlikely, and the misleading one takes precedence as a con-
scious experience because the (incorrectly predicted) end of the sentence after “barn”
yields a high-entropy prediction. When the alternative reading enters consciousness
as the result of the alternative sentence ending doing the same, a revision is forced
between the previous and the new reading, and their respective interpretations, and
it is this sudden and unexpected change, in the IDyOT view of cognition, that leads