are required to understand the algorithm, so it should be
accessible to a wide audience.
As appropriate for our focus on cognition (we consider perception to be a form of cognition), we emphasize processing that takes place in the human or mammalian neocortex, which is typically referred to simply as the cortex. This large, thin, wrinkled sheet of neurons comprising the outermost part of the brain plays a disproportionately important role in cognition. It also has the interesting property of being relatively homogeneous from area to area, with the same basic types of neurons present in the same basic types of connectivity patterns. This homogeneity is principally what allows us to use a single type of algorithm to explain such a wide range of cognitive phenomena.
Interactive, graphical computer simulations are used throughout to illustrate the relevant principles and how they interact to produce important features of human cognition. Detailed, step-by-step instructions for exploring these simulations are provided, together with a set of exercises for the student that can be used for evaluation purposes (an answer key is available from the publisher). Even if you are not required to provide written answers to these questions, it is a good idea to look them over and consider what your answers might be, because they raise important issues. The reader is also strongly encouraged to go beyond the step-by-step instructions to explore further aspects of the model's behavior.
In terms of the detailed organization, part I covers Basic Neural Computational Mechanisms across five chapters (Individual Neurons, Networks of Neurons, and three chapters on Learning Mechanisms), and part II covers Large-Scale Brain Area Organization and Cognitive Phenomena across five chapters (Perception and Attention, Memory, Language, and Higher-Level Cognition, with an introductory chapter on Large-Scale Brain Area Functional Organization). Each chapter begins with a detailed table of contents and an introductory overview of its contents, to let the reader know the scope of the material covered. When key words are defined or first used extensively, they are highlighted in bold font for easy searching, and can always be found in the index. Simulation terms are set in a distinct font.
Procedural steps to be taken in the explorations are formatted distinctively, so that it is easy to see exactly what you have to do, and so that readers who are not running the simulations can skip over them.
Each chapter (excluding this one) ends with a summary that encapsulates and interrelates the contents of what was just read, followed by a list of references for further reading. We hope you enjoy your explorations!
1.8 Further Reading
The original PDP (parallel distributed processing) volumes, though somewhat dated, remain remarkably relevant: Rumelhart, McClelland, and PDP Research Group (1986c), and McClelland, Rumelhart, and PDP Research Group (1986).
An excellent collection of the important early papers in neural networks can be found in Anderson and Rosenfeld (1988).
For other views on the basic premises of cognitive
neuroscience and levels of analysis, we suggest: Marr
(1982), chapter 1; Sejnowski and Churchland (1989);
Shallice (1988), chapter 2; Posner, Inhoff, Friedrich,
and Cohen (1987); Farah (1994); Kosslyn (1994).
For a developmentally focused treatment of computational neural network modeling, see: Elman et al. (1996) and Plunkett and Elman (1997).
For other treatments of computational modeling using artificial neural networks, see: Hertz, Krogh, and Palmer (1991), Ballard (1997), Anderson (1995), McLeod, Plunkett, and Rolls (1998), and Bishop (1995).
For an encyclopedic collection of computational neural network models and more general brain-level theories, see Arbib (1995).