genetically defined information and learning. The results of this research that could be adapted and embedded into the artificial neural network (ANN) accrued over time; they strengthened its performance and also brought some of its limitations to light (Kolmogorov 1957; Werbos 1974; Wang 1992). Before describing artificial neurons and their networks, it is necessary to consider the biological origin of the artificial neuron.
Three prominent parts are clearly visible in scientific studies of the structure of the human brain: the big brain or cerebrum, the middle brain or cerebellum, and the hindbrain or medulla oblongata. The medulla synapses with the nerves that
extend into the spinal column and divide and subdivide as they spread out into the
various parts of the body. The nerve endings are fibrous and end in the sense organs.
The nerves that carry impulses from the sense organs to the brain are called sensory nerves, while those that carry signals from the brain to the muscles are called motor nerves. The nerve endings receive impulses and pass them through the network of nerves to the brain for interpretation. Studies revealed that different
parts of the brain are responsible for different faculties. The brain is made up of
billions of cells; a rough estimate puts the cells at a hundred billion. Ten billion
of these hundred billion cells are 'Neurons' that are responsible for the different
functions of the brain. A typical neuron has the central cell or the cell body, the axon,
the dendrons that give rise to dendrites. The fibrous dendrites synapse with the cell
body and axon of neighboring neurons. Typically, each neuron connects to about 10,000 other neurons in its vicinity. The conduction of electrical impulses through the structure just described is the mechanism that results in the interpretation of the signal. The particular fashion in which a neuron responds to the input signals impinging on it is the distinguishing feature of the neuron cell.
The developments in computational neuroscience may conceptually be summarized as a biologically inspired technique with great abilities of learning, association, and generalization. How neurons operate in the cell body and how they respond, or 'fire', to an input impulse has been studied in much biophysical research but is still far from being clearly understood. A typical signal to the neuron cell is of the order of 40 mV, and the response of the neuron depends on the threshold voltage it operates with; that is, the neuron fires if the input voltage is greater than its threshold and does not fire otherwise. Various investigations revealed that neurons respond according to their characteristic function, also called the 'activation function', a general term describing how a neuron responds to a stimulus. The threshold function is clearly a particular kind of activation function.
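As a purely illustrative sketch (the numerical values and the smoother alternative shown here are assumptions, not details from the text), a threshold activation can be written as a function that outputs 1 only when the input exceeds the neuron's threshold, with a sigmoid given as one example of a more general activation function:

import math

def threshold_activation(v_mv, threshold_mv=40.0):
    # Fire (output 1.0) only if the input voltage exceeds the threshold.
    return 1.0 if v_mv > threshold_mv else 0.0

def sigmoid_activation(v_mv, threshold_mv=40.0, slope=0.25):
    # A smoother activation: the response rises gradually around the threshold.
    return 1.0 / (1.0 + math.exp(-slope * (v_mv - threshold_mv)))

for v in (20.0, 39.0, 41.0, 60.0):
    print(v, threshold_activation(v), round(sigmoid_activation(v), 3))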
The first design of the ANN came from the study of neuronal arrangement in the brain. The chronological listing referenced from Clarke and Dewhurst (1972), Rose and Bynum (1982), Finger (1994, 2000a), Gross (1998), and Marshall and Magoun (1998) brings out these facts with dates. The choice of aggregation function and activation function, and the mechanism by which the update of voltage levels happens inside the brain, were borrowed in designing the learning rules for the ANN.
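A minimal sketch of how such borrowed ingredients fit together is given below. The weighted-sum aggregation, step activation, and perceptron-style weight update (learning the logical AND as a toy task) are standard textbook choices used only for illustration; they are assumptions, not a specific rule described in the text.

import numpy as np

def aggregate(weights, inputs, bias):
    # Aggregation: weighted sum of the incoming signals plus a bias.
    return np.dot(weights, inputs) + bias

def activate(net):
    # Activation: a simple threshold (step) function.
    return 1.0 if net > 0.0 else 0.0

def perceptron_update(weights, bias, inputs, target, lr=0.1):
    # Learning rule: nudge the weights in proportion to the output error.
    error = target - activate(aggregate(weights, inputs, bias))
    return weights + lr * error * inputs, bias + lr * error

# Toy usage: learn the logical AND of two binary inputs.
w, b = np.zeros(2), 0.0
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
for _ in range(10):
    for x, t in data:
        w, b = perceptron_update(w, b, np.asarray(x, dtype=float), t)
print(w, b, [activate(aggregate(w, np.asarray(x, dtype=float), b)) for x, _ in data])

Here the aggregation stands in for the summing of incoming impulses, the step activation for the firing threshold, and the weight update for the adjustment of connection strengths.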
The new facts about the brain throw more light on its functioning and certainly add to our understanding of its intricate mechanisms. Moreover, the new insights available in functioning formally