4.2.1 Higher-Order Aggregation Function
The human nervous system consists of neuron cells with highly complex structure, physiological properties, and operation. Any system-level investigation of the nervous system demands an explanation of the information processing in the constituent neurons. This in turn requires a description of how the input signals to a neuron interact and jointly affect its processing. Neuron computation involves complex processing of a large number of signals before producing the output signals. Numerous studies [3, 5] have found that the computational power of a neuron cell lies in the way synaptic signals are aggregated in the cell body. Giving a mathematical representation to this aggregation process continues to be a fascinating field of work for researchers in the neurocomputing community. A neuron is the basic processing unit of an ANN and is characterized by a well-defined aggregation function of its weighted inputs.
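As a point of reference for the discussion that follows, the conventional aggregation of weighted inputs can be sketched in a few lines of Python. This is an illustrative sketch only; the function name, the bias term, and the choice of a sigmoid activation are assumptions for the example, not details from the text.

```python
import math

def summation_neuron(inputs, weights, bias):
    """Conventional artificial neuron: the weighted inputs are
    aggregated by summation, then passed through a sigmoid
    activation.  Names and activation choice are illustrative."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))

# With net input 2.0*0.5 + 1.0*(-1.0) + 0.0 = 0, the sigmoid gives 0.5.
y = summation_neuron([0.5, -1.0], [2.0, 1.0], 0.0)
```

The entire expressive power of such a unit rests on this single linear aggregation step, which motivates the nonlinear alternatives examined in this chapter.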
An artificial neuron is a simplified model of a biological neuron that can approximate its functional capabilities. For the time being, however, it is far from clear how much of this simplification is justified, since we still have only a poor understanding of neuronal function in biological networks. An artificial neuron in a neural network is usually considered a computational model of a biological neuron, i.e. a real nerve cell, and the connection weights between nodes resemble the synapses between neurons. The computational power of a neuron is a reflection of the spatial aggregation of input signals in the cell body. This chapter focuses on the design and assessment of nonlinear aggregation functions for artificial neurons.
The most generic mechanism of neuron computation is the summation aggregation function. C. Koch and T. Poggio (1992) explain the relevance of multiplication as a computationally powerful and biologically realistic operation [1]. In neurophysiology, the possibility that dendritic computations include local multiplicative nonlinearities is widely accepted. Mel and Koch [6] argued that sigma-pi units underlie the learning of nonlinear associative maps in cerebral cortex. This leads us to develop a very flexible aggregation function that is biologically plausible and more powerful than the conventional one. This chapter presents three new higher-order aggregation functions and explores ways of generating comprehensive neural units from them. Figure 4.1 portrays a flexible artificial neuron with aggregation function ʩ.
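A sigma-pi unit of the kind Mel and Koch discuss aggregates a weighted sum of products of inputs rather than a plain weighted sum. The following Python sketch truncates the products at second order for brevity; the function name, the truncation, and the sigmoid activation are illustrative assumptions, not the specific higher-order functions this chapter develops.

```python
import math
from itertools import combinations

def sigma_pi_neuron(inputs, weights, bias):
    """Sigma-pi (higher-order) aggregation: a weighted sum over the
    first-order inputs and all pairwise (second-order) products.
    The second-order truncation here is an illustrative choice."""
    # Build the term list: x_i terms followed by x_i * x_j terms.
    terms = list(inputs) + [a * b for a, b in combinations(inputs, 2)]
    net = sum(w * t for w, t in zip(weights, terms)) + bias
    return 1.0 / (1.0 + math.exp(-net))

# For inputs [1, 2] the terms are [1, 2, 1*2]; with unit weights and
# zero bias the net input is 5.
y = sigma_pi_neuron([1.0, 2.0], [1.0, 1.0, 1.0], 0.0)
```

The multiplicative terms let a single unit represent input interactions that a summation neuron can only capture through additional hidden layers.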
4.2.2 Why Higher-Order Neurons
A brief survey of the wide applications of neural networks points to the fact that, when the desired mapping is complicated and the input dimension is high, it is hard to foresee how long the learning process of a neural network of conventional neurons will take and whether the learning will converge to a satisfactory result. The