neurons in the network as compared to conventional neural networks. Interested readers may find polynomial feedforward neural networks appealing because of the trustworthy theoretical results on their universal approximation capabilities, following the Weierstrass theorem [10], and on their generalization power, as measured by the Vapnik-Chervonenkis (VC) dimension [14]. However, they suffer from the typical curse of dimensionality: the number of polynomial terms grows combinatorially as the number of inputs increases, demanding sparseness in representation.
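To make the combinatorial explosion concrete: a full polynomial expansion of total degree at most d over n inputs contains C(n + d, d) monomial terms (counting the constant term). The following sketch, with a function name of our own choosing, tabulates this count for a cubic expansion:

```python
from math import comb

def num_poly_terms(n_inputs: int, degree: int) -> int:
    """Number of monomials of total degree <= degree in n_inputs variables,
    i.e. the binomial coefficient C(n_inputs + degree, degree)."""
    return comb(n_inputs + degree, degree)

# Term counts for a degree-3 polynomial neuron as inputs grow:
for n in (5, 10, 50, 100):
    print(n, num_poly_terms(n, 3))
# 5 -> 56, 10 -> 286, 50 -> 23426, 100 -> 176851
```

The rapid growth of the weight count is what motivates the sparse representations mentioned above.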
4.2.3 A Critical Review of Higher-Order Neuron Models
A network of artificial neurons is an information processing system. It provides a practical method for learning and generalizing functions that are often difficult for a human observer to predict or understand. Conventional neurons, based on the summation aggregation function, have been used extensively to solve such problems. However, networks built from these neurons require a large number of units, which increases the complexity of the topology, the memory requirements, and the training time, slows convergence, and affects the stability of the weight-update system. These issues become more significant when the problems considered are high-dimensional. As stated earlier, an influential framework for dealing with ANNs may be characterized at three levels: the computational model of the neurons, the learning algorithms, and the domain of implementation. The architecture of a neuron decides its computational power, and the number of neurons in a network in turn decides the complexity of the neural network.
Conventional neurons 1 suffer from these issues because they capture only a linear correlation among input components. Numerous studies have observed that neurons encompassing higher-order correlations among input components outperform conventional neurons. Multiplication, being the most basic of all nonlinearities, has become a natural choice for models that seek to include higher-order correlations in the aggregation function. Product units [7] thus appeared as a computationally powerful and biologically plausible extension of conventional neurons. Over the years, a substantial body of investigation has grown, yielding a wide range of directions in the structure of higher-order neurons and, hence, a wide range of aggregation functions comprising nonlinear correlations among input signals. The fundamental class of higher-order neural units employs polynomial weighted aggregation of the neural inputs. A growing variety of neurobiological evidence [1, 2] also supports the presence of nonlinear integration of synaptic inputs in neuron cells. In a broad sense, higher-order neural networks represent the same style of computation in artificial neural networks: neurons involve polynomials, or the neurons are polynomials themselves, or the synaptic connections between neurons involve higher-order terms and hence higher-order polynomials. Working in this direction, an exten-
1 In this text, a neuron with only a summation aggregation function is referred to as a 'conventional' neuron, and a network of such neurons as an 'MLP'.
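The three aggregation styles contrasted above can be sketched in a few lines. This is an illustrative sketch only: the function names, the use of a tanh activation, and the dictionary encoding of second-order weights are our own choices, not taken from the text.

```python
import math

def conventional_neuron(x, w, b):
    """Conventional (summation) aggregation: net = b + sum_i w_i * x_i."""
    net = b + sum(wi * xi for wi, xi in zip(w, x))
    return math.tanh(net)

def product_unit(x, w):
    """Product-unit aggregation: net = prod_i x_i ** w_i.
    Multiplication lets the unit encode higher-order input correlations."""
    return math.prod(xi ** wi for wi, xi in zip(w, x))

def second_order_neuron(x, w1, w2, b):
    """Polynomial (second-order) weighted aggregation:
    net = b + sum_i w1_i * x_i + sum_{i<=j} w2_{ij} * x_i * x_j,
    with w2 given as a dict keyed by index pairs (i, j), i <= j."""
    n = len(x)
    net = b + sum(w1[i] * x[i] for i in range(n))
    net += sum(w2[(i, j)] * x[i] * x[j]
               for i in range(n) for j in range(i, n))
    return math.tanh(net)
```

For example, `product_unit([2.0, 3.0], [1.0, 2.0])` evaluates 2**1 * 3**2 = 18, a term no purely additive aggregation can form from the raw inputs.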
 