(Amari and Maginu, 1988), resonating neural networks (Grossberg, 1988),
feedforward networks (Werbos, 1974), associative memory networks (Kohonen,
1989), counterpropagation networks (Hecht-Nielsen, 1987a), recurrent networks
(Elman, 1990), radial basis function networks (Broomhead and Lowe, 1988),
probabilistic networks (Specht, 1988), etc. Nevertheless, the most
comprehensively studied and, in engineering practice, most frequently used neural
networks to date are the multilayer perceptron networks (MLPN) and radial basis
function networks (RBFN), which remain the subject of further research and
applications.
Since the very beginning of their practical application, neural networks have
proven to be a powerful tool for signal analysis, feature extraction, data
classification, pattern recognition, etc. Owing to their capability of learning and
generalization from observation data, the networks have been widely accepted by
engineers and researchers as a tool for processing experimental data. This is
mainly because neural networks enormously reduce the computational effort
needed for problem solving and, owing to their massive parallelism, considerably
accelerate the computational process. This was reason enough for intelligent
network technology soon to leave the research laboratories and migrate to
industry, business, financial engineering, etc. For instance, the neural-network-
based approaches and methodologies developed have efficiently solved the
fundamental problems of time series analysis, forecasting, and prediction using
collected observation data, as well as the problems of on-line modelling and
control of dynamic systems using sensor data.
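As a minimal sketch (not taken from the book) of what "prediction using collected observation data" means in practice, the snippet below turns a scalar time series into (past-window, next-value) training pairs, which is the input/output mapping a forecasting network is trained on. The series, the window length `p`, and all variable names are illustrative assumptions.

```python
# Illustrative only: building (past-window, next-value) training pairs
# from a scalar time series, as used for neural forecasting models.
import numpy as np

series = np.sin(np.linspace(0, 8 * np.pi, 200))   # example observation data
p = 5                                             # number of past values per input

# each row of X holds p consecutive past values; y is the value that follows
X = np.array([series[t - p:t] for t in range(p, len(series))])
y = series[p:]

print(X.shape, y.shape)   # → (195, 5) (195,)
```

A network fitted to these pairs then maps any window of the last `p` observations to a forecast of the next value.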
Generally speaking, the practical value of neural networks has been recognized
mainly because of such distinctive features as
• general nonlinear mapping between a subset of past time series values
and future time series values
• the capability of capturing essential functional relationships in the
data, which is valuable when such relationships are not known a priori or
are very difficult to describe mathematically, and/or when the collected
observation data are corrupted by noise
• universal function approximation capability, which enables the modelling of
arbitrary nonlinear continuous functions to any degree of accuracy
• the capability of learning and generalization from examples using a data-
driven, self-adaptive approach.
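The universal-approximation and learning-from-examples properties above can be sketched in a few lines: a one-hidden-layer perceptron with tanh units, trained by plain gradient descent on a mean-squared-error loss, fitted to the nonlinear function sin(x). The network size, learning rate, and iteration count are illustrative assumptions, not values from the book.

```python
# A minimal sketch of universal function approximation: a one-hidden-layer
# MLP fitted to sin(x) by full-batch gradient descent. Hyperparameters are
# illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# training examples: the nonlinear target function sampled on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(X)

H = 16                                  # hidden units (assumed)
W1 = rng.normal(0, 0.5, (1, H))         # input -> hidden weights
b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1))         # hidden -> output weights
b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)            # hidden-layer activations
    out = h @ W2 + b2                   # linear output layer
    err = out - y
    # backward pass: gradients of the mean squared error
    g_out = 2 * err / len(X)
    g_W2 = h.T @ g_out
    g_b2 = g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1 - h**2)   # tanh'(u) = 1 - tanh(u)^2
    g_W1 = X.T @ g_h
    g_b1 = g_h.sum(axis=0)
    # gradient-descent update
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

With enough hidden units, the same construction approximates any continuous function on a compact interval to arbitrary accuracy, which is the content of the universal approximation property.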
3.2 Basic Network Architectures
The model of the basic element of a neural network, i.e. the neuron, as still used
today, was originally worked out by Widrow and Hoff (1960). They considered the
perceptron as an adaptive element bearing a resemblance to the neuron (Figure
3.1). A neuron, as the fundamental building block of a neural information
processing system, is made up of (see Figure 3.1)
• a cell body with an inherent nucleus