reproduction of the entire ensemble from its part. This type of organization of the
associative-projective neural network makes it possible to form a "part-whole"
hierarchy.
The formation of neural ensembles is ensured by changing the synaptic
weights between the neurons of one associative field. Since binary synaptic
weights are used, the degree of formation of an ensemble is characterized by the
probability of unit synaptic weights between the neurons belonging to the ensemble:
the better formed the ensemble, the higher the probability of a unit connection
between its neurons. Neural ensembles are formed in the associative field during
training, and different training algorithms can be used. Hebb's training method
works very well, as do training with the delta rule (the Widrow method), the
Kohonen method (a self-organizing process), Grossberg's training law, and so on.
In our case, we use Hebb's modified rule (Section 5.1.1).
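The binary-weight formation described above can be sketched in code. This is a minimal illustration, not the book's actual algorithm; the function name `train_hebb_binary`, the network size, and the ensemble size are assumptions chosen for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # neurons in the associative field (illustrative size)
W = np.zeros((N, N), dtype=np.uint8)  # binary synaptic weight matrix

def train_hebb_binary(W, ensemble):
    """Modified-Hebb-style sketch: set the binary weight between every
    pair of co-active neurons to 1, keeping zero self-connections."""
    idx = np.asarray(ensemble)
    W[np.ix_(idx, idx)] = 1
    np.fill_diagonal(W, 0)
    return W

# Form one ensemble of 20 co-active neurons.
ensemble = rng.choice(N, size=20, replace=False)
train_hebb_binary(W, ensemble)

# Degree of formation: probability of unit weights between
# distinct neurons belonging to the ensemble.
inner = W[np.ix_(ensemble, ensemble)]
p_unit = inner.sum() / (len(ensemble) * (len(ensemble) - 1))
print(p_unit)  # 1.0 after a single full presentation
```

With repeated presentations of partial (noisy) ensembles, `p_unit` would grow gradually toward 1, which is what "degree of formation" measures.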
5.2.4 Methods of Economical Presentation of the Matrix
of Synaptic Weights (Modular Structure)
Many tasks of artificial intelligence pose the problem of constructing
associative neural networks of large dimensionality (containing a large quantity
of neurons). Using fully connected neural networks leads to a quadratic growth of
the required memory with the quantity of neurons. For sufficiently large
networks (up to 10^6 neurons), the required memory becomes extremely
large (up to 10^12 bits). To construct networks of this size, it is necessary to use
structures that are not fully connected, so that the required memory grows only
linearly with the quantity of neurons. We will consider two methods of
constructing not fully connected neural networks: stochastic and modular. The
modular method makes it possible to create a large network with a constant ratio of
the quantity of synapses to the quantity of neurons. To evaluate these methods,
we will use the following basic criteria: the convenience of hardware realization
and the speed of the corresponding neurocomputer; the memory needed for the
neural network; and the possibility of restoring a neural ensemble from a
sufficiently small part of it.
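The memory estimates above can be checked with simple arithmetic. The sketch below counts only weight bits (one bit per binary synapse); the per-neuron synapse count k = 128 is an illustrative assumption, not a value from the text:

```python
def memory_bits_full(n):
    """Fully connected binary network: one bit per ordered neuron pair,
    so memory grows quadratically with the quantity of neurons n."""
    return n * n

def memory_bits_sparse(n, k):
    """Not fully connected network with a fixed quantity k of binary
    synapses per neuron: memory grows linearly with n."""
    return n * k

n = 10**6
print(memory_bits_full(n))         # 10**12 bits, as in the text
print(memory_bits_sparse(n, 128))  # 1.28 * 10**8 bits for k = 128
```

The constant synapse-to-neuron ratio k is exactly what the modular method preserves as the network grows.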
5.2.4.1 Stochastic Not Fully Connected Networks
Before considering the stochastic method, let us recall that APNNs are characterized
by the following features:
1. Any information elements (features, objects, relations, scenes, and so on) in such
networks are represented by neural ensembles.
2. The quantity of neurons entering an ensemble is much less than the total
quantity of neurons in the network.
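The two features above amount to sparse ensemble coding, which can be sketched as follows; the network size N and ensemble size m are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

N = 100_000  # total neurons in the network (illustrative size)
m = 300      # neurons per ensemble; m is much less than N

# Feature 1: an information element is represented by a neural ensemble,
# encoded here as a binary vector with m active neurons out of N.
code = np.zeros(N, dtype=np.uint8)
ensemble = rng.choice(N, size=m, replace=False)
code[ensemble] = 1

# Feature 2: the activity level m/N is very low (sparse coding).
print(code.sum())      # 300 active neurons
print(code.sum() / N)  # activity level 0.003
```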