[Figure: three panels (a, b, c) showing activity profiles along x, computed for A1 = 200 with A2 = 65 or 75, B1 = 0.025, 0.1, or 1.2, and B2 = 0.005, 0.025, or 0.4]
Fig. 4.12 Processing of information by the Pozin neural network
$$\frac{ds_i}{dt} = -a\,s_i + (B - s_i)\bigl[I_i + f(s_i)\bigr] - s_i\Bigl[J_i + \sum_{j \neq i} f(s_j)\Bigr]$$
Here the basic notation coincides with that adopted for the Pozin equations; I_i and J_i are external stimuli acting on the excitatory and inhibitory neurons, respectively.
In this equation, the first term on the right-hand side describes the rate of decay of the excitation of the i-th neuron, the second term bounds the excitation from above by the value B, and the third term describes the action of the inhibitory surround on the i-th neuron.
Grossberg showed that neural networks of this type possess a short-term memory.
In addition, depending on the form of the signal function f, such a network sharpens
or broadens a spatial signal acting on it, or enhances its contour (Fig. 4.13).
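The short-term memory and sharpening behavior can be illustrated numerically. The sketch below is only illustrative: the parameters A = 0.1, B = 1, the silenced inputs I_i = J_i = 0, and the faster-than-linear choice f(s) = s² are assumptions for this demonstration, not values taken from the text. A spatial pattern is imprinted as the initial state, and the recurrent dynamics are then integrated after the external input has been switched off; the stored pattern is sharpened toward its strongest component (winner-take-all).

```python
import numpy as np

def f(s):
    # Faster-than-linear signal function; in Grossberg's analysis this
    # choice sharpens the stored pattern, while slower-than-linear
    # choices broaden it.
    return s ** 2

def run_stm(s0, A=0.1, B=1.0, dt=0.01, steps=20000):
    """Forward-Euler integration of the shunting network
        ds_i/dt = -A s_i + (B - s_i) f(s_i) - s_i * sum_{j != i} f(s_j),
    i.e. the equation above with the external stimuli already switched off
    (I_i = J_i = 0), so only the recurrently stored pattern evolves."""
    s = np.asarray(s0, dtype=float)
    for _ in range(steps):
        fs = f(s)
        surround = fs.sum() - fs          # inhibitory surround: sum over j != i
        s = s + dt * (-A * s + (B - s) * fs - s * surround)
    return s

# Spatial input pattern imprinted as the initial short-term-memory trace.
pattern = np.array([0.12, 0.30, 0.60, 0.30, 0.12])
s_final = run_stm(pattern)
print(np.round(s_final, 3))  # ≈ [0, 0, 0.887, 0, 0]: only the peak survives
```

The surviving activity settles at the fixed point of -A s + (1 - s)s² = 0, i.e. s = (1 + √(1 - 4A))/2 ≈ 0.887 for A = 0.1, while all weaker components are quenched by the inhibitory surround.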
Grossberg noted the analogy between the dynamics of the proposed neural
networks and the reaction-diffusion systems used by Gierer and Meinhardt to explain
biological pattern formation.
Later, this analogy was demonstrated for chemical reaction-diffusion systems of the Belousov-Zhabotinsky type. It is easy to see that the analogy has a real physicochemical basis: nonlinear chemical systems are characterized by autocatalytic mechanisms (activation within an elementary microvolume) and by significantly higher diffusion coefficients for the inhibitor than for the activator.
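The decisive role of the fast-diffusing inhibitor can be checked by a linear stability analysis. The sketch below uses Gierer-Meinhardt-type activator-inhibitor kinetics with hypothetical parameters (μ = 1, ν = 2, chosen purely for illustration, not taken from the text): the homogeneous steady state is stable without diffusion and for equal diffusion coefficients, but becomes unstable to spatial pattern formation once the inhibitor diffuses much faster than the activator.

```python
import numpy as np

# Gierer-Meinhardt-type activator-inhibitor kinetics (illustrative parameters):
#   da/dt = a**2 / h - mu * a + D_a * a_xx   (autocatalytic activator)
#   dh/dt = a**2     - nu * h + D_h * h_xx   (inhibitor)
mu, nu = 1.0, 2.0
a0 = nu / mu                 # homogeneous steady state: a0 = 2
h0 = a0 ** 2 / nu            # h0 = 2

# Jacobian of the reaction terms at the steady state.
J = np.array([[2 * a0 / h0 - mu, -(a0 / h0) ** 2],
              [2 * a0,           -nu            ]])

def max_growth_rate(Da, Dh, ks=np.linspace(0.0, 20.0, 2001)):
    """Largest real part of the eigenvalues of J - k^2 * diag(Da, Dh)
    over a range of spatial wavenumbers k (a Turing-instability test)."""
    best = -np.inf
    for k in ks:
        Jk = J - k ** 2 * np.diag([Da, Dh])
        best = max(best, np.linalg.eigvals(Jk).real.max())
    return best

print(max_growth_rate(0.01, 1.0))  # > 0: inhibitor diffuses fast, patterns grow
print(max_growth_rate(1.0, 1.0))   # < 0: equal diffusion, no pattern forms
```

With D_h = 100 D_a some wavenumbers acquire a positive growth rate, so small spatial perturbations are amplified into a pattern; with D_a = D_h every mode decays, which is exactly the diffusion asymmetry invoked in the text.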
Apparently, this neural-network architecture of Belousov-Zhabotinsky media explains the specifics of the elementary image-processing operations performed by the medium (contour enhancement, amplification or quenching of image features), which coincide with the elementary operations of Grossberg networks.