layer and CNN1, CNN2, respectively. The learning rule is given as

$$\Delta w_{ij}^{ca3} = \frac{1}{n}\, x_i^{out\,cnn1}(t)\, x_j^{out\,cnn2}(t) . \tag{9}$$
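As a minimal sketch of Eq. (9) (the array names and NumPy usage are ours, not from the paper), the update is a plain Hebbian outer product of the two CNN output vectors, scaled by $1/n$:

```python
import numpy as np

def hebbian_ca3_update(x_out_cnn1, x_out_cnn2):
    """Eq. (9): delta_w[i, j] = (1/n) * x_out_cnn1[i] * x_out_cnn2[j],
    where n is the number of neurons in a CNN layer."""
    n = x_out_cnn1.size
    return np.outer(x_out_cnn1, x_out_cnn2) / n

# Usage: accumulate the CA3 weights while the patterns are presented.
# w_ca3 += hebbian_ca3_update(x_out_cnn1_t, x_out_cnn2_t)
```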
The mutual associative memory process is realized as the output layer of the MCNN receives the outputs of CNN1 and CNN2, i.e., Eq. (7) and Eq. (8), alternately. In fact, when two time series patterns are processed by the two CNN layers of the MCNN, one CNN layer plays the role of a static external stimulus while the other dynamically executes chaotic storage or recollection processing. In other words, the CNN layers fire alternately in CA3. To switch the roles of the CNNs we used a simple threshold in [7] [8] [9]; however, the emotional control given in Subsections 2.2 and 2.3 can raise the performance of the MCNN [10].
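A hypothetical sketch of this alternation follows (the step_fn signature, the mean-activity test, and the threshold value are our placeholders; the actual switching rule of [7] [8] [9] is not reproduced here):

```python
def alternate_ca3_step(step_fn, x1, x2, active_is_cnn1, threshold=0.5):
    """One alternating step of the two CA3 CNN layers.
    step_fn(x_active, x_static) advances the active layer's output while
    the other layer's output is clamped as a static external stimulus."""
    if active_is_cnn1:
        x1 = step_fn(x1, x2)          # CNN1 recollects; CNN2 acts as stimulus
        fired = x1.mean() > threshold
    else:
        x2 = step_fn(x2, x1)          # CNN2 recollects; CNN1 acts as stimulus
        fired = x2.mean() > threshold
    if fired:                         # placeholder for the simple threshold
        active_is_cnn1 = not active_is_cnn1
    return x1, x2, active_is_cnn1
```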
CA1 in the hippocampus decodes the output pattern of the MCNN, which is expressed by n neurons, into the patterns stored by the associative cortex CX2, which has N neurons (Eq. (10)). The Hebbian learning rule is described by Eq. (11), where the input from CX1 is a teacher signal ($w_{ij}^{ca1\,cx1} = 1$).
$$x_i^{ca1}(t) = f\left( \sum_{j=0}^{n} w_{ij}^{ca1\,ca3}\, x_j^{ca3\,out}(t-1) + w_i^{ca1\,cx1}\, x_i^{cx1}(t) - \theta^{ca1} \right). \tag{10}$$
$$\Delta w_{ij}^{ca1\,ca3} = \beta_{hc}\, x_i^{ca1}(t)\, x_j^{ca3\,out}(t) . \tag{11}$$
where $\beta_{hc}$ is a learning rate parameter and $i = 1, 2, \ldots, N$.
CNN1 and CNN2 are Adachi & Aihara's CNNs proposed in [1] [2], combined with each other in our MCNN model [7]. Their dynamics are described by Eq. (12) - Eq. (15).
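The decoding and learning steps of Eq. (10) and Eq. (11) can be sketched as follows (a non-authoritative sketch: the binary step activation $f$, the variable names, and the array shapes are our assumptions):

```python
import numpy as np

def ca1_decode(w_ca1_ca3, x_ca3_out_prev, x_cx1, theta):
    """Eq. (10): CA1 output from the CA3 recurrent input at t-1 plus the
    CX1 teacher input at t (w^{ca1 cx1} = 1), thresholded by theta."""
    u = w_ca1_ca3 @ x_ca3_out_prev + x_cx1 - theta
    return (u > 0).astype(float)      # assumed form of f

def ca1_hebbian_update(beta_hc, x_ca1, x_ca3_out):
    """Eq. (11): delta_w[i, j] = beta_hc * x_ca1[i] * x_ca3_out[j]."""
    return beta_hc * np.outer(x_ca1, x_ca3_out)
```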
$$x_i(t+1) = g\bigl( y_i(t+1) + z_i(t+1) + \gamma\, v_i(t+1) \bigr). \tag{12}$$

$$y_i(t+1) = k_r\, y_i(t) - \alpha\, x_i(t) + a_i . \tag{13}$$

$$z_i(t+1) = k_f\, z_i(t) + \sum_{j=1}^{n} w_{ij}\, x_j(t) . \tag{14}$$

$$v_i(t+1) = k_e\, v_i(t) + \sum_{j=1}^{n} W^{*}_{ij}\, x_j(t) . \tag{15}$$
where $x_i(t)$ is the output value of the $i$-th neuron at time $t$, $n$ is the number of neurons of the input layer, $w_{ij}$ is the weight of the connection from the $j$-th neuron to the $i$-th neuron, $y_i(t)$ denotes the internal state of the $i$-th neuron, $\alpha$ is a learning rate of the $i$-th neuron, $k_f$, $k_r$, $k_e$ are damping rates, $a_i$ is a parameter given as the sum of the threshold and the external input, $\gamma$ is a rate of effectiveness from another layer, and $W^{*}_{ij}$ denotes the weight of the connection from the $j$-th neuron of the other CNN layer to the $i$-th neuron.
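Under these definitions, one synchronous update of a CNN layer can be sketched as below (the sigmoid output function g and the parameter dictionary are our assumptions; x_other stands for the output of the other CNN layer feeding $v_i$ through $W^{*}$):

```python
import numpy as np

def cnn_step(x, y, z, v, w, w_star, x_other, p):
    """One update of Eqs. (12)-(15) for every neuron of a CNN layer."""
    y_new = p["k_r"] * y - p["alpha"] * x + p["a"]   # Eq. (13): refractoriness
    z_new = p["k_f"] * z + w @ x                     # Eq. (14): internal feedback
    v_new = p["k_e"] * v + w_star @ x_other          # Eq. (15): cross-layer input
    g = lambda u: 1.0 / (1.0 + np.exp(-u))           # assumed output function g
    x_new = g(y_new + z_new + p["gamma"] * v_new)    # Eq. (12)
    return x_new, y_new, z_new, v_new
```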