Biomedical Engineering Reference
In-Depth Information
$$W(x(\cdot)) = \sum_{i=1}^{m} p_i \Big[ -a_i x_i^2(0) + x_i(0) \sum_{j=1}^{m} c_{ij} f_j\big(x_j(-\tau_{ij})\big) \Big] + \sum_{i=1}^{m}\sum_{j=1}^{m} r_{ij} \big[ x_j^2(0) - x_j^2(-\tau_{ij}) \big] \qquad (42)$$

Theorem 5. Consider system (40) with a_i > 0, the Lipschitz constants L_i > 0 and c_ij such that it is possible to choose p_i > 0 and r_ij > 0 in order to make the derivative functional (42) negative definite with a quadratic upper bound. Then system (40) has a unique global solution {x_i(t)}_{i=1}^m, which is bounded on the whole real axis and exponentially stable. Moreover, this solution is periodic or almost periodic according to the character of the d_i(t): periodic or almost periodic, respectively.

The approach of the second result (Danciu, 2002) is based on the frequency domain inequality of Popov. Denoting

$$K(s) = \mathrm{diag}\big\{(s + a_i)^{-1}\big\}_{i=1}^{m}\,\big\{c_{ij}\, e^{-s\tau_{ij}}\big\}_{i,j=1}^{m}, \qquad \bar{L} = \mathrm{diag}\{L_i\}_{i=1}^{m}, \qquad \Lambda = \mathrm{diag}\{\lambda_i\}_{i=1}^{m}, \qquad (43)$$

the basic result here is (see the proof in Danciu, 2002):

Theorem 6. Consider system (40) under the following assumptions:
i) a_i > 0, i = 1, ..., m;
ii) the nonlinear functions f_i(σ) are globally Lipschitz, satisfying conditions (3) and (4) with Lipschitz constants L_i;
iii) there exist λ_i ≥ 0, i = 1, ..., m, such that the following frequency domain condition holds:

$$\bar{L}^{-1} + \frac{1}{2}\big[(I + i\omega\Lambda)\, K(i\omega) + K^{*}(i\omega)\,(I - i\omega\Lambda)\big] > 0, \quad \forall \omega \in \mathbb{R} \qquad (44)$$

(the star denoting transpose and complex conjugation);
iv) |d_i(t)| ≤ M, i = 1, ..., m, for all t, i.e., boundedness of the forcing stimuli.

Then there exists a solution of (40) bounded on the whole real axis, which is periodic (respectively almost periodic) if the d_i(t) are periodic (respectively almost periodic). Moreover, this solution is exponentially stable.

Remark that the two theorems give relaxed sufficient conditions on the network's parameters, which do not impose the symmetry restriction on the matrix of weights.

In the sequel one considers that the "learning" process has given some structure to a Hopfield neural network with two neurons and that the network behavior has to be checked a posteriori. If the assumptions of Theorem 6 are satisfied, then the system has the desired properties. The purpose of the simulation is twofold: (a) to show that for periodic and almost periodic stimuli the solution is bounded on the whole real axis (for the time variable t) and is of the same type as the stimuli, which means the synchronization of the response of the network with the time-varying periodic or almost periodic external inputs; (b) to show that the solution is exponentially stable, which means that it attains with exponential speed the same type of behavior as the stimuli.

The equations of the Hopfield neural network are

$$\dot{x}_1(t) = -2 x_1(t) + 2 f_1\big(x_1(t - 0.5)\big) + 3 f_2\big(x_2(t - 0.4)\big) + \sin t + \sin(\sqrt{2}\, t)$$
$$\dot{x}_2(t) = -5 x_2(t) + f_1\big(x_1(t - 0.7)\big) + 3 f_2\big(x_2(t - 0.6)\big) + 2\cos t$$

where the matrix of the weights

$$C = \begin{pmatrix} 2 & 3 \\ 1 & 3 \end{pmatrix}$$

is not symmetric and the values of the delays are

$$T = \begin{pmatrix} 0.5 & 0.4 \\ 0.7 & 0.6 \end{pmatrix} \text{ sec.}$$
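As an illustration, the frequency domain condition (44) of Theorem 6 can be spot-checked numerically for this two-neuron network by sampling the frequency axis. The sketch below takes the simplest multiplier choice λ_i = 0, under which the condition reduces to positive definiteness of L̄⁻¹ plus the Hermitian part of K(iω); it also assumes Lipschitz constants L_i = 1 (as for f_i = tanh). The multiplier choice, the Lipschitz constants, and the frequency grid are assumptions made for this sketch, not part of the original text.

```python
# Numerical spot-check of the Popov frequency-domain condition (44)
# for the two-neuron example; lambda_i = 0 and L_i = 1 are assumptions.
import numpy as np

a = np.array([2.0, 5.0])                   # diagonal rates a_i
C = np.array([[2.0, 3.0], [1.0, 3.0]])     # weights c_ij (not symmetric)
T = np.array([[0.5, 0.4], [0.7, 0.6]])     # delays tau_ij in seconds
L_inv = np.eye(2)                          # diag{1/L_i}, assuming L_i = 1

def K(omega):
    """Transfer matrix K(i*omega) = diag{(i*omega + a_i)^-1} {c_ij exp(-i*omega*tau_ij)}."""
    s = 1j * omega
    return np.diag(1.0 / (s + a)) @ (C * np.exp(-s * T))

def lam_min(omega):
    """Smallest eigenvalue of the Hermitian matrix in (44) with Lambda = 0."""
    Kw = K(omega)
    H = L_inv + 0.5 * (Kw + Kw.conj().T)
    return np.linalg.eigvalsh(H)[0]

# K(i*omega) vanishes as |omega| grows, so a finite grid is a
# reasonable spot-check (not a proof); omega >= 0 suffices by symmetry.
omegas = np.linspace(0.0, 50.0, 2001)
worst = min(lam_min(w) for w in omegas)
print(f"smallest eigenvalue over grid: {worst:.3f}")
```

A positive `worst` over the grid is consistent with assumption iii) of Theorem 6 holding for this network under the assumed L_i.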
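The boundedness claim of the simulation can also be illustrated by direct numerical integration of the delayed system with a simple fixed-step Euler scheme. In the sketch below the activation f_1 = f_2 = tanh, the zero initial history on t ≤ 0, the step size, and the time horizon are all assumptions made for illustration.

```python
# Fixed-step Euler simulation of the two-neuron delayed Hopfield system.
# Assumptions: f_1 = f_2 = tanh, zero history for t <= 0, h = 1 ms, 60 s horizon.
import math

h = 0.001
n_steps = 60000
tau = [[0.5, 0.4], [0.7, 0.6]]                      # delays tau_ij (sec)
lag = [[round(t / h) for t in row] for row in tau]  # delays in steps
f = math.tanh

x1 = [0.0] * (n_steps + 1)   # x_1 trajectory; starts from the zero history
x2 = [0.0] * (n_steps + 1)

def delayed(x, k, steps):
    """Sample x at k - steps; indices before t = 0 use the zero initial history."""
    return x[k - steps] if k >= steps else 0.0

for k in range(n_steps):
    t = k * h
    dx1 = (-2.0 * x1[k] + 2.0 * f(delayed(x1, k, lag[0][0]))
           + 3.0 * f(delayed(x2, k, lag[0][1]))
           + math.sin(t) + math.sin(math.sqrt(2.0) * t))
    dx2 = (-5.0 * x2[k] + 1.0 * f(delayed(x1, k, lag[1][0]))
           + 3.0 * f(delayed(x2, k, lag[1][1]))
           + 2.0 * math.cos(t))
    x1[k + 1] = x1[k] + h * dx1
    x2[k + 1] = x2[k] + h * dx2

print("max |x1|:", max(abs(v) for v in x1))
print("max |x2|:", max(abs(v) for v in x2))
```

Since |f| ≤ 1 and the stimuli are bounded, the trajectories stay within roughly |x_1| ≤ 3.5 and |x_2| ≤ 1.2 over the horizon, in line with the boundedness asserted by the theorems; the steady response follows the almost periodic stimulus on the first neuron and the periodic one on the second.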
 