In the following, we prove the convergence property of Algorithm 2.1. We first assume that $\{x_k\}$ is an infinite sequence generated by Algorithm 2.1. The following assumptions are satisfied.
(H5) The objective function $f(x)$ has a lower bound on $\mathbb{R}^n$.
(H6) The gradient $g(x)$ is uniformly continuous on an open convex set $D$ that contains $\{x_k\}$.
(H7) There exists $m_2 > 0$ such that, for any $k$, $P_k^{T} B_k P_k \ge m_2 \|P_k\|^{2}$.
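As an illustration (not taken from the original text), (H7) is the usual uniform positive-definiteness condition on the matrices $B_k$ along the directions $P_k$; it holds, for instance, whenever every $B_k$ is symmetric with smallest eigenvalue bounded below by $m_2$, by the Rayleigh-quotient bound
\[
  P_k^{T} B_k P_k \;\ge\; \lambda_{\min}(B_k)\,\|P_k\|^{2} \;\ge\; m_2\,\|P_k\|^{2}
  \qquad \text{whenever } \lambda_{\min}(B_k) \ge m_2 \text{ for all } k .
\]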
The assumption (H5) is very mild, because if it is not satisfied, the objective function $f(x)$ can be replaced by $e^{f(x)}$.
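To see why this substitution is harmless (a routine check, added here for clarity), note that the exponential is positive and strictly increasing, so $e^{f(x)}$ is bounded below by zero, has the same minimizers as $f$, and has the same stationary points:
\[
  e^{f(x)} > 0 \quad \text{for all } x \in \mathbb{R}^n ,
  \qquad
  \nabla e^{f(x)} = e^{f(x)}\,\nabla f(x) ,
\]
so $\nabla e^{f(x)} = 0$ if and only if $\nabla f(x) = 0$.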
According to the related reference [5], we have the following lemma.
Lemma 2.1 [5]. Suppose that (H2), (H3) and (H4) hold for $\omega_k$, $\gamma_k$ and $s_k$. Then when $k \in I$ is sufficiently large, we have
\[
  \|g_k\| \le c_3 \gamma_k ,
\]
where $c_3 > 0$ is a constant.
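Although (H2)-(H4), the quantities $\omega_k$, $\gamma_k$, $s_k$, and the index set $I$ are defined earlier in the paper and are not reproduced in this excerpt, the way Lemma 2.1 is invoked in Case 1 of the proof of Lemma 2.3 below suggests that $\gamma_k \to 0$ for $k \in I$; under that assumption the bound immediately forces the gradients to vanish along $I$:
\[
  0 \;\le\; \|g_k\| \;\le\; c_3\,\gamma_k \;\to\; 0
  \qquad (k \to \infty,\ k \in I) .
\]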
Analogous to the proofs of the corresponding lemmas in [5], we can obtain the following lemmas.
Lemma 2.2. Let $\{x_k\}$ be an infinite iteration sequence generated by Algorithm 2.1. If $g(x)$ is uniformly continuous on an open convex set $D$ containing $\{x_k\}$, then there exists a constant $\sigma_1 > 0$ such that the sequence $\{ f_{l(k)} + \sigma_1 T_k \}$ is monotone and non-increasing.
Further, the sequence $\{ f_{l(k)} \}$ is convergent (it converges either to a finite value or to $-\infty$). Without loss of generality, we assume that
\[
  \lim_{k \to \infty} f_{l(k)} = f^{*} .
\]
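The index $l(k)$ and the memory parameter $M$ (which appears in Lemma 2.3 below) are defined earlier in the paper; presumably $l(k)$ is the usual nonmonotone reference index, i.e., under this reading
\[
  f_{l(k)} \;=\; \max_{0 \le j \le m(k)} f_{k-j} ,
  \qquad m(k) = \min\{k, M\} ,
\]
so that $f_{l(k)}$ is the largest of the most recent $m(k)+1$ function values. A monotone non-increasing sequence converges to a finite value whenever it is bounded below, as (H5) guarantees here, which explains the dichotomy above.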
Lemma 2.3. Let $\{x_k\}$ be an infinite iteration sequence generated by Algorithm 2.1. If $g(x)$ is uniformly continuous on an open convex set $D$ containing $\{x_k\}$ and $\lim_{k \to \infty} f_{l(k)} = f^{*}$, then for any $i$ with $0 \le i \le M + 1$ we have
(1) $\lim_{k \to \infty,\, k \in K_i} \|g_k\| = 0$;
(2) $\lim_{k \to \infty,\, k \in K_i} f_k = f^{*}$.
Proof. Now we show that (1) holds in two cases.
Case 1: When $I \cap K_i$ is an infinite index set, it follows from Lemma 2.1 that
\[
  \lim_{k \to \infty,\, k \in I \cap K_i} \|g_k\| = 0 .
\]