21.3.2 Conditions for Convexity
21.3.2.1 First-Order Condition
Function $f$ is differentiable if $\operatorname{dom} f$ is open and the gradient
$$\nabla f(x) = \left( \frac{\partial f(x)}{\partial x_1}, \frac{\partial f(x)}{\partial x_2}, \ldots, \frac{\partial f(x)}{\partial x_n} \right)$$
exists $\forall x \in \operatorname{dom} f$. A differentiable $f$ is convex if and only if $\operatorname{dom} f$ is convex and there holds
$$f(y) \ge f(x) + \nabla f(x)^{T} (y - x), \quad \forall x, y \in \operatorname{dom} f.$$
This is the first-order condition for convexity.
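As an illustration (not part of the original text), the following Python sketch numerically checks the first-order condition for the convex function $f(x) = \|x\|^2$, whose gradient is $2x$; for this function the gap $f(y) - f(x) - \nabla f(x)^{T}(y - x)$ equals $\|y - x\|^2$, so the bound holds for every pair of points.

```python
# Illustrative sketch (not from the original text): verify the first-order
# condition f(y) >= f(x) + grad_f(x)^T (y - x) for the convex f(x) = ||x||^2.
import numpy as np

def f(x):
    return float(np.dot(x, x))      # f(x) = ||x||^2

def grad_f(x):
    return 2.0 * x                  # gradient of f

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=5)
    y = rng.normal(size=5)
    lower_bound = f(x) + grad_f(x) @ (y - x)   # first-order lower bound at x
    assert f(y) >= lower_bound - 1e-9          # holds for every sampled pair
print("first-order condition verified on random samples")
```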
21.3.2.2 Second-Order Condition
Function $f$ is twice differentiable if $\operatorname{dom} f$ is open and the Hessian $\nabla^2 f(x)$ (which is an $n$-by-$n$ symmetric matrix with $\nabla^2 f(x)_{ij} = \frac{\partial^2 f(x)}{\partial x_i \partial x_j}$, $i, j = 1, 2, \ldots, n$) exists $\forall x \in \operatorname{dom} f$. A twice differentiable $f$ is convex if and only if $\operatorname{dom} f$ is convex and the Hessian $\nabla^2 f(x)$ is positive semidefinite (i.e., $\nabla^2 f(x) \succeq 0$), $\forall x \in \operatorname{dom} f$. This is the second-order condition for convexity.
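Along the same lines, here is a minimal sketch (assuming NumPy is available; the example is not from the original text) that verifies the second-order condition for a quadratic $f(x) = \frac{1}{2}x^{T}Qx + b^{T}x$: its Hessian is the constant matrix $Q$, and choosing $Q = A^{T}A$ makes it positive semidefinite by construction.

```python
# Minimal sketch (illustrative): the Hessian of f(x) = 0.5 x^T Q x + b^T x is
# the constant matrix Q; Q = A^T A is symmetric positive semidefinite.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 5))
Q = A.T @ A                          # Hessian of f, symmetric PSD by construction
eigenvalues = np.linalg.eigvalsh(Q)  # eigenvalues of a symmetric matrix
print(eigenvalues)                   # all >= 0 (up to round-off)
assert np.all(eigenvalues >= -1e-9), "Hessian is not positive semidefinite"
```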
21.3.3 Convex Optimization Problem
An optimization problem is convex if the objective function is convex and the feasible set is also convex. That is, a convex optimization problem is written as
$$\min_{x \in \mathbb{R}^n} f(x),$$
$$\text{s.t.}\quad g_i(x) \le 0, \quad i = 1, \ldots, m,$$
$$\phantom{\text{s.t.}\quad} h_i(x) = 0, \quad i = 1, \ldots, l,$$
where $f$ and $g_i$, $i = 1, \ldots, m$ are convex, and $h_i$, $i = 1, \ldots, l$ are affine.
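For concreteness, the sketch below expresses one instance of this general form with the CVXPY modeling library (assumed to be installed; the problem data are made up for illustration): a convex quadratic objective, affine inequality constraints $g_i(x) = -x_i \le 0$, and an affine equality constraint $h(x) = \mathbf{1}^{T}x - 1 = 0$.

```python
# Hedged sketch of the general convex problem form (illustrative data only).
import cvxpy as cp
import numpy as np

n = 4
c = np.array([0.3, -0.2, 0.9, 0.1])
x = cp.Variable(n)

objective = cp.Minimize(cp.sum_squares(x - c))   # convex objective f(x)
constraints = [
    -x <= 0,            # g_i(x) = -x_i <= 0   (convex, here affine)
    cp.sum(x) == 1,     # h(x) = 1^T x - 1 = 0 (affine)
]
problem = cp.Problem(objective, constraints)
problem.solve()
print(problem.status, x.value)
```

Because the problem is convex, the solver's local optimum is also the global optimum, which is exactly the property noted next.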
Note that any local optimum of a convex optimization problem is globally optimal. Here are some examples of convex optimization problems.
Linear Programming (LP) A linear program is a convex optimization problem with affine objective and constraint functions. Its feasible set is a polyhedron.
$$\min_{x \in \mathbb{R}^n} \alpha^{T} x + \beta,$$
$$\text{s.t.}\quad g_i^{T} x \le h_i, \quad i = 1, \ldots, m,$$
$$\phantom{\text{s.t.}\quad} a_i^{T} x = b_i, \quad i = 1, \ldots, l.$$
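The following sketch solves a small instance of this LP with scipy.optimize.linprog (the coefficients $\alpha$, $g_i$, $h_i$, $a_i$, $b_i$ below are illustrative, not from the original text); the constant $\beta$ does not affect the minimizer and is omitted.

```python
# Small illustrative LP: minimize alpha^T x subject to g_i^T x <= h_i and
# a_i^T x = b_i, solved with scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

alpha = np.array([-1.0, -2.0])        # objective coefficients alpha
G = np.array([[1.0, 1.0],             # rows g_i^T of the inequality constraints
              [-1.0, 2.0]])
h = np.array([4.0, 2.0])              # right-hand sides h_i
A = np.array([[1.0, -1.0]])           # row a_1^T of the equality constraint
b = np.array([0.0])                   # right-hand side b_1

res = linprog(alpha, A_ub=G, b_ub=h, A_eq=A, b_eq=b, bounds=(0, None))
print(res.x, res.fun)                 # optimum attained at a vertex of the polyhedron
```

With these (made-up) numbers the optimum is $x = (2, 2)$ with objective value $-6$, attained at a vertex of the polyhedral feasible set, as expected for an LP.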