matrix $H$ of $J(W)$ with respect to the $nr$-dimensional vector $[w_1^T, \ldots, w_r^T]^T$ is given by
$$
\frac{1}{2}\nabla_i\nabla_j^T J = \delta_{ij}\bigl(-2C + CWW^T + WW^TC\bigr) + (w_j^T C w_i)\,I_n + (w_j^T w_i)\,C + C w_j w_i^T + w_j w_i^T C.
$$
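A quick way to validate such a block formula is a finite-difference check. The sketch below assumes the cost (4.14) is the mean-square reconstruction error $J(W) = \mathrm{tr}\bigl[(I_n - WW^T)\,C\,(I_n - WW^T)\bigr]$ written in terms of $C = E[x x^T]$; the matrix $C$, the dimensions, and the test point $W$ are arbitrary test data, not quantities from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, h = 4, 2, 1e-4

A = rng.standard_normal((n, n))
C = A @ A.T                      # arbitrary symmetric test "covariance"
W = rng.standard_normal((n, r))  # arbitrary test point (need not be orthonormal)

def J(Wm):
    # Mean-square reconstruction error written in terms of C
    P = np.eye(n) - Wm @ Wm.T
    return np.trace(P @ C @ P)

def H_formula(Wm):
    # Blocks (1/2) d/dw_i d/dw_j^T J from the expression above
    H = np.zeros((n * r, n * r))
    D = -2 * C + C @ Wm @ Wm.T + Wm @ Wm.T @ C
    for i in range(r):
        for j in range(r):
            wi, wj = Wm[:, i], Wm[:, j]
            B = ((wj @ C @ wi) * np.eye(n) + (wj @ wi) * C
                 + np.outer(C @ wj, wi) + np.outer(wj, wi) @ C)
            H[i*n:(i+1)*n, j*n:(j+1)*n] = B + (D if i == j else 0)
    return H

def H_numeric(Wm):
    # Central finite differences of J over vec([w_1; ...; w_r])
    m = n * r
    H = np.zeros((m, m))
    def pert(a, s):
        V = Wm.copy()
        V[a % n, a // n] += s * h
        return V
    for a in range(m):
        for b in range(m):
            Vpp = pert(a, 1);  Vpp[b % n, b // n] += h
            Vpm = pert(a, 1);  Vpm[b % n, b // n] -= h
            Vmp = pert(a, -1); Vmp[b % n, b // n] += h
            Vmm = pert(a, -1); Vmm[b % n, b // n] -= h
            H[a, b] = (J(Vpp) - J(Vpm) - J(Vmp) + J(Vmm)) / (4 * h**2)
    return H

print(np.allclose(H_formula(W), 0.5 * H_numeric(W), atol=1e-5))  # True
```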
After evaluating the EVD of the block Hessian matrix $H$ at the stationary points $W = U_r Q$, prove that $H$ is nonnegative if $U_r = [u_1, \ldots, u_r]$. Interpret in this case the zero eigenvalues of $H$. Prove that when $U_r$ contains an eigenvector different from $u_1, \ldots, u_r$, some eigenvalues of $H$ are strictly negative. Deduce that all stationary points of $J(W)$ are saddle points, except the points $W$ whose associated matrix $U_r$ contains the $r$ dominant eigenvectors $u_1, \ldots, u_r$ of $C$; these points are global minima of the cost function (4.14).
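The minimum/saddle dichotomy can also be observed numerically. The sketch below assembles the block Hessian from the formula above at $W = [u_1, u_2]$ and at $W = [u_1, u_3]$; the matrix $C$ and the dimensions are arbitrary test choices, not quantities from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 2

A = rng.standard_normal((n, n))
C = A @ A.T                       # arbitrary symmetric PSD test matrix
lam, U = np.linalg.eigh(C)
lam, U = lam[::-1], U[:, ::-1]    # eigenvalues/eigenvectors sorted descending

def block_hessian(W):
    # Blocks (1/2) d/dw_i d/dw_j^T J from the formula above
    H = np.zeros((n * r, n * r))
    D = -2 * C + C @ W @ W.T + W @ W.T @ C
    for i in range(r):
        for j in range(r):
            wi, wj = W[:, i], W[:, j]
            B = ((wj @ C @ wi) * np.eye(n) + (wj @ wi) * C
                 + np.outer(C @ wj, wi) + np.outer(wj, wi) @ C)
            H[i*n:(i+1)*n, j*n:(j+1)*n] = B + (D if i == j else 0)
    return H

# At W = [u_1, u_2] (dominant pair): H is nonnegative, with zero
# eigenvalues reflecting the rotational invariance W -> WQ.
print(np.linalg.eigvalsh(block_hessian(U[:, :2])).min())     # ~0

# At W = [u_1, u_3] (u_2 skipped): H has strictly negative eigenvalues,
# so this stationary point is a saddle.
print(np.linalg.eigvalsh(block_hessian(U[:, [0, 2]])).min())  # < 0
```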
Extend the previous results by considering the $2nr \times 2nr$ real Hessian matrix $H = \nabla\nabla^T J$ with $\nabla \stackrel{\mathrm{def}}{=} [\nabla_{R,1}^T, \ldots, \nabla_{R,r}^T, \nabla_{I,1}^T, \ldots, \nabla_{I,r}^T]^T$.

4.9 With the notations of the NP3 algorithm described in Subsection 4.5.1, write (4.40) in the form
$$
G(k+1) = G^{1/2}(k)\bigl(I_n + ab^T + ba^T + \alpha aa^T\bigr)G^{T/2}(k).
$$
Then, using the EVD $\nu_1 e_1 e_1^T + \nu_2 e_2 e_2^T$ of the symmetric rank-two matrix $ab^T + ba^T + \alpha aa^T$, with $a \stackrel{\mathrm{def}}{=} \beta G(k)y(k)$, $b \stackrel{\mathrm{def}}{=} \beta G(k)z(k)$ and $\alpha \stackrel{\mathrm{def}}{=} \|x(k)\|^2$, prove equalities (4.41) and (4.42), where $\tau_i \stackrel{\mathrm{def}}{=} \sqrt{1 + \nu_i} - 1$, $i = 1, 2$.
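The algebra behind the square-root propagation in 4.9 rests on the identity $(I_n + M)^{1/2} = I_n + \tau_1 e_1 e_1^T + \tau_2 e_2 e_2^T$ for a symmetric rank-two $M$ with EVD $\nu_1 e_1 e_1^T + \nu_2 e_2 e_2^T$ and $\tau_i = \sqrt{1+\nu_i} - 1$. A minimal sketch with arbitrary test data ($a$, $b$, $\alpha$ below are stand-ins, not the NP3 quantities):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
# Small arbitrary test vectors (kept small so that I_n + M stays
# positive definite and the real square root exists)
a = rng.standard_normal(n) / n
b = rng.standard_normal(n) / n
alpha = 0.3                        # arbitrary stand-in scalar

M = np.outer(a, b) + np.outer(b, a) + alpha * np.outer(a, a)

# M is symmetric of rank <= 2: keep its two eigenpairs of largest modulus
nu, E = np.linalg.eigh(M)
idx = np.argsort(-np.abs(nu))[:2]
tau = np.sqrt(1 + nu[idx]) - 1

# Rank-two update of the square root: (I + M)^{1/2}
S = np.eye(n) + sum(t * np.outer(E[:, k], E[:, k]) for t, k in zip(tau, idx))
print(np.allclose(S @ S, np.eye(n) + M))  # True
```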
4.10 Consider the following stochastic approximation algorithm derived from Oja's algorithm (4.30), where the sign of the step size can be reversed and where the estimate $W(k)$ is forced to be orthonormal at each time step:
$$
W'(k+1) = W(k) \pm \mu_k\bigl[I_n - W(k)W^T(k)\bigr]x(k)x^T(k)W(k),
$$
$$
W(k+1) = W'(k+1)\bigl[W'^T(k+1)W'(k+1)\bigr]^{-1/2}, \qquad (4.91)
$$
where $[W'^T(k+1)W'(k+1)]^{-1/2}$ denotes the symmetric inverse square root of $W'^T(k+1)W'(k+1)$. To compute the latter, use the updating equation of $W'(k+1)$ and, keeping in mind that $W(k)$ is orthonormal, prove that $W'^T(k+1)W'(k+1) = I_r + zz^T$ with $z \stackrel{\mathrm{def}}{=} \mu_k \|x(k) - W(k)y(k)\|\, y(k)$, where $y(k) \stackrel{\mathrm{def}}{=} W^T(k)x(k)$. Using identity (4.59), prove that $[W'^T(k+1)W'(k+1)]^{-1/2} = I_r + \tau_k y(k)y^T(k)$ with
$$
\tau_k \stackrel{\mathrm{def}}{=} \frac{1}{\|y(k)\|^2}\Bigl[\bigl(1 + \mu_k^2\,\|y(k)\|^2\,\|x(k) - W(k)y(k)\|^2\bigr)^{-1/2} - 1\Bigr].
$$
Finally, using the update equation of $W(k+1)$, prove that algorithm (4.91) leads to $W(k+1) = W(k) \pm \mu_k\, p(k)\, y^T(k)$ with
$$
p(k) \stackrel{\mathrm{def}}{=} \bigl[1 + \tau_k\|y(k)\|^2\bigr]\bigl[x(k) - W(k)y(k)\bigr] \pm \frac{\tau_k}{\mu_k}\, W(k)y(k).
$$
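The chain of results in 4.10 can be checked numerically: one explicit orthonormalized Oja step (4.91) should coincide with the closed form $W(k+1) = W(k) \pm \mu_k p(k) y^T(k)$. A minimal sketch with arbitrary test data (dimensions, step size, and inputs below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, mu, s = 6, 2, 0.1, +1       # s = -1 reverses the step-size sign

W, _ = np.linalg.qr(rng.standard_normal((n, r)))   # orthonormal W(k)
x = rng.standard_normal(n)                         # data vector x(k)

# Explicit step of (4.91): Oja update, then symmetric inverse square root
Wp = W + s * mu * (np.eye(n) - W @ W.T) @ np.outer(x, x) @ W
d, Q = np.linalg.eigh(Wp.T @ Wp)
W_next = Wp @ Q @ np.diag(d ** -0.5) @ Q.T

# Closed form: W(k+1) = W(k) + s*mu*p(k)*y(k)^T
y = W.T @ x                       # y(k) = W^T(k) x(k)
e = x - W @ y                     # residual [I - W W^T] x(k)
ny2, ne2 = y @ y, e @ e
tau = ((1 + mu**2 * ny2 * ne2) ** -0.5 - 1) / ny2
p = (1 + tau * ny2) * e + s * (tau / mu) * (W @ y)
W_closed = W + s * mu * np.outer(p, y)

print(np.allclose(W_next, W_closed))  # True
```

Note that the cross terms in $W'^T(k+1)W'(k+1)$ vanish because $W^T(k)\,[I_n - W(k)W^T(k)] = 0$, which is why the same closed form holds for either sign of the step size.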