Digital Signal Processing Reference
Substituting this into Eq. (20.15), $\partial\,\mathrm{Tr}(XAX^T)^{-1}/\partial x_{rs}$ simplifies to

$$
\begin{aligned}
\frac{\partial\,\mathrm{Tr}(XAX^T)^{-1}}{\partial x_{rs}}
&= -\,\mathrm{Tr}\Big[(XAX^T)^{-1}\big(I_{rs}AX^T + XA\,I_{rs}^T\big)(XAX^T)^{-1}\Big] \\
&= -\,\mathrm{Tr}\Big[\big(I_{rs}AX^T + XA\,I_{rs}^T\big)(XAX^T)^{-2}\Big] \\
&= -\,\mathrm{Tr}\big[I_{rs}\,AX^T(XAX^T)^{-2}\big] - \mathrm{Tr}\big[I_{rs}^T\,(XAX^T)^{-2}XA\big] \\
&= -\big[AX^T(XAX^T)^{-2}\big]_{sr} - \big[(XAX^T)^{-2}XA\big]_{rs},
\end{aligned}
$$
where we have used Eqs. (20.13) and (20.11). This proves that
$$
\frac{\partial\,\mathrm{Tr}(XAX^T)^{-1}}{\partial X}
= -(XA^TX^T)^{-2}XA^T \;-\; (XAX^T)^{-2}XA.
\qquad (20.16)
$$
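As a quick sanity check, Eq. (20.16) can be verified numerically by comparing the closed-form gradient with central finite differences of $f(X)=\mathrm{Tr}[(XAX^T)^{-1}]$. The particular matrices $X$ and $A$, the step size $h$, and the tolerances below are illustrative choices, not taken from the text:

```python
import numpy as np

# Numerical sanity check of Eq. (20.16):
#   d Tr[(X A X^T)^{-1}] / dX = -(X A^T X^T)^{-2} X A^T - (X A X^T)^{-2} X A.
# X and A are arbitrary illustrative choices (A is not symmetric);
# all that matters is that X A X^T is invertible.
X = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.],
              [0., 0., 1., 1.]])
A = np.array([[1.0, 0.2, 0.0, 0.0],
              [0.0, 2.0, 0.2, 0.0],
              [0.0, 0.0, 3.0, 0.2],
              [0.2, 0.0, 0.0, 4.0]])

f = lambda X: np.trace(np.linalg.inv(X @ A @ X.T))

# Closed-form gradient from Eq. (20.16).
M1 = np.linalg.inv(X @ A.T @ X.T)   # (X A^T X^T)^{-1}
M2 = np.linalg.inv(X @ A @ X.T)     # (X A X^T)^{-1}
G = -M1 @ M1 @ X @ A.T - M2 @ M2 @ X @ A

# Central finite differences, one entry x_rs at a time.
h = 1e-6
G_fd = np.zeros_like(X)
for r in range(X.shape[0]):
    for s in range(X.shape[1]):
        E = np.zeros_like(X); E[r, s] = h
        G_fd[r, s] = (f(X + E) - f(X - E)) / (2 * h)

print(np.allclose(G, G_fd, rtol=1e-4, atol=1e-5))  # True
```

Note that the two terms in (20.16) coincide when $A$ is symmetric, since then $XA^TX^T = XAX^T$; the check above deliberately uses a nonsymmetric $A$ so that both terms are exercised.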
A slight variation of this derivation can be used to show that
$$
\frac{\partial\,\mathrm{Tr}\big(B(XAX^T)^{-1}\big)}{\partial X}
= -(XA^TX^T)^{-1}B^T(XA^TX^T)^{-1}XA^T \;-\; (XAX^T)^{-1}B(XAX^T)^{-1}XA.
\qquad (20.17)
$$
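Eq. (20.17) admits the same kind of finite-difference check; again the matrices $X$, $A$, $B$ and the tolerances are illustrative choices:

```python
import numpy as np

# Numerical sanity check of Eq. (20.17) for f(X) = Tr[B (X A X^T)^{-1}].
# X, A, B are arbitrary illustrative choices (none symmetric).
X = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.],
              [0., 0., 1., 1.]])
A = np.array([[1.0, 0.2, 0.0, 0.0],
              [0.0, 2.0, 0.2, 0.0],
              [0.0, 0.0, 3.0, 0.2],
              [0.2, 0.0, 0.0, 4.0]])
B = np.array([[1., 2., 0.],
              [0., 1., 2.],
              [2., 0., 1.]])

f = lambda X: np.trace(B @ np.linalg.inv(X @ A @ X.T))

# Closed-form gradient from Eq. (20.17).
P = np.linalg.inv(X @ A.T @ X.T)   # (X A^T X^T)^{-1}
Q = np.linalg.inv(X @ A @ X.T)     # (X A X^T)^{-1}
G = -P @ B.T @ P @ X @ A.T - Q @ B @ Q @ X @ A

# Central finite differences over each entry x_rs.
h = 1e-6
G_fd = np.zeros_like(X)
for r in range(X.shape[0]):
    for s in range(X.shape[1]):
        E = np.zeros_like(X); E[r, s] = h
        G_fd[r, s] = (f(X + E) - f(X - E)) / (2 * h)

print(np.allclose(G, G_fd, rtol=1e-4, atol=1e-5))  # True
```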
Tables 20.1 and 20.2 at the end of this chapter summarize the important defini-
tions and formulas discussed in this section.
20.3 Complex gradient operators
Derivatives and gradients have been used in optimization theory for many years.
Most of the theory was initially developed for the case of functions of real vari-
ables. But in digital communications, array processing, and adaptive signal
processing, real functions of complex variables occur frequently. Even though
the real and imaginary parts can be separated out and regarded as independent
variables, it is more elegant and economical to introduce complex differentiation
operators. This is complicated by the fact that the objective functions are not
usually analytic functions of the complex variables involved. The conventional
definition of derivatives from analytic function theory is therefore not useful.
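A concrete illustration of this failure: $f(z)=|z|^2$ is real-valued and nowhere analytic, so the ordinary complex derivative $f'(z)$ does not exist; yet the Wirtinger-style derivative $\partial f/\partial z^{*}=z$ does exist and, scaled by 2, packs the real gradient $[\partial f/\partial x,\ \partial f/\partial y]$ into a single complex number. A minimal sketch (the point $z_0$ and step $h$ are arbitrary illustrative choices):

```python
# f(z) = |z|^2 is real-valued and nowhere analytic: the Cauchy-Riemann
# equations fail, so the ordinary derivative f'(z) does not exist.
# The derivative with respect to conj(z) is df/d(conj z) = z, and
# 2 * df/d(conj z) reproduces the real gradient [df/dx, df/dy].
f = lambda z: abs(z) ** 2
z0 = 1.0 + 2.0j   # arbitrary test point
h = 1e-6          # finite-difference step

# Finite-difference partials with respect to x = Re(z) and y = Im(z).
dfdx = (f(z0 + h) - f(z0 - h)) / (2 * h)            # = 2x
dfdy = (f(z0 + 1j * h) - f(z0 - 1j * h)) / (2 * h)  # = 2y

# df/dx + j*df/dy equals 2 * df/d(conj z) = 2*z0 for this f.
grad = dfdx + 1j * dfdy
print(abs(grad - 2 * z0) < 1e-6)  # True
```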
In 1983, D. H. Brandwood introduced the idea of complex gradient operators
in the array processing literature. Since its introduction, this has found wide
application in all areas of signal processing where optimization of complex pa-
rameters is involved. This topic has been briefly covered in the appendices of
some signal processing texts [Moon and Stirling, 2000], [Haykin, 2002], [Sayed,
2003]. Readers interested in a detailed and clear exposition should also read the
original article [Brandwood, 1983].
 