Appendices to Chapter 12
12.A Generality of minimum-norm inverse
The zero-forcing constraint is equivalent to GHF = I. That is, G is a left inverse of HF. Since HF is in general rectangular, this left inverse is not unique. We now show that there is no loss of generality in restricting the left inverse G to be the minimum-norm left inverse (see Sec. C.4 in Appendix C).^5 The proof works even when the channel H is rectangular, so we assume H has size K × P for generality. The matrix HF has size K × M. We need K ≥ M so that the rank of H is at least as large as M (as required by the zero-forcing condition). Now, HF has a singular value decomposition (SVD) of the form
HF = \underbrace{U}_{K \times K} \begin{bmatrix} \Sigma \\ 0 \end{bmatrix} \underbrace{V^\dagger}_{M \times M},    (12.73)
where U and V are square unitary matrices. Assuming that HF has full rank M, the M × M diagonal matrix Σ is invertible. From Appendix C we know that the unique minimum-norm left inverse of HF has the form
G_{\rm MNLI} = V \begin{bmatrix} \Sigma^{-1} & 0 \end{bmatrix} U^\dagger.    (12.74)
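The formulas (12.73) and (12.74) are easy to verify numerically. The following NumPy sketch uses hypothetical sizes K, P, M and a randomly drawn channel H and precoder F (none of these values come from the text); it builds the full SVD of HF, forms the minimum-norm left inverse as in (12.74), and checks that it is a left inverse of HF and agrees with the Moore-Penrose pseudoinverse returned by np.linalg.pinv.

```python
import numpy as np

# Hypothetical sizes (illustration only): K x P channel H, P x M precoder F, with K >= M.
K, P, M = 6, 4, 3
rng = np.random.default_rng(0)
H = rng.standard_normal((K, P)) + 1j * rng.standard_normal((K, P))
F = rng.standard_normal((P, M)) + 1j * rng.standard_normal((P, M))
HF = H @ F                                            # K x M, assumed to have full rank M

# Full SVD as in Eq. (12.73): HF = U [Sigma; 0] V^dagger, U (K x K) and V (M x M) unitary.
U, s, Vh = np.linalg.svd(HF, full_matrices=True)
Sigma = np.diag(s)                                    # M x M, invertible when rank(HF) = M
middle = np.vstack([Sigma, np.zeros((K - M, M))])     # the K x M block [Sigma; 0]
assert np.allclose(U @ middle @ Vh, HF)

# Minimum-norm left inverse of Eq. (12.74): G_MNLI = V [Sigma^{-1}  0] U^dagger.
G_mnli = Vh.conj().T @ np.hstack([np.linalg.inv(Sigma), np.zeros((M, K - M))]) @ U.conj().T
assert np.allclose(G_mnli @ HF, np.eye(M))            # zero-forcing: G is a left inverse
assert np.allclose(G_mnli, np.linalg.pinv(HF))        # agrees with the pseudoinverse
```

The agreement with np.linalg.pinv is just the standard fact that, for a matrix with full column rank, the minimum-norm left inverse is the Moore-Penrose pseudoinverse.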
On the other hand, the most general left inverse (Appendix C) has the form
G = \underbrace{V}_{M \times M} \begin{bmatrix} \Sigma^{-1} & A \end{bmatrix} \underbrace{U^\dagger}_{K \times K},    (12.75)
where A is an arbitrary matrix of size M × ( K − M ) . Figure 12.9(a) shows the
transceiver with the above receiver matrix indicated in detail, and Fig. 12.9(b)
shows the receiver with noise q(n) only as its input. Let the output of U† in Fig. 12.9(b) be denoted as w(n), that is,
w(n) = \begin{bmatrix} w_1(n) \\ w_2(n) \end{bmatrix}.    (12.76)
Since q(n) has correlation matrix σ_q^2 I, the correlation matrix of w(n) is

R_{ww} = \sigma_q^2 U^\dagger U = \sigma_q^2 I = \begin{bmatrix} R_{11} & R_{12} \\ R_{21} & R_{22} \end{bmatrix},    (12.77)

where R_{km} = E[w_k(n) w_m^\dagger(n)]. Thus

R_{11} = \sigma_q^2 I_M, \quad R_{22} = \sigma_q^2 I_{K-M}, \quad \text{and} \quad R_{12} = 0.    (12.78)
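A small simulation can confirm this correlation structure, along with the left-inverse property of (12.75) for an arbitrary A. The sketch below is a minimal illustration under assumed values (the sizes K, P, M, noise level sigma_q, sample count N, and the randomly drawn H, F, and A are all hypothetical, not from the text): it checks that G(HF) = I regardless of A, estimates the correlation matrix of w(n) = U†q(n) from white-noise samples to see (12.77)-(12.78), and compares the output noise power for A = 0 against a nonzero A, consistent with the claim in the footnote below.

```python
import numpy as np

# Hypothetical setup (illustration only): sizes, channel, precoder, and noise level sigma_q.
K, P, M = 6, 4, 3
sigma_q = 0.5
rng = np.random.default_rng(1)
H = rng.standard_normal((K, P)) + 1j * rng.standard_normal((K, P))
F = rng.standard_normal((P, M)) + 1j * rng.standard_normal((P, M))
HF = H @ F

U, s, Vh = np.linalg.svd(HF, full_matrices=True)      # SVD as in Eq. (12.73)
V = Vh.conj().T
A = rng.standard_normal((M, K - M)) + 1j * rng.standard_normal((M, K - M))

# General left inverse of Eq. (12.75); any choice of A gives G(HF) = I.
G = V @ np.hstack([np.diag(1.0 / s), A]) @ U.conj().T
assert np.allclose(G @ HF, np.eye(M))

# White noise q(n) with correlation sigma_q^2 I, and w(n) = U^dagger q(n) as in Eq. (12.76).
N = 200_000
q = (sigma_q / np.sqrt(2)) * (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N)))
w = U.conj().T @ q
R_ww = (w @ w.conj().T) / N                           # estimate of E[w(n) w^dagger(n)], Eq. (12.77)
print(np.allclose(R_ww, sigma_q**2 * np.eye(K), atol=1e-2))   # R11, R22 = sigma_q^2 I; R12 ~ 0

# Output-noise power of G q(n) is sigma_q^2 (trace(Sigma^{-2}) + ||A||_F^2),
# which is smallest at A = 0, i.e. for the minimum-norm left inverse (cf. the footnote).
noise_power = lambda Amat: sigma_q**2 * (np.sum(1.0 / s**2) + np.linalg.norm(Amat)**2)
print(noise_power(np.zeros((M, K - M))) <= noise_power(A))
```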
^5 That is, the MMSE with the minimum-norm left inverse is no greater than the MMSE with the left inverse allowed to be arbitrary.