$$\mathbf{w}(n) = \frac{1}{L}\,\mathbf{s}(\theta_0) + \mathbf{S}_{\mathrm{orth}}\,\mathbf{w}_a(n). \qquad (5.81)$$
It is clear that in this case the value of $\mathbf{w}_q$ is given by $\frac{1}{L}\,\mathbf{s}(\theta_0)$. The value $\mathbf{w}_a(n)$ should be obtained in an adaptive fashion using an RLS algorithm.
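As a rough numerical sketch of this decomposition (not the chapter's code), one can form $\mathbf{w}_q = \frac{1}{L}\,\mathbf{s}(\theta_0)$ and a matrix $\mathbf{S}_{\mathrm{orth}}$ whose columns span the orthogonal complement of $\mathbf{s}(\theta_0)$. The steering-vector convention below (a uniform linear array with half-wavelength spacing) and the helper names are assumptions of this sketch:

```python
import numpy as np

def steering_vector(theta, L):
    """Steering vector of a uniform linear array with half-wavelength spacing
    (an assumed convention; the chapter may define s(theta) differently)."""
    k = np.arange(L)
    return np.exp(-1j * np.pi * k * np.sin(theta))

def quiescent_and_blocking(theta0, L):
    """Return w_q = (1/L) s(theta0), which satisfies w_q^H s(theta0) = 1, and
    S_orth, an L x (L-1) matrix whose columns form an orthonormal basis of the
    subspace orthogonal to s(theta0), as used in the decomposition (5.81)."""
    s0 = steering_vector(theta0, L)
    w_q = s0 / L
    # Full QR of s0: the first column of Q spans s0, the remaining ones are orthogonal to it.
    Q, _ = np.linalg.qr(s0.reshape(-1, 1), mode="complete")
    return w_q, Q[:, 1:]
```

With this construction, any choice of $\mathbf{w}_a(n)$ leaves the gain towards $\theta_0$ untouched, which is what allows $\mathbf{w}_a(n)$ to be adapted without constraints.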
We define the beam pattern characteristic of the beamformer as
$$B(\theta) = 20\log_{10}\left|\mathbf{w}^H \mathbf{s}(\theta)\right|^2, \qquad \theta \in \left[-\pi/2,\, \pi/2\right].$$
The beam pattern reflects the spatial filtering capabilities of the beamformer, and it
is the usual measure of performance in beamforming problems.
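As a small illustration (again a sketch under the same assumed array convention, reusing the hypothetical steering_vector and quiescent_and_blocking helpers above), $B(\theta)$ can be evaluated on a grid of angles as follows:

```python
def beam_pattern(w, thetas, L):
    """Evaluate B(theta) = 20*log10(|w^H s(theta)|^2) over a grid of angles."""
    B = np.empty(len(thetas))
    for i, th in enumerate(thetas):
        g = np.vdot(w, steering_vector(th, L))          # w^H s(theta)
        B[i] = 20.0 * np.log10(np.abs(g) ** 2 + 1e-12)  # small floor avoids log(0)
    return B

# Example: the quiescent beamformer alone already has 0 dB gain towards theta0 = 0.
L = 15
w_q, _ = quiescent_and_blocking(0.0, L)
grid = np.linspace(-np.pi / 2, np.pi / 2, 721)
B_q = beam_pattern(w_q, grid, L)
```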
In Fig. 5.4 we show the beam pattern characteristic of the beamformer with $L = 15$, obtained from (5.81) using an RLS algorithm with forgetting factor $\lambda = 0.999$ and $\delta = 0.01$. The angle values are $\theta_0 = 0^\circ$, $\theta_1 = -45^\circ$ and $\theta_2 = 60^\circ$. The components of the noise vector $\mathbf{v}(n)$ have a variance of $\sigma_v^2 = 1$, and $\sigma_{\theta_0}^2 = 1$, $\sigma_{\theta_1}^2 = \sigma_{\theta_2}^2 = 10$. We see two beam pattern characteristics: one after 100 iterations and another after 1000 iterations of the RLS. We also see the characteristic of the optimal beamformer, which is obtained by solving (5.80). As more input-output pairs are used, the RLS should better approximate the optimal solution in (5.80). We see that the gain at $\theta = \theta_0$ is 0 dB, which corresponds to the restriction in (5.80). We also see deep fades at $\theta = -45^\circ$ and $\theta = 60^\circ$. That is, although those directions are not known, both the optimal solution and the RLS are able to obtain a solution which attenuates those unwanted interferences. This is a consequence of the minimum output power solution in (5.80), which is well approximated by the RLS beamformer $\mathbf{w}(n)$.
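A simulation along the lines of Fig. 5.4 could be sketched as follows. The parameters are the ones quoted above; the data model (independent zero-mean circular complex Gaussian sources and noise), the initialization $\mathbf{P}(0) = \delta^{-1}\mathbf{I}$, and the reuse of the helpers from the previous sketches are assumptions of this illustration, not the book's code. The key observation is that minimizing the exponentially weighted output power of $\mathbf{w} = \mathbf{w}_q + \mathbf{S}_{\mathrm{orth}}\mathbf{w}_a$ is a standard RLS problem with regressor $\mathbf{u}(n) = \mathbf{S}_{\mathrm{orth}}^H\mathbf{x}(n)$ and desired signal $d(n) = -\mathbf{w}_q^H\mathbf{x}(n)$:

```python
rng = np.random.default_rng(0)

# Parameters quoted in the text; sources and noise are modeled here as
# independent zero-mean circular complex Gaussian processes (an assumption).
L, lam, delta = 15, 0.999, 0.01
theta0, theta1, theta2 = 0.0, np.deg2rad(-45.0), np.deg2rad(60.0)
var0, var1, var2, var_v = 1.0, 10.0, 10.0, 1.0

w_q, S_orth = quiescent_and_blocking(theta0, L)
s0, s1, s2 = (steering_vector(t, L) for t in (theta0, theta1, theta2))

def cgn(var, size):
    """Zero-mean circular complex Gaussian samples with the given variance."""
    return np.sqrt(var / 2.0) * (rng.standard_normal(size) + 1j * rng.standard_normal(size))

w_a = np.zeros(L - 1, dtype=complex)
P = np.eye(L - 1, dtype=complex) / delta      # assumed initialization P(0) = I / delta

for n in range(1000):
    # Snapshot x(n): desired source at theta0, interferers at theta1 and theta2, plus noise.
    x = s0 * cgn(var0, 1) + s1 * cgn(var1, 1) + s2 * cgn(var2, 1) + cgn(var_v, L)
    u = S_orth.conj().T @ x                   # regressor u(n) = S_orth^H x(n)
    d = -np.vdot(w_q, x)                      # desired signal d(n) = -w_q^H x(n)
    e = d - np.vdot(w_a, u)                   # a priori error
    k = P @ u / (lam + np.vdot(u, P @ u).real)
    w_a = w_a + k * np.conj(e)
    P = (P - np.outer(k, u.conj()) @ P) / lam

w = w_q + S_orth @ w_a                        # overall beamformer, as in (5.81)
B = beam_pattern(w, np.linspace(-np.pi / 2, np.pi / 2, 721), L)
```

Plotting the resulting beam pattern after 100 and 1000 iterations should reproduce the qualitative behavior of Fig. 5.4: unit gain towards $\theta_0$ and deep fades towards $\theta_1$ and $\theta_2$.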
5.8 Further Comments
Least squares problems are fundamental in science and have a long and rich history.
The method of LS was discovered by Carl Friedrich Gauss (1777-1855) in 1795,
when he was trying to predict the location of the then newly discovered Ceres asteroid.
Curiously, he did not publish the method.
In any discipline where some quantitative analysis on the generation of real mea-
sured data must be done, LS problems appear naturally. They are a fundamental part
of the area known as regression analysis [26], which has multiple applications in
areas like engineering, experimental physics, biology, economics, etc. Although
our main interest was a linear regression model like the one in (5.2), the method of LS
can be extended to nonlinear models, and it is an important tool in the solution of inverse
problems in infinite-dimensional spaces [27].
There is a close relationship between the RLS algorithm and the Kalman filter [28].
Sayed and Kailath [12] proved that any problem solved using the RLS algorithm can
be transformed into an equivalent problem which can be attacked using the Kalman
filter formalism. This is indeed a very important result, because it permits one to use
 