$$x(k) = \big(x(k),\ x(k+1),\ \ldots,\ x(k+M-1)\big)^T = A\,s(k) + b(k) \qquad [8.7]$$
with:
$$A = \big(a(f_1),\ \ldots,\ a(f_P)\big), \qquad s(k) = \big(\alpha_1\, e^{j2\pi f_1 kT},\ \ldots,\ \alpha_P\, e^{j2\pi f_P kT}\big)^T \qquad [8.8]$$
$$b(k) = \big(b(k),\ \ldots,\ b(k+M-1)\big)^T$$
where:
$$a(f_i) = \big(1,\ e^{j2\pi f_i T},\ e^{j2\pi f_i\,2T},\ \ldots,\ e^{j2\pi f_i (M-1)T}\big)^T \qquad [8.9]$$
We call a(f_i) the complex sinusoid vector at frequency f_i, A the matrix of dimension (M, P) whose columns are the complex sinusoid vectors, and s(k) the vector containing the amplitudes of the sine waves (the superscripts T and H denote the transpose and the conjugate transpose, respectively, of a vector or matrix).
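As an illustration, the following is a minimal numerical sketch (not taken from the book) of the model [8.7]-[8.9] in Python/NumPy; the values of M, P, the frequencies f_i, the amplitudes α_i and the noise level are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

M, P = 8, 2                      # observation length and number of sine waves (example values)
T = 1.0                          # sampling period (assumed)
freqs = np.array([0.10, 0.23])   # f_1, ..., f_P (illustrative values)
alphas = np.array([1.0, 0.5])    # amplitudes alpha_1, ..., alpha_P (illustrative values)

def a(f, M=M, T=T):
    """Complex sinusoid vector a(f) = (1, e^{j2*pi*f*T}, ..., e^{j2*pi*f*(M-1)T})^T, eq. [8.9]."""
    return np.exp(1j * 2 * np.pi * f * T * np.arange(M))

# (M, P) matrix of the complex sinusoid vectors, eq. [8.8]
A = np.column_stack([a(f) for f in freqs])

def s(k):
    """Amplitude vector s(k) = (alpha_1 e^{j2*pi*f_1*kT}, ..., alpha_P e^{j2*pi*f_P*kT})^T, eq. [8.8]."""
    return alphas * np.exp(1j * 2 * np.pi * freqs * k * T)

def x(k, sigma=0.1):
    """Observation vector x(k) = A s(k) + b(k), eq. [8.7], with white complex noise b(k)."""
    b = sigma * (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
    return A @ s(k) + b
```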
According to the model [8.7], we see that in the absence of noise the observation vector x(k), which belongs to the space $\mathbb{C}^M$ of complex vectors of dimension M, lies in the subspace of dimension P, esp{A}, spanned by the complex sinusoid vectors a(f_i), assumed to be linearly independent. In the presence of noise this is no longer the case. However, the information of interest concerning the frequencies of the sine waves remains confined to this subspace, called the signal subspace (the complex sinusoid space). The subspace complementary to esp{A} in $\mathbb{C}^M$ is called the noise subspace.
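Continuing the sketch above, the lines below illustrate this decomposition: noise-free snapshots stay in esp{A}, so their component in the complementary (noise) subspace is numerically zero, whereas noisy snapshots leave the signal subspace. The projector built from a QR factorization of A is only an illustrative construction, not a method prescribed by the text.

```python
# Noise-free snapshots span esp{A} only: their rank is P, not M.
X_clean = np.column_stack([A @ s(k) for k in range(20)])
print(np.linalg.matrix_rank(X_clean))       # 2 (= P)

# Orthogonal projector onto the noise subspace (complement of esp{A} in C^M).
Q, _ = np.linalg.qr(A)
P_noise = np.eye(M) - Q @ Q.conj().T
print(np.linalg.norm(P_noise @ X_clean))    # ~0: no component outside esp{A}
print(np.linalg.norm(P_noise @ x(0)))       # > 0: a noisy snapshot leaves esp{A}
```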
The subspace methods rest on the following two fundamental hypotheses: P < M, and the matrix A of dimension (M, P) has full rank. The latter means that the vectors a(f_i), for i = 1, …, P, are linearly independent. Since A is a Vandermonde matrix (see equations [8.8] and [8.9]), it is easy to see that this property holds as soon as the f_i are all distinct. This is not the case in a large number of antenna processing problems, where the source vectors a_i can take a form very different from that given in [8.9], depending on the antenna geometry, the sensor gains, the wave fronts, etc. (see [MAR 98], Chapter 23); a numerical check of the Vandermonde case is sketched below.
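Using the same sketch as above, the full-rank property of the Vandermonde matrix can be verified numerically; the two frequency pairs below are arbitrary examples.

```python
# Distinct frequencies give a full-rank A; repeated frequencies do not.
A_distinct = np.column_stack([a(f) for f in (0.10, 0.23)])
A_repeated = np.column_stack([a(f) for f in (0.10, 0.10)])
print(np.linalg.matrix_rank(A_distinct))   # 2: full rank
print(np.linalg.matrix_rank(A_repeated))   # 1: the two columns coincide
```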