where E_s represents its eigenvector matrix, and the normalized eigenvectors corresponding to a particular eigenvalue are

$$
e_s = \alpha A^T e \tag{14.7}
$$

with α a normalizing constant.
Concerning square (m = n) BSS problems, it can be seen from equation (14.5) that the eigenvector matrix E forms an estimate of the inverse of the mixing matrix A if the matrix E_s corresponds to the identity matrix or a simple permutation matrix. This occurs if both matrices of the source signal pencil are diagonal.
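This square-case behavior is easy to reproduce numerically. The sketch below is only an illustration, not an implementation from the text: it builds the sensor pencil from the zero-lag covariance and one lagged covariance of the mixed signals (an AMUSE-style choice of congruent pencils), and the source signals, the lag tau, and all variable names are assumptions made for the example.

```python
# Minimal sketch of GEVD-based BSS for the square case (m = n).
# Not the book's implementation: the pencil is built from the zero-lag
# covariance and one lagged covariance of the sensor signals (an
# AMUSE-style choice); sources, lag and names are illustrative only.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, T, tau = 3, 20_000, 5
t = np.arange(T)

# Independent sources with different temporal structure, so that the
# source pencil is diagonal with distinct generalized eigenvalues.
s = np.vstack([
    np.sin(2 * np.pi * 0.03 * t),
    np.sign(np.sin(2 * np.pi * 0.003 * t)),
    np.convolve(rng.standard_normal(T), np.ones(20) / 20, mode="same"),
])

A = rng.standard_normal((n, n))          # unknown square mixing matrix
x = A @ s                                # observed sensor signals

def lagged_cov(z, lag):
    """Symmetrized sample covariance of z at the given lag."""
    z = z - z.mean(axis=1, keepdims=True)
    c = (z[:, :-lag] @ z[:, lag:].T / (z.shape[1] - lag)) if lag else z @ z.T / z.shape[1]
    return 0.5 * (c + c.T)

R_x1 = lagged_cov(x, tau)                # lagged sensor covariance
R_x2 = lagged_cov(x, 0)                  # zero-lag covariance (positive definite)

# Generalized eigendecomposition of the sensor pencil: R_x1 E = R_x2 E Lambda.
eigvals, E = eigh(R_x1, R_x2)

# E^T acts as the estimated unmixing matrix: E^T A should be close to a
# scaled permutation matrix when the source pencil is diagonal.
y = E.T @ x
print(np.round(E.T @ A, 2))
```

Up to the inherent scaling and permutation ambiguity of BSS, the printed matrix E^T A has a single dominant entry per row and column, so E^T serves as the estimate of the inverse of A, as stated above.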
With nonsquare mixing matrices, equation (14.6) can be rewritten in block matrix notation if A and E are both divided into two blocks: A into A_H of size (n × n) and A_L of size ((m − n) × n), and E into E_H of size (n × m) and E_L of size ((m − n) × m). Then the eigendecomposition statement can be reformulated as
$$
A_H R_{s1} \Phi = A_H R_{s2} E_s \Lambda, \qquad
A_L R_{s1} \Phi = A_L R_{s2} E_s \Lambda \tag{14.8}
$$

$$
E_s = A_H^T E_H + A_L^T E_L = A^T E \tag{14.9}
$$
and E_s is now an (n × m) matrix representing the eigenvector matrix of the source signal pencil, having (m − n) columns of zeros paired with the corresponding eigenvalues in Λ that do not belong to the eigenvalue decomposition of the source signal pencil (R_s1, R_s2).
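As a quick consistency check, the block sizes introduced above make both products in equation (14.9) conformable, and each term is of size (n × m):

$$
\underbrace{A_H^T}_{n \times n}\underbrace{E_H}_{n \times m} \;+\; \underbrace{A_L^T}_{n \times (m-n)}\underbrace{E_L}_{(m-n) \times m} \;=\; \underbrace{A^T E}_{n \times m}.
$$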
Since after the separation (m − n) signals have vanishing amplitudes, this approach also allows one to estimate the number of source signals. If the latter is known, then a subset of n sensor signals can be used to compute the corresponding matrix pencil, and identical results will be obtained. In summary, the GEVD approach to BSS problems is feasible if the congruent source signal pencils are formed with statistically independent source signals, yielding only the identity matrix or a permutation matrix for E_s.
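The nonsquare case and the source-count estimate can be sketched in the same style. In the example below (again an illustration rather than the text's code), m = 5 sensors observe n = 3 sources; the small amount of sensor noise is an added assumption, used only to keep the zero-lag covariance invertible, and the decision threshold is likewise arbitrary. With unit-norm eigenvectors, E_s = A^T E shows (m − n) near-zero columns and the corresponding output channels stay at the noise floor, which is what allows the number of sources to be counted.

```python
# Minimal sketch of the nonsquare case m > n; not the book's code.
# Sensor count, source count, the small noise level and the decision
# threshold are assumptions chosen for illustration.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
m, n, T, tau, sigma = 5, 3, 20_000, 5, 1e-3
t = np.arange(T)

s = np.vstack([
    np.sin(2 * np.pi * 0.03 * t),
    np.sign(np.sin(2 * np.pi * 0.003 * t)),
    np.convolve(rng.standard_normal(T), np.ones(20) / 20, mode="same"),
])
A = rng.standard_normal((m, n))                      # tall (m x n) mixing matrix
x = A @ s + sigma * rng.standard_normal((m, T))      # slight noise keeps R_x2 invertible

def lagged_cov(z, lag):
    """Symmetrized sample covariance of z at the given lag."""
    z = z - z.mean(axis=1, keepdims=True)
    c = (z[:, :-lag] @ z[:, lag:].T / (z.shape[1] - lag)) if lag else z @ z.T / z.shape[1]
    return 0.5 * (c + c.T)

# GEVD of the (m x m) sensor pencil; columns rescaled to unit norm in the
# spirit of the normalization in (14.7).
_, E = eigh(lagged_cov(x, tau), lagged_cov(x, 0))
E /= np.linalg.norm(E, axis=0)

# E_s = A^T E has (m - n) near-zero columns ...
print(np.round(np.linalg.norm(A.T @ E, axis=0), 3))

# ... and the corresponding output channels have vanishing amplitude,
# which yields an estimate of the number of sources.
y = E.T @ x
rms = np.sqrt((y ** 2).mean(axis=1))
print("estimated number of sources:", int((rms > 10 * sigma).sum()))
```

Counting the output channels whose RMS amplitude clearly exceeds the noise floor recovers n = 3; once the source count is known, the same pencil computation on any n of the sensor signals leads, as noted above, to the same separation result.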