where $p_{X(t_1),X(t_2)}(x_1,x_2)$ is the second-order pdf of the process. We can also define the autocovariance function
$$
C_X(t_1,t_2) = E\left\{[X(t_1) - \kappa_1(X,t_1)][X(t_2) - \kappa_1(X,t_2)]\right\} = R_X(t_1,t_2) - \kappa_1(X,t_1)\cdot\kappa_1(X,t_2) \qquad (2.83)
$$
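The identity in (2.83) also holds exactly for sample averages, which gives a quick numerical check. The sketch below is a minimal Python illustration; the synthetic two-sample process and all names are assumptions for the example, not from the text:

```python
import random

random.seed(0)
N = 100_000  # number of realizations

# Synthetic process sampled at two instants: X(t2) depends on X(t1), so the
# two samples are correlated (an arbitrary illustrative construction).
x1 = [random.gauss(1.0, 1.0) for _ in range(N)]
x2 = [0.5 * a + random.gauss(2.0, 1.0) for a in x1]

mean = lambda v: sum(v) / len(v)
m1, m2 = mean(x1), mean(x2)  # first-order cumulants k1(X, t1), k1(X, t2)

# Sample autocorrelation R_X(t1, t2) and sample autocovariance C_X(t1, t2)
R = mean([a * b for a, b in zip(x1, x2)])
C = mean([(a - m1) * (b - m2) for a, b in zip(x1, x2)])

# Eq. (2.83): C_X(t1, t2) = R_X(t1, t2) - k1(X, t1) * k1(X, t2)
assert abs(C - (R - m1 * m2)) < 1e-9
```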
In order to evaluate the second-order moments at different time instants, let us create a vector $\mathbf{x} = [X(t_1)\; X(t_2)\; \cdots\; X(t_n)]^T$. Then, if we compute $E\{\mathbf{x}\mathbf{x}^H\}$ for a zero-mean process, we obtain the autocorrelation matrix
$$
\mathbf{R}_{xx} = E\{\mathbf{x}\mathbf{x}^H\} = E\left\{\begin{bmatrix}
X(t_1)X^*(t_1) & \cdots & X(t_1)X^*(t_n) \\
\vdots & \ddots & \vdots \\
X(t_n)X^*(t_1) & \cdots & X(t_n)X^*(t_n)
\end{bmatrix}\right\}
= \begin{bmatrix}
R_X(t_1,t_1) & \cdots & R_X(t_1,t_n) \\
\vdots & \ddots & \vdots \\
R_X(t_n,t_1) & \cdots & R_X(t_n,t_n)
\end{bmatrix} \qquad (2.84)
$$
In the above definition, the superscript $(\cdot)^H$ stands for Hermitian transposition. The autocovariance matrix is obtained if the autocorrelation function is replaced by the autocovariance function in (2.84).
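A sample-average estimate of $\mathbf{R}_{xx}$ makes the definition in (2.84) concrete. The sketch below is a minimal Python illustration using NumPy; the mixing matrix `A`, the sample sizes, and all variable names are assumptions for the example, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 4, 50_000  # n time instants, number of realizations

# Zero-mean complex process sampled at n instants; samples are correlated
# through a fixed mixing matrix A (an arbitrary illustrative construction).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
w = rng.standard_normal((n, trials)) + 1j * rng.standard_normal((n, trials))
x = A @ w  # each column is one realization of x = [X(t1) ... X(tn)]^T

# Sample estimate of R_xx = E{x x^H}, as in (2.84)
R_xx = (x @ x.conj().T) / trials

# Entry (i, j) estimates R_X(t_i, t_j) = E{X(t_i) X*(t_j)};
# by construction the matrix is Hermitian: R_xx = R_xx^H.
assert R_xx.shape == (n, n)
assert np.allclose(R_xx, R_xx.conj().T)
```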
Another important measure is the cross-correlation function, which expresses the correlation between different processes. Given two different stochastic processes $X(t)$ and $Y(t)$, the two cross-correlation functions can be defined as [135]
$$
R_{XY}(t_1,t_2) = E\{X(t_1)Y(t_2)\} \qquad (2.85)
$$
and
$$
R_{YX}(t_1,t_2) = E\{Y(t_1)X(t_2)\} \qquad (2.86)
$$
We can also define a cross-correlation matrix, given by
$$
\mathbf{R}_{XY}(t_1,t_2) = \begin{bmatrix}
R_X(t_1,t_2) & R_{XY}(t_1,t_2) \\
R_{YX}(t_1,t_2) & R_Y(t_1,t_2)
\end{bmatrix} \qquad (2.87)
$$
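The matrix in (2.87) can likewise be estimated from realizations. The sketch below is a minimal Python illustration for real-valued processes; the correlated construction of $X$ and $Y$ and all names are assumptions for the example, not from the text:

```python
import random

random.seed(1)
N = 50_000

# Samples of two correlated real processes X and Y at instants t1 and t2
# (an arbitrary illustrative construction).
x1 = [random.gauss(0.0, 1.0) for _ in range(N)]      # X(t1)
x2 = [0.7 * a + random.gauss(0.0, 0.5) for a in x1]  # X(t2)
y1 = [0.5 * a + random.gauss(0.0, 0.5) for a in x1]  # Y(t1)
y2 = [0.5 * b + random.gauss(0.0, 0.5) for b in x2]  # Y(t2)

# Sample estimate of E{U(t1) V(t2)} for the real-valued case
corr = lambda u, v: sum(a * b for a, b in zip(u, v)) / N

# Cross-correlation matrix of (2.87) evaluated at (t1, t2):
R = [[corr(x1, x2), corr(x1, y2)],   # [ R_X(t1,t2)   R_XY(t1,t2) ]
     [corr(y1, x2), corr(y1, y2)]]   # [ R_YX(t1,t2)  R_Y(t1,t2)  ]

# X and Y are positively correlated by construction, so both off-diagonal
# cross-correlation estimates should be positive.
assert R[0][1] > 0 and R[1][0] > 0
```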
So far, the definitions have depended strongly on multiple time indices. However, some random signals exhibit regularities that can be extremely useful, as we shall now see.