Digital Signal Processing Reference
Therefore, summing the above statements, it follows that the covariance equals zero if the random variables are either independent or dependent but uncorrelated. Additionally, from (3.112) it follows that if the random variables are orthogonal, then the covariance is equal to the negative product of their mean values:
$$C_{X_1 X_2} = -E\{X_1\}E\{X_2\}. \qquad (3.124)$$
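As a quick numerical illustration (a sketch of my own, not from the text), consider a joint distribution in which the product $X_1 X_2$ is always zero, so the variables are orthogonal ($E\{X_1 X_2\} = 0$) while both means are nonzero; the covariance then comes out as the negative product of the means, as (3.124) states:

```python
from fractions import Fraction

# Hypothetical example (not from the text): outcomes (x1, x2), each with
# probability 1/2, chosen so that x1 * x2 = 0 always -> orthogonal variables.
outcomes = [(0, 1), (2, 0)]
p = Fraction(1, 2)

E_X1 = sum(p * x1 for x1, _ in outcomes)          # E{X1} = 1
E_X2 = sum(p * x2 for _, x2 in outcomes)          # E{X2} = 1/2
E_X1X2 = sum(p * x1 * x2 for x1, x2 in outcomes)  # E{X1 X2} = 0 (orthogonal)

# Covariance C = E{X1 X2} - E{X1}E{X2} reduces to -E{X1}E{X2} here.
C = E_X1X2 - E_X1 * E_X2
print(C, -E_X1 * E_X2)  # both -1/2
```

Exact rational arithmetic via `fractions` avoids any floating-point caveats in the check.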
3.3.3.1 Variance of the Sum of Random Variables
Consider the sum of two random variables $X_1$ and $X_2$, which is itself a random variable $X$:

$$X = X_1 + X_2. \qquad (3.125)$$
Using (3.88) and (3.125), we get:
$$\sigma_X^2 = \overline{(X - \overline{X})^2} = \overline{\left(X_1 + X_2 - \overline{X}_1 - \overline{X}_2\right)^2} = \overline{(X_1 - \overline{X}_1)^2} + \overline{(X_2 - \overline{X}_2)^2} + 2\,\overline{(X_1 - \overline{X}_1)(X_2 - \overline{X}_2)}. \qquad (3.126)$$
The first two terms in (3.126) are the corresponding variances of the variables $X_1$ and $X_2$, respectively, while the third averaged product is the covariance. Therefore, (3.126) reduces to:
$$\sigma_X^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2 + 2C_{X_1 X_2}. \qquad (3.127)$$
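The identity (3.127) holds exactly for sample moments as well, for any data set, correlated or not. A minimal NumPy check (variable names and the simulated data are mine, assuming population-style `ddof=0` moments throughout):

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=1000)
x2 = rng.normal(size=1000) + 0.3 * x1  # deliberately correlated with x1

# Sample version of (3.127): var(x1 + x2) = var(x1) + var(x2) + 2*cov(x1, x2),
# exact (up to floating-point rounding) when all moments use ddof=0.
var_sum = np.var(x1 + x2)
rhs = np.var(x1) + np.var(x2) + 2 * np.cov(x1, x2, ddof=0)[0, 1]
print(np.isclose(var_sum, rhs))  # True
```

Mixing `ddof` conventions (NumPy's `np.cov` defaults to `ddof=1`, `np.var` to `ddof=0`) is a common way to break this check, hence the explicit `ddof=0`.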
Equation (3.127) states that the variance of the sum of the variables $X_1$ and $X_2$ is equal to the sum of the corresponding variances if their covariance is equal to zero (i.e., the variables are either independent or uncorrelated).
Therefore, if the random variables $X_1$ and $X_2$ are either independent or uncorrelated ($C_{X_1 X_2} = 0$), then

$$\sigma_{X_1 + X_2}^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2. \qquad (3.128)$$
The result (3.128) can be generalized to the sum of $N$ either independent or uncorrelated variables $X_1, \ldots, X_N$:

$$\sigma_{\sum_{i=1}^{N} X_i}^2 = \sum_{i=1}^{N} \sigma_{X_i}^2. \qquad (3.129)$$
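The $N$-variable result can be illustrated numerically (a sketch under my own assumptions: columns of simulated independent samples, `ddof=0` moments). The variance of the sum always equals the sum of all entries of the covariance matrix; for uncorrelated variables the off-diagonal entries vanish, leaving just the sum of the individual variances as in (3.129):

```python
import numpy as np

# Hypothetical data (not from the text): N = 4 independently drawn columns.
rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 4))

# Exact identity for any data set: the ddof=0 variance of the row sums
# equals the sum of ALL entries of the ddof=0 covariance matrix.
total = np.var(X.sum(axis=1))
full = np.cov(X.T, ddof=0).sum()

# For uncorrelated variables the off-diagonal entries vanish, leaving
# just the sum of the individual variances, as in (3.129).
diag_only = np.var(X, axis=0).sum()

print(np.isclose(total, full))  # True
print(total, diag_only)         # nearly equal, since the columns are independent
```

With finite samples the off-diagonal sample covariances are small but nonzero, so `total` and `diag_only` agree only approximately, while `total` and `full` agree exactly.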