From here, we can easily find the variance as:
\[
\sigma^2 = m_2 - m_1^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}. \tag{2.429}
\]
This is the same result as that obtained in (2.379).
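As a quick numerical check (an illustration, not part of the text), the following Python sketch estimates the first two moments of an exponential random variable by Monte Carlo and forms the variance as in (2.429); the rate lam = 2.0 and the sample size are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
lam = 2.0                          # illustrative rate parameter (not from the text)
x = rng.exponential(scale=1.0 / lam, size=1_000_000)

m1 = x.mean()                      # estimate of the first moment, approx. 1/lambda
m2 = np.mean(x ** 2)               # estimate of the second moment, approx. 2/lambda^2
var_from_moments = m2 - m1 ** 2    # sigma^2 = m2 - m1^2, cf. (2.429)

print(var_from_moments, 1.0 / lam ** 2)   # both values should be close to 0.25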
2.8.5 Chebyshev Inequality
Sometimes we need to bound the probability that a random variable deviates from its mean value by more than a given amount. A mathematical description of this statement is provided by the Chebyshev inequality.
The Chebyshev inequality states this bound in two different forms for a r.v. X, in terms of the expected value m_X and the variance \sigma_X^2.
The Chebyshev inequality is very crude, but it is useful in situations in which we know nothing about a given random variable other than its mean value and variance. However, if the density function is known, then precise bounds can be found simply by calculating the probability of the desired deviation from the mean value.
2.8.5.1 First Form of Inequality
The probability that the absolute deviation of the random variable X from its expected value m_X is at least \varepsilon is bounded by the variance \sigma_X^2 divided by \varepsilon^2:
\[
P\{|X - m_X| \ge \varepsilon\} \le \frac{\sigma_X^2}{\varepsilon^2}. \tag{2.430}
\]
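To see in numbers how crude the bound (2.430) can be, the following Python sketch (an illustration, not part of the text) compares the Chebyshev bound \sigma_X^2/\varepsilon^2 with the exact tail probability P{|X - m_X| >= eps} for an exponential random variable with rate lam = 1.0; the values of eps are arbitrary choices.

import math

lam = 1.0             # illustrative rate of an exponential r.v. (not from the text)
m_x = 1.0 / lam       # mean
var_x = 1.0 / lam**2  # variance, cf. (2.429)

for eps in (1.5, 2.0, 3.0):
    # Exact tail probability P{|X - m_X| >= eps} for the exponential law.
    # For eps >= m_X the left tail P{X <= m_X - eps} is empty.
    left = (1.0 - math.exp(-lam * (m_x - eps))) if eps < m_x else 0.0
    right = math.exp(-lam * (m_x + eps))
    exact = left + right
    bound = var_x / eps**2           # Chebyshev bound (2.430)
    print(f"eps = {eps}: exact = {exact:.4f}, Chebyshev bound = {bound:.4f}")

The exact tail probabilities come out far below the bound, which illustrates the remark above that the inequality is useful mainly when only the mean and variance are known.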
In the continuation, we prove formula (2.430).
Using the definition of the variance, we can write:
\[
\sigma_X^2 = \int_{-\infty}^{\infty} (x - m_X)^2 f_X(x)\, dx
= \int_{-\infty}^{m_X - \varepsilon} (x - m_X)^2 f_X(x)\, dx
+ \int_{m_X - \varepsilon}^{m_X + \varepsilon} (x - m_X)^2 f_X(x)\, dx
+ \int_{m_X + \varepsilon}^{\infty} (x - m_X)^2 f_X(x)\, dx. \tag{2.431}
\]
Omitting the middle integral (see Fig. 2.53a), we can write:
\[
\sigma_X^2 \ge \int_{-\infty}^{m_X - \varepsilon} (x - m_X)^2 f_X(x)\, dx
+ \int_{m_X + \varepsilon}^{\infty} (x - m_X)^2 f_X(x)\, dx. \tag{2.432}
\]
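From here the standard argument finishes the proof (sketched here for completeness): on both remaining integration regions |x - m_X| \ge \varepsilon, so (x - m_X)^2 \ge \varepsilon^2, and therefore
\[
\sigma_X^2 \ge \varepsilon^2 \int_{-\infty}^{m_X - \varepsilon} f_X(x)\, dx
+ \varepsilon^2 \int_{m_X + \varepsilon}^{\infty} f_X(x)\, dx
= \varepsilon^2\, P\{|X - m_X| \ge \varepsilon\},
\]
which, after dividing both sides by \varepsilon^2, gives (2.430).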