Digital Signal Processing Reference
Show that

$$\eta = \frac{2y_i}{\sigma^2} = \frac{2d_i}{\sigma^2} + \frac{2b_i}{\sigma^2}.$$
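As a quick numerical sketch of part (a), the following assumes the Problem 3.2 model $y_i = d_i + b_i$ with $b_i \sim \mathcal{N}(0, \sigma^2)$ and checks that the log likelihood ratio of the two Gaussian densities reduces to $2y/\sigma^2$, which splits into the signal and noise terms (the function names and test values are illustrative, not from the text):

```python
import math

def gaussian_logpdf(y, mean, var):
    """Log of the Gaussian density N(mean, var) evaluated at y."""
    return -0.5 * math.log(2 * math.pi * var) - (y - mean) ** 2 / (2 * var)

def llr(y, var):
    """Log likelihood ratio log Pr(y|d=+1)/Pr(y|d=-1) for y = d + b."""
    return gaussian_logpdf(y, +1.0, var) - gaussian_logpdf(y, -1.0, var)

# The closed form predicts eta = 2y/sigma^2; with y = d + b this splits
# into a signal term 2d/sigma^2 plus a noise term 2b/sigma^2.
sigma2 = 0.7          # arbitrary noise variance for the check
d, b = 1.0, 0.3       # arbitrary symbol and noise realization
y = d + b
eta = llr(y, sigma2)
assert abs(eta - 2 * y / sigma2) < 1e-9
assert abs(eta - (2 * d / sigma2 + 2 * b / sigma2)) < 1e-9
```

The normalization constants of the two densities cancel in the ratio, which is why only the quadratic terms survive and the difference of squares collapses to $4y/(2\sigma^2)$.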
(b) Introduce the conditional means $m_+$ and $m_-$ as

$$m_+ = E(\eta \mid d_i = +1), \qquad m_- = E(\eta \mid d_i = -1),$$

where the expectation is with respect to the probability density function of $b_i$. Show that

$$m_+ = -m_- = \frac{2}{\sigma^2}.$$
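The claimed conditional means can be checked by Monte Carlo, again assuming $y = d + b$ with Gaussian noise of variance $\sigma^2$ (the sample size and seed are arbitrary choices for the sketch):

```python
import random

random.seed(0)
sigma2 = 1.0
n = 200_000

def eta_samples(d, sigma2, n):
    """Draw n samples of eta = 2(d + b)/sigma^2 with b ~ N(0, sigma^2)."""
    return [2 * (d + random.gauss(0.0, sigma2 ** 0.5)) / sigma2
            for _ in range(n)]

m_plus = sum(eta_samples(+1, sigma2, n)) / n
m_minus = sum(eta_samples(-1, sigma2, n)) / n

# Predicted: m_+ = -m_- = 2/sigma^2 (= 2 here, since sigma^2 = 1)
assert abs(m_plus - 2 / sigma2) < 0.05
assert abs(m_minus + 2 / sigma2) < 0.05
```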
(c) Introduce the conditional variances $\hat{\sigma}^2_+$ and $\hat{\sigma}^2_-$ as

$$\hat{\sigma}^2_+ = E[(\eta - m_+)^2 \mid d_i = +1], \qquad \hat{\sigma}^2_- = E[(\eta - m_-)^2 \mid d_i = -1].$$

Show that

$$\hat{\sigma}^2_+ = \hat{\sigma}^2_- = \frac{4}{\sigma^2},$$

so that the conditional mean is half the conditional variance (give or
take a sign factor). What happens to the conditional means and variances
as $\sigma^2 \to 0$?
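The variance claim, and the mean-equals-half-the-variance relation, can be verified the same way. A minimal sketch under the same assumed model (note that both the mean $2/\sigma^2$ and the variance $4/\sigma^2$ blow up as $\sigma^2 \to 0$, but their ratio stays fixed at $1/2$):

```python
import random

random.seed(1)
sigma2 = 0.5
n = 200_000

# eta = 2(d + b)/sigma^2 with b ~ N(0, sigma^2); condition on d = +1
samples = [2 * (1.0 + random.gauss(0.0, sigma2 ** 0.5)) / sigma2
           for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

# Predicted: mean = 2/sigma^2 = 4 and variance = 4/sigma^2 = 8,
# so the conditional mean is half the conditional variance.
assert abs(mean - 2 / sigma2) < 0.1
assert abs(var - 4 / sigma2) < 0.3
assert abs(var - 2 * mean) < 0.4
```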
3.3 Consider the log likelihood ratio as in Problem 3.2 (dropping the subscript $i$ for convenience)

$$\eta = \log\frac{\Pr(y \mid d = +1)}{\Pr(y \mid d = -1)} = \frac{2y}{\sigma^2}.$$

The variable $\eta$ is then characterized by a conditional variance parameter $\hat{\sigma}^2_\eta = 4/\sigma^2$. Assuming $d$ is binary and equiprobable [$\Pr(d = +1) = \Pr(d = -1) = \frac{1}{2}$], the mutual information between $d$ and $y$ is expressed in terms of the conditional distribution $\Pr(y \mid d)$ as
$$I(d, y) = \frac{1}{2}\int_y \Pr(y \mid +1)\,\log\frac{2\Pr(y \mid +1)}{\Pr(y \mid +1) + \Pr(y \mid -1)}\,dy
+ \frac{1}{2}\int_y \Pr(y \mid -1)\,\log\frac{2\Pr(y \mid -1)}{\Pr(y \mid +1) + \Pr(y \mid -1)}\,dy.$$
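This integral can be evaluated numerically. The sketch below assumes Gaussian conditional densities $\Pr(y \mid d) = \mathcal{N}(d, \sigma^2)$ as in the problem model and uses a base-2 logarithm so that $I(d, y)$ is measured in bits (the integration limits and step count are arbitrary choices; `mutual_information` is an illustrative helper, not from the text):

```python
import math

def pdf(y, d, sigma2):
    """Gaussian density Pr(y | d) for y = d + b, b ~ N(0, sigma^2)."""
    return math.exp(-(y - d) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def mutual_information(sigma2, lo=-12.0, hi=12.0, steps=4000):
    """Trapezoid-rule evaluation of I(d, y) in bits for equiprobable d."""
    h = (hi - lo) / steps
    total = 0.0
    for k in range(steps + 1):
        y = lo + k * h
        w = h if 0 < k < steps else h / 2   # trapezoid endpoint weights
        p_plus, p_minus = pdf(y, +1, sigma2), pdf(y, -1, sigma2)
        mix = p_plus + p_minus
        for p in (p_plus, p_minus):
            if p > 0:                        # skip underflowed tails
                total += w * 0.5 * p * math.log2(2 * p / mix)
    return total

# I(d, y) is at most 1 bit for binary d and grows as sigma^2 -> 0
assert 0.0 < mutual_information(1.0) < mutual_information(0.1) < 1.0
```

With one equiprobable binary input, $I(d, y)$ is bounded by 1 bit, and shrinking the noise variance pushes it toward that bound, consistent with the limiting behavior asked about in Problem 3.2(c).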