Digital Signal Processing Reference
Hence, channel entropy is
$$
\begin{aligned}
H(x) &= -\int_{-\infty}^{\infty} p(x)\,\log_2 p(x)\,dx \\
&= \int_{-\infty}^{\infty} p(x)\,\log_2\sqrt{2\pi\sigma^2}\,dx
 + \int_{-\infty}^{\infty} p(x)\,\frac{x^2}{2\sigma^2}\,\log_2 e\,dx \\
&= \frac{1}{2}\log_2\left(2\pi e\sigma^2\right)\ \text{bits/message}
\end{aligned}
\tag{8.36}
$$
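The closed form in Eq. (8.36) can be checked numerically by integrating $-p(x)\log_2 p(x)$ directly for a Gaussian density. This is a minimal sketch; the value $\sigma = 1.5$ is an arbitrary illustration, not taken from the text.

```python
import math

# Numerically check Eq. (8.36): the differential entropy of a Gaussian
# source, H(x) = -∫ p(x) log2 p(x) dx, should equal (1/2) log2(2*pi*e*sigma^2).
# sigma = 1.5 is an assumed example value.

def gaussian_pdf(x, sigma):
    return math.exp(-x * x / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def entropy_numeric(sigma, lo=-50.0, hi=50.0, n=200_000):
    # Midpoint Riemann sum of -p(x) log2 p(x) over a wide interval.
    dx = (hi - lo) / n
    h = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = gaussian_pdf(x, sigma)
        if p > 0.0:
            h -= p * math.log2(p) * dx
    return h

sigma = 1.5
closed_form = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
print(entropy_numeric(sigma), closed_form)  # the two values agree closely
```

The agreement between the Riemann sum and $\tfrac{1}{2}\log_2(2\pi e\sigma^2)$ confirms that the Gaussian maximizes entropy for a given variance only changes the scale, not the form, of the result.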
Now, if the signal is band-limited to $\omega$ Hz, it may be uniquely specified by taking at least $2\omega$ samples per second.² Hence the rate of information transmission is
$$
R(x) = 2\omega H(x) = 2\omega\cdot\frac{1}{2}\log_2\left(2\pi e\sigma^2\right) = \omega\log_2\left(2\pi e\sigma^2\right)
\tag{8.37}
$$
$$
\therefore\ R(x) = \omega\log_2\left(2\pi e\sigma^2\right)
$$
If $p(x)$ is band-limited Gaussian noise with an average noise power $N$, then
$$
R(n) = R(x) = \omega\log_2\left(2\pi e N\right) \qquad (\because\ \sigma^2 = N)
\tag{8.38}
$$
Now, let us consider the case of continuous data transmission through a noisy channel. If the received signal is composed of a transmitted signal $x$ and a noise $n$, then the joint entropy of the source and noise is given by
$$
R(x, n) = R(x) + R(n/x)
\tag{8.39}
$$
In practice, the transmitted symbol and the noise are independent. Therefore,
$$
R(x, n) = R(x) + R(n)
\tag{8.40}
$$
Considering the received signal $y$ as the sum of the transmitted signal $x$ and the noise $n$,
$$
\begin{aligned}
H(x, y) &= H(x, n) \\
H(y) + H(x/y) &= H(x) + H(n)
\end{aligned}
\tag{8.41}
$$
$$
R(y) + R(x/y) = R(x) + R(n)
$$
The rate at which information is received from a noisy channel is
$$
R = R(y) - R(x/y) = R(y) - R(n)
\tag{8.42}
$$
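Equation (8.42) can be checked numerically. For independent Gaussian signal and noise with powers $S$ and $N$, the received signal $y = x + n$ has power $S + N$, so substituting the Gaussian rate of Eq. (8.38) gives $R = \omega\log_2\bigl(1 + S/N\bigr)$, the familiar Shannon form. This is a sketch with assumed example values for $\omega$, $S$, and $N$.

```python
import math

# Sketch of Eq. (8.42): R = R(y) - R(n). With independent Gaussian signal
# (power S) and noise (power N), y = x + n has power S + N, so
# R = omega*log2(2*pi*e*(S+N)) - omega*log2(2*pi*e*N) = omega*log2(1 + S/N).
# omega, S, N below are illustrative assumptions, not values from the text.

def received_rate(omega, S, N):
    r_y = omega * math.log2(2 * math.pi * math.e * (S + N))  # R(y)
    r_n = omega * math.log2(2 * math.pi * math.e * N)        # R(n)
    return r_y - r_n

omega, S, N = 3000.0, 1.0, 0.1
print(received_rate(omega, S, N))
print(omega * math.log2(1 + S / N))  # same value: the 2*pi*e factors cancel
```

The cancellation of the $2\pi e$ factors is the point of the difference in Eq. (8.42): only the ratio of received power to noise power survives.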
² Nyquist rate of sampling.