where M is the number of symbols or modulation order. (3.13) can then be
written in the form:
C = \log_2(M) \;-\; \frac{1}{M} \sum_{i=1}^{M} \underbrace{\int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty}}_{N \text{ times}} p(y \,|\, x_i) \, \log_2\!\left( \frac{\sum_{j=1}^{M} p(y \,|\, x_j)}{p(y \,|\, x_i)} \right) dy \qquad (3.14)
Depending on the additional information available about the transmission, such as the type of noise on the channel, possible fading, the type of input and output (continuous or discrete) and the modulation used, (3.14) can be particularized.
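For instance, for an M-ary modulation over a one-dimensional AWGN channel (N = 1) with equiprobable symbols, the integral in (3.14) can be estimated by Monte Carlo simulation. The following sketch is only an illustration, not part of the original text: the BPSK constellation, the definition of E_s/N_0 and the sample size are assumptions chosen for the example.

import numpy as np

def awgn_capacity_mc(constellation, es_n0_db, n_samples=200_000, seed=0):
    """Monte Carlo estimate of (3.14) for equiprobable discrete inputs
    on a one-dimensional AWGN channel (N = 1)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(constellation, dtype=float)
    M = len(x)
    es = np.mean(x ** 2)                    # mean symbol energy
    n0 = es / (10.0 ** (es_n0_db / 10.0))   # noise spectral density N0
    sigma2 = n0 / 2.0                       # noise variance per dimension

    xi = rng.integers(M, size=n_samples)    # equiprobable transmitted symbols
    y = x[xi] + rng.normal(scale=np.sqrt(sigma2), size=n_samples)

    # p(y | x_j) up to a constant factor; the Gaussian normalization
    # cancels in the ratio inside the logarithm of (3.14).
    pyx = np.exp(-(y[:, None] - x[None, :]) ** 2 / (2.0 * sigma2))
    log_ratio = np.log2(pyx.sum(axis=1) / pyx[np.arange(n_samples), xi])

    # The average over the received samples replaces the integral.
    return np.log2(M) - np.mean(log_ratio)

# Example: BPSK (M = 2) at Es/N0 = 3 dB; the estimate tends to log2(M) = 1 at high SNR.
print(awgn_capacity_mc([-1.0, +1.0], es_n0_db=3.0))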
Shannon limit of a band-limited continuous input and output Gaussian channel
Consider the case of a Gaussian channel with continuous input and output. The Shannon bound [3.3] giving the maximum capacity C of such a channel is reached by taking at its input a white Gaussian signal of zero mean and variance σ², described by independent probabilities on each dimension, that is, such that:
p(x) = \prod_{n=1}^{N} p(x_n)
where x = [x_1 x_2 ... x_N] is the input vector and p(x_n) = N(0, σ²). The maximum of the mutual information is reached for equiprobable inputs and, denoting by N_0/2 the variance of the noise, (3.14) gives after expansion:
C = \frac{N}{2} \, \log_2\!\left(1 + \frac{2\sigma^2}{N_0}\right).
This relation can be rewritten to make the mean energy E_b of each bit, and consequently the signal-to-noise ratio E_b/N_0, appear: with R information bits carried per pair of dimensions, the mean signal energy over two dimensions is 2σ² = R E_b. For N = 2, we then have:
C_b = \log_2\!\left(1 + R \, \frac{E_b}{N_0}\right) \qquad (3.15)
the capacity being expressed in bits per second per hertz and per pair of dimensions. Taking R = 1, this leads to the ratio E_b/N_0 being bounded below by the normalized Shannon limit, as shown in Figure 3.2.
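As a numerical check (not part of the original text), the short sketch below evaluates this bound under the assumption that reliable transmission requires the rate R not to exceed the capacity C_b of (3.15), i.e. R ≤ log_2(1 + R E_b/N_0), which gives E_b/N_0 ≥ (2^R − 1)/R.

import math

def shannon_limit_db(rate):
    """Minimum Eb/N0 (in dB) for reliable transmission at rate R bit/s/Hz,
    obtained by setting R = log2(1 + R * Eb/N0) in (3.15)."""
    eb_n0 = (2.0 ** rate - 1.0) / rate
    return 10.0 * math.log10(eb_n0)

# R = 1 gives 0 dB; the limit decreases towards 10*log10(ln 2) ≈ -1.59 dB as R -> 0.
for r in (2.0, 1.0, 0.5, 0.1, 0.01):
    print(f"R = {r:5.2f}  ->  Eb/N0 >= {shannon_limit_db(r):6.2f} dB")

For R = 1 the computed value is 0 dB, consistent with the normalized Shannon limit mentioned above.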
Capacity of a discrete input Gaussian channel
The discrete input, denoted x = x_i, i = 1, ..., M, is typically the result of a modulation performed before transmission. The inputs x_i belong to a set of M