Digital Signal Processing Reference
Figure 12-1  Communications channel view of a high-speed signaling interface. [Block diagram: digital data {x_k} → transmitter h_t(t) → analog signal → channel h_c(t) → Σ (noise n(t) added) → y(t) → receiver h_r(t) → x(t) → detector (sampled at interval T) → recovered digital data {x'_k}.]
Figure 12-2  Symbol rate illustration for a binary NRZ signal. [A periodic signal of period 1/f is compared with a random NRZ data signal (bit pattern 1 0 0 0 1 1 0 1 0 1 1); each bit occupies one symbol interval T_symbol.]
bandwidth of the signal is equal to the repetition frequency.† Thus, we have two symbols per cycle.
One of the outcomes of Shannon's work states that the maximum number of bits per symbol, B, that can be transmitted without error is given by

B = (1/2) log2(1 + P_s/P_n)        (12-3)
where P_s is the average signal power and P_n is the noise power. The quantity P_s/P_n is also known as the signal-to-noise ratio (SNR). Equation (12-3) assumes that the noise is additive white Gaussian noise, whose power spectral density is constant at all frequencies within the channel bandwidth, which is a reasonable approximation for digital systems [Sklar, 2001].
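As a quick numerical check of equation (12-3), the sketch below computes the bits-per-symbol limit from the signal and noise powers; the function name and the example SNR value are illustrative choices, not from the text.

```python
import math

def bits_per_symbol(ps, pn):
    """Maximum error-free bits per symbol, Eq. (12-3): B = (1/2) * log2(1 + Ps/Pn)."""
    return 0.5 * math.log2(1.0 + ps / pn)

# Example: SNR = Ps/Pn = 100 (i.e., 20 dB)
print(bits_per_symbol(100.0, 1.0))  # ≈ 3.33 bits per symbol
```

Doubling the SNR adds only about half a bit per symbol, reflecting the logarithmic dependence in (12-3).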
Combining the preceding equations gives the Shannon-Hartley theorem, which expresses the maximum data transfer rate in bits per second (b/s) as a function of the interconnect channel bandwidth and the SNR:

D = BW log2(1 + SNR)        (12-4)
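To make equation (12-4) concrete, the following sketch evaluates the Shannon-Hartley limit for a hypothetical channel; the 5 GHz bandwidth and 20 dB SNR are assumed example values, not figures from the text.

```python
import math

def shannon_capacity(bw_hz, snr):
    """Shannon-Hartley limit, Eq. (12-4): D = BW * log2(1 + SNR), in bits/s."""
    return bw_hz * math.log2(1.0 + snr)

# Hypothetical 5 GHz channel with SNR = 100 (20 dB)
d = shannon_capacity(5e9, 100.0)
print(f"{d / 1e9:.1f} Gb/s")  # ≈ 33.3 Gb/s
```

Note that capacity scales linearly with bandwidth but only logarithmically with SNR, which is why the text frames throughput gains in terms of both levers.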
Equation (12-4) shows that we can increase throughput across an interchip
interconnect either by increasing the signal-to-noise ratio or by increasing the
† A real digital signal contains energy at harmonic frequencies above the fundamental, which we can estimate from the rise time (BW = 0.35/t_r) as derived in Section 8.1.3. In this analysis, however, we are considering only the fundamental frequency.
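The rule of thumb in the footnote can be sketched as follows; the 35 ps rise time is an assumed example value used only to illustrate BW = 0.35/t_r.

```python
def rise_time_bandwidth(t_r_seconds):
    """Rule-of-thumb signal bandwidth from the 10-90% rise time: BW ≈ 0.35 / t_r."""
    return 0.35 / t_r_seconds

# Example: a 35 ps rise time implies roughly 10 GHz of signal bandwidth
print(rise_time_bandwidth(35e-12) / 1e9)  # ≈ 10.0 (GHz)
```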