of hash functions introduced by Carter and Wegman [14]. The actual imple-
mentation of privacy amplification, however, will be executed by software
and hardware that selects a particular hash function. The bound on the av-
erage value of the mutual information does not apply to this situation: it
does not directly measure the amount of mutual information available to an
eavesdropper in practical quantum cryptography.
In this section we calculate cryptographically acceptable pointwise
bounds on the mutual information that can be achieved while still main-
taining sufficiently high throughput rates. In contrast to a direct application
of the privacy amplification result of Ref. [12], we must also consider and
bound the probability of choosing an unsuitable hash function and relate this
to cryptographic properties of the protocol and the throughput rate. The re-
lation between average bounds and pointwise bounds of random variables
follows from elementary probability theory, as was also described in Ref. [15].
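The passage from an average bound to a pointwise bound is an instance of Markov's inequality: if a nonnegative quantity has expectation at most 2^−s, it can exceed 2^−s/2 only with probability at most 2^−s/2. A minimal numeric sketch (the parameter s and the toy sampling distribution are illustrative, not from the text):

```python
# Sketch: Markov's inequality turns an average bound into a pointwise one.
# If E[I] <= 2**-s for a nonnegative quantity I (Eve's information), then
# P(I >= t) <= E[I] / t for any threshold t > 0.
import random

random.seed(1)

s = 20
avg_bound = 2.0 ** -s        # assumed average bound E[I] <= 2^-s
t = 2.0 ** (-s / 2)          # split the exponent: threshold 2^(-s/2)
markov_tail = avg_bound / t  # Markov: P(I >= t) <= 2^(-s/2)
print(markov_tail)

# Empirical sanity check on a toy nonnegative variable with mean 2^-s.
samples = [random.expovariate(1.0 / avg_bound) for _ in range(100_000)]
tail = sum(x >= t for x in samples) / len(samples)
assert tail <= markov_tail
```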
7.3.1 Privacy Amplification
In ideal circumstances, the outcome of a k -bit key-exchange protocol is a k -bit
key shared between Alice and Bob that is kept secret from Eve. Perfect secrecy
means that from Eve's perspective the shared key is chosen uniformly from
the space of k -bit keys. In practice, one can only expect that Eve's probability
distribution for the shared key is close to uniform in the sense that its Shannon
entropy is close to its largest possible value k . Moreover, because quantum
key-exchange protocols implemented in practice inevitably leak information
to Eve, Eve's distribution of the key is generally too far from uniform to be
used directly for cryptographic purposes. Privacy amplification is the process of obtaining a
nearly uniformly distributed key in a key space of smaller bit size.
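In the Carter–Wegman approach cited above, privacy amplification is carried out by publicly sampling a hash function from a universal family and keeping only a shorter output. A hedged sketch of one common construction, an affine map modulo a prime truncated to r output bits (the prime, sizes, and variable names here are illustrative choices, not the text's):

```python
# Sketch of privacy amplification by universal hashing: publicly pick a
# Carter-Wegman-style hash h(x) = ((a*x + b) mod P) mod 2^r and keep only
# r < k bits of the k-bit shared raw key. Parameters are illustrative.
import random

P = (1 << 61) - 1  # a Mersenne prime larger than the raw key space

def sample_hash(r, rng):
    """Sample h from the affine family {x -> ((a*x + b) mod P) mod 2^r}."""
    a = rng.randrange(1, P)
    b = rng.randrange(P)
    return lambda x: ((a * x + b) % P) % (1 << r)

rng = random.Random(7)
k, r = 32, 16                    # compress a 32-bit raw key to 16 bits
shared_key = rng.getrandbits(k)  # stand-in for Alice and Bob's raw key
h = sample_hash(r, rng)
final_key = h(shared_key)
assert 0 <= final_key < (1 << r)
```

The choice of hash function can be public; only the raw key itself must stay secret, which is what makes this usable over an authenticated classical channel.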
We review the standard assumptions of the underlying probability model
of Ref. [12]: Ω is the underlying sample space with probability measure P.
The expectation of a real random variable X with respect to P is denoted E X. W is
a random variable with key material known jointly to Alice and Bob, and V is
a random variable with Eve's information about W. W takes values in some
finite key space 𝒲. The distribution of W is the function P_W(w) = P(W = w)
for w ∈ 𝒲. Eve's distribution having observed a value v of V is the conditional
probability P_{W|V=v}(w) = P(W = w | V = v) on 𝒲. In the discussion that follows,
v is fixed, and accordingly we denote Eve's distribution of Alice and Bob's
shared key given V = v by P_Eve. H and R denote the Shannon and Rényi entropies of
random variables defined on 𝒲 relative to P_Eve.
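The quantities in this model are easy to exercise numerically. A toy example (the distribution below is invented purely for illustration) computing Eve's conditional distribution's Shannon entropy H and order-2 Rényi (collision) entropy R:

```python
# Toy illustration of the probability model: a made-up Eve distribution
# P_Eve = P_{W|V=v} on a 4-element key space, with its Shannon entropy H
# and order-2 Renyi (collision) entropy R.
from math import log2

p_eve = {0b00: 0.4, 0b01: 0.3, 0b10: 0.2, 0b11: 0.1}
assert abs(sum(p_eve.values()) - 1.0) < 1e-12

H = -sum(p * log2(p) for p in p_eve.values())  # Shannon entropy
R = -log2(sum(p * p for p in p_eve.values()))  # Renyi entropy, order 2

# R <= H <= log2 |W| always holds; here log2 |W| = 2.
assert R <= H <= log2(len(p_eve)) + 1e-12
print(round(H, 4), round(R, 4))
```

The ordering R ≤ H ≤ log₂ |𝒲| visible in the assertion is the reason Rényi entropy gives the more conservative input to privacy amplification bounds.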
Definition 7.1 Suppose 𝒴 is a key space. If α is a positive real number, a mapping
γ: 𝒲 → 𝒴 is an α-strong uniformizer for Eve's distribution iff

H(γ) = −∑_{y ∈ 𝒴} P_Eve(γ⁻¹(y)) log₂ P_Eve(γ⁻¹(y)) ≥ log₂ |𝒴| − α.
If γ is an α-strong uniformizer, then we obtain a bound on the mutual
information between Eve's data V and the image Y of the hash transformation
γ as

I(Y, V) = H(Y) − H(Y | V) ≤ log₂ |𝒴| − H(γ) ≤ α.    (7.25)
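The uniformizer condition, and the leakage quantity it bounds, can be checked on a small example. In this sketch the distribution and the hash γ are invented for illustration: we push a toy P_Eve through γ, compute H(γ) from the image distribution, and read off log₂ |𝒴| − H(γ), the quantity that must stay below α:

```python
# Check the alpha-strong uniformizer condition numerically: push a toy
# Eve distribution through a hash gamma: W -> Y, compute H(gamma) of the
# image distribution, and evaluate the leakage term log2|Y| - H(gamma).
from math import log2

# Made-up P_Eve on a 3-bit key space W (weights 1..8, normalized).
p_eve = {w: (w + 1) / 36 for w in range(8)}

def gamma(w):
    return w & 0b11  # hash down to a 2-bit key space Y, so |Y| = 4

# Image distribution P_Eve(gamma^{-1}(y)) on Y.
p_y = {}
for w, p in p_eve.items():
    p_y[gamma(w)] = p_y.get(gamma(w), 0.0) + p

H_gamma = -sum(p * log2(p) for p in p_y.values())
leak = log2(len(p_y)) - H_gamma  # gamma is an alpha-strong uniformizer
                                 # exactly when leak <= alpha
assert leak >= 0
print(round(H_gamma, 4), round(leak, 4))
```

Here the hash mixes low- and high-probability keys into each image point, so the image distribution is closer to uniform than P_Eve itself; the small value of `leak` is the α that Definition 7.1 certifies.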