Alice to Bob in step 2 of the KSA through the noiseless public channel, she eventually has a pair of strings $z_{BE}$ and $z_{AB}$ such that $z_{BE} \oplus z_{AB} = x \oplus e_{BE} \oplus e_{AB}$, where $e_{BE}$ and $e_{AB}$ are the error patterns in the channels B-E and A-B, respectively. It is easy to show that this sum is a sufficient statistic for Eve with respect to $x$, and hence after steps 1-2 of the KSA there exists a virtual BSC between Alice and Eve with the error probability $p = p_m (1 - p_w) + p_w (1 - p_m)$. Then we can find the Renyi information in the asymptotic case as follows [6]:
$t_c = k (1 - h(p))$.   (6)

We can see from (3) that $I_0 \to 0$ as $k \to \infty$ if and only if the exponent in (3) is negative, which results in the inequality
$k - t_c - l - r > 0$.   (7)
Substituting $r$ by (4) and $t_c$ by (6) into (7) we get
$k - k(1 - h(p)) - l - k\,h(p_m) > 0$.   (8)
After simple transformations of (8) and division by $k$ we arrive at the inequality
$l/k < h(p) - h(p_m)$,
that coincides with (1) and hence completes the proof of Theorem 1.
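To make the asymptotic statement concrete, the following minimal Python sketch evaluates the virtual-BSC error probability $p = p_m(1 - p_w) + p_w(1 - p_m)$ and the resulting upper bound $h(p) - h(p_m)$ on the key rate $l/k$; the function names and the example values of $p_m$ and $p_w$ are assumptions made only for illustration.

```python
from math import log2

def h(p: float) -> float:
    """Binary entropy function h(p) in bits."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def virtual_bsc_error(p_m: float, p_w: float) -> float:
    """Crossover probability of the virtual BSC between Alice and Eve,
    p = p_m(1 - p_w) + p_w(1 - p_m)."""
    return p_m * (1 - p_w) + p_w * (1 - p_m)

def key_rate_bound(p_m: float, p_w: float) -> float:
    """Asymptotic bound on the key rate: l/k < h(p) - h(p_m)."""
    return h(virtual_bsc_error(p_m, p_w)) - h(p_m)

# Example channel parameters (illustrative assumptions only).
p_m, p_w = 0.01, 0.25
print(f"virtual BSC error p = {virtual_bsc_error(p_m, p_w):.4f}")
print(f"key rate bound l/k < {key_rate_bound(p_m, p_w):.4f}")
```

For crossover probabilities below 1/2, the bound is positive only when $h(p) > h(p_m)$, i.e., when Eve's virtual channel is noisier than the main channel.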
In order to get non-asymptotic results we have to substitute into (3) non-asymptotic bounds for $t_c$ and $r$. As for $r$, we use (5) after a transform that is necessary to express $r$ as a function of $k$, $p$ and $P$. A non-asymptotic probabilistic inequality for the Renyi information has been obtained in [6]:
$\Pr\!\left( t_c \ge k - k\,h(p - \delta) + \log_2 k \right) \le \sum_{i=0}^{\lfloor k(p - \delta) \rfloor} \binom{k}{i}\, p^i (1 - p)^{k - i}$,   (9)

where $\delta > 0$.
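The right-hand side of (9) is a lower binomial tail, so it can be evaluated directly. Below is a minimal sketch of that evaluation; the function names and the example values of $k$, $p$ and $\delta$ are illustrative assumptions, not parameters taken from the text.

```python
from math import comb, floor, log2

def risk_bound(k: int, p: float, delta: float) -> float:
    """Binomial tail on the right-hand side of (9):
    sum_{i=0}^{floor(k(p - delta))} C(k, i) p^i (1 - p)^(k - i)."""
    upper = floor(k * (p - delta))
    return sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range(upper + 1))

def t_c_threshold(k: int, p: float, delta: float) -> float:
    """Threshold k - k h(p - delta) + log2 k appearing inside Pr(.) in (9)."""
    q = p - delta
    h_q = -q * log2(q) - (1 - q) * log2(1 - q)
    return k - k * h_q + log2(k)

# Illustrative parameters (assumptions only).
k, p, delta = 1000, 0.255, 0.03
print(f"Pr(t_c >= {t_c_threshold(k, p, delta):.1f}) <= {risk_bound(k, p, delta):.3e}")
```

Increasing $\delta$ makes the tail, and hence the risk probability, smaller, but it also loosens the threshold on $t_c$.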
We note that this probability can be referred to as the probability of the risk ($P$) that the inequality (3) is violated. Actually, in order to correctly estimate the exponent in (3) we need to use the so-called smooth entropy, while the Renyi entropy is only a lower bound on the smooth entropy [7]. Moreover, there is a whole family of Renyi entropies of order $\alpha$ of any random variable $X$ with alphabet $\mathcal{X}$:

$H_{\alpha}(X) = \dfrac{1}{1 - \alpha}\, \log \sum_{x \in \mathcal{X}} P_X(x)^{\alpha}$.   (10)
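For concreteness, here is a minimal sketch of the family (10); the example distribution is an arbitrary assumption chosen only to illustrate the formula.

```python
from math import log2

def renyi_entropy(probs, alpha: float) -> float:
    """Renyi entropy of order alpha as in (10), in bits:
    H_alpha(X) = 1/(1 - alpha) * log2( sum_x P(x)**alpha )."""
    if alpha == 1:
        # alpha = 1 is excluded in (10); the limit is the Shannon entropy.
        return -sum(p * log2(p) for p in probs if p > 0)
    return log2(sum(p**alpha for p in probs if p > 0)) / (1 - alpha)

# Arbitrary example distribution (assumption for illustration only).
P = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.5, 1.0, 2.0, 3.0):
    print(f"H_{alpha}(X) = {renyi_entropy(P, alpha):.4f} bits")
```

For a fixed distribution, $H_{\alpha}(X)$ is non-increasing in $\alpha$, so orders $\alpha < 2$ never give a smaller value than $H_2(X)$.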
The ordinary Renyi entropy is a particular case of the Renyi entropy of order $\alpha = 2$. It has been shown in [7] that for some very peculiar probability distributions $P_X(x)$ the Renyi entropy of order $\alpha < 2$ can be a much better lower bound on the smooth entropy than $H_2(X)$. But unfortunately, as it was communicated in [8], this
 