There is a similarity between the problem of approaching the ordinary Shannon channel capacity using constructive error-correcting codes and the problem of approaching the key capacity using the best key-sharing protocol.
It is well known that contemporary error-correcting codes (such as concatenated and turbo codes) achieve code rates very close to the channel capacity with acceptable complexity of the encoding/decoding procedures [3]. For the key capacity, however, this is still an open problem. The authors of the current paper presented in [4] a protocol that reaches some key rates, but it gives a good solution only if $p_m \ll p$, whereas in the opposite situation (for instance $p_m = 0.1$ and $p = 0.01$) we obtained a key rate about $10^6$ times smaller than the key capacity!
In order to improve these results we exploit two important facts, which have been proved recently in [5]. The first fact is the so-called Enhanced Privacy Amplification Theorem (EPAT). It states that if $x$ is a truly random binary string of length $k$,
$\tilde{z} = xA$, where $A$ is a truly random $(0,1)$ matrix of size $k \times (l+r)$ with $k > l + r$; $z = \tilde{z}H_1$, where $H_1$ is a $(0,1)$ matrix of size $(l+r) \times l$ containing exactly one 1 in each column and at most one 1 in each row; and $y = \tilde{z}H_2$, where $H_2$ is some $(0,1)$ matrix of size $(l+r) \times r$, then there exists such a matrix $H_2$ that the expected Shannon information $I_0$ about the string $z$, when some binary string $\tilde{x}$ is known and also the string $y$ and the matrices $A$, $H_1$ and $H_2$ are known, satisfies the inequality

$$I_0 \le \frac{2^{-\beta\,(k - t_c - l - r)}}{\ln 2}\,, \qquad (3)$$

where $t_c$ is the Rényi (or collision) information that is contained in $\tilde{x}$ regarding the string $x$, and $\beta$ is a coefficient that approaches 0.42, for any fixed $r$, as $k$ and $l$ increase.
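As a toy illustration of the objects in the theorem, the sketch below builds the random hashing matrix A and the selection matrices H1 and H2, and evaluates the bound (3). All sizes and the value of t_c are assumed purely for illustration; note that the theorem only asserts that a good H2 exists, whereas here a random one is drawn.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy parameters; the theorem requires k > l + r.
k, l, r = 32, 8, 4

# x: truly random binary string of length k.
x = rng.integers(0, 2, size=k)

# A: truly random (0,1) matrix of size k x (l+r); all arithmetic is mod 2.
A = rng.integers(0, 2, size=(k, l + r))
z_tilde = x @ A % 2                      # length l+r

# H1: (l+r) x l matrix with exactly one 1 in each column and at most one 1
# in each row -- here simply a selection of the first l coordinates.
H1 = np.zeros((l + r, l), dtype=int)
H1[np.arange(l), np.arange(l)] = 1
z = z_tilde @ H1 % 2                     # the key string, length l

# H2: some (0,1) matrix of size (l+r) x r (random here, for illustration only).
H2 = rng.integers(0, 2, size=(l + r, r))
y = z_tilde @ H2 % 2                     # the check string, length r

# Bound (3) with an assumed Renyi (collision) information t_c and beta ~ 0.42.
beta, t_c = 0.42, 10
I0_bound = 2 ** (-beta * (k - t_c - l - r)) / math.log(2)
print(z, y, I0_bound)
```

With these assumed parameters the exponent $k - t_c - l - r = 10$ is small, so the bound is loose; realistic protocols use much larger $k$ and $l$, driving the bound toward zero exponentially.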
The second fact relates to the error-correcting capability of linear binary $(k+r,\,k)$ codes when $k$ information bits are transmitted over some BSC with error probability $p_m$ and $r$ check bits are transmitted over a noiseless channel. For this channel model it is easy to prove (using the time-sharing channel concept) that the probability of error $P_e$ after optimum decoding can be made to decrease to zero as $k \to \infty$ if and only if the following asymptotic condition is true:

$$r = k\,h(p_m)\,, \qquad (4)$$

where $h$ is the entropy function given in (1). But we are interested in the non-asymptotic case for the model with noiseless transmission of the check bits, which has just been proved in [5]:
$$P_e \le 2^{-kE(R)}\,, \qquad (5)$$

where

$$E(R) = \min_{\rho}\left[E(\rho) - \rho\,\frac{2R - 1}{R}\right],$$

where $\rho \in (0,$
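To give the asymptotic condition (4) a concrete scale, the following sketch evaluates the binary entropy function h and the resulting number of check bits; the main-channel error probability p_m and block length k are assumed values for illustration.

```python
import math

def h(p):
    """Binary entropy function, as in (1)."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p_m = 0.1   # assumed BSC error probability on the main channel
k = 10000   # assumed number of information bits

# Asymptotic condition (4): r = k * h(p_m) check bits are needed
# for the error probability to vanish as k grows.
r_min = k * h(p_m)
print(h(p_m), math.ceil(r_min))
```

Since h(0.1) is roughly 0.47, almost half as many check bits as information bits are required at this error probability, which illustrates why the non-asymptotic bound (5) matters for practical block lengths.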