Al Jabri showed that if $hc^T = 1$ for some $h \in H_w$, then the non-zero bits of $h$ give some information about the non-zero bits of $e$. Overbeck has improved this algorithm by also using the vectors $h$ where $hc^T = 0$ to gain information about $e$.
We will briefly describe Overbeck's algorithm, and then generalize it to codes over non-binary fields $\mathbb{F}_q$.
3.1 Binary Statistical Decoding
Let $C$ be an $(n,k,t)$ code over $\mathbb{F}_2$, $w < n/2$ be an integer, and $H_w \subseteq C^\perp$ a sufficiently large subset of the dual space of $C$, where $\forall h \in H_w : \mathrm{wt}(h) = w$.
Given a word $c = x + e$, where $x = mG \in C$ and $\mathrm{wt}(e)$ is small, the algorithm attempts to find $e$.
For every $h \in H_w$, we have an \emph{odd error detection} at bit $i$ if $hc^T = 1$ and $h_i = 1$, and an \emph{even error detection} at bit $i$ if $hc^T = 0$ and $h_i = 1$. In each case we can compute the probability that $e$ contains an error at bit $i$. In the case of an odd error detection, the probabilities $p_w^+$ and $q_w^+$ that $e_i = 1$ and $e_i = 0$, respectively, are
$$
p_w^+ = \frac{\displaystyle\sum_{\substack{j \le w \\ j\ \mathrm{odd}}} \binom{t-1}{j-1}\binom{n-t}{w-j}}{\displaystyle\sum_{\substack{j \le w \\ j\ \mathrm{odd}}} \binom{t}{j}\binom{n-t}{w-j}},
\qquad
q_w^+ = \frac{\displaystyle\sum_{\substack{j \le w \\ j\ \mathrm{odd}}} \binom{t}{j}\binom{n-t-1}{w-j-1}}{\displaystyle\sum_{\substack{j \le w \\ j\ \mathrm{odd}}} \binom{t}{j}\binom{n-t}{w-j}}.
$$
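As a quick numerical illustration, the two probabilities can be evaluated directly from the binomial sums above. The following sketch is our own (the function name and the toy parameter values are illustrative, not from the text); it uses the convention $\binom{a}{b} = 0$ for $b < 0$ or $b > a$ so that boundary terms of the sums vanish.

```python
from math import comb

def binom(a, b):
    # Binomial coefficient with the convention C(a, b) = 0 for b < 0 or b > a.
    return comb(a, b) if 0 <= b <= a else 0

def odd_detection_probs(n, t, w):
    """p_w^+ = Pr[e_i = 1] and q_w^+ = Pr[e_i = 0] given an odd error
    detection at bit i, following the two formulas above."""
    odd = range(1, w + 1, 2)  # odd j <= w
    denom = sum(binom(t, j) * binom(n - t, w - j) for j in odd)
    p = sum(binom(t - 1, j - 1) * binom(n - t, w - j) for j in odd) / denom
    q = sum(binom(t, j) * binom(n - t - 1, w - j - 1) for j in odd) / denom
    return p, q

# Tiny illustrative parameters (not from the text): n = 10, t = 2, w = 3.
p, q = odd_detection_probs(10, 2, 3)  # p = 0.5, q = 0.25
```

Note that $p_w^+ > q_w^+$ here, which is what makes odd error detections informative: positions covered by many odd detections are more likely to be error positions.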
Let $v_{y,w}^+ = |\{h \in H_w : hc^T \ne 0\}|$. For every bit $i$, the random variable
$$
\frac{1}{v_{y,w}^+} \sum_{h \in H_w} (hc^T \bmod 2)\, h_i
$$
is the relative frequency estimate for $p_w^+$ or $q_w^+$, depending on whether $i$ is an error position of $e$. The variance of this random variable is $(\sigma_w^+)^2 = p_w^+(1 - p_w^+)/v_{y,w}^+$. Thus, for $H_w$ large enough, Algorithm 1 allows us to recover $m$.
Algorithm 1. Al Jabri's algorithm for binary statistical decoding.

INPUT: Generator matrix $G$ for an $(n,k,t)$ code, $H_w \subseteq C^\perp$ and $c \in \{0,1\}^n$
OUTPUT: $m \in \{0,1\}^k$ such that $\mathrm{wt}(c - mG) \le t$

$v \leftarrow \sum_{h \in H_w} (hc^T \bmod 2)\, h \in \mathbb{Z}^n$
Choose $I = \{$positions of the $k$ smallest entries of $v\}$ s.t. $G_{\cdot I}$ is invertible
$m \leftarrow c_I\, G_{\cdot I}^{-1}$
Return $m$
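To make Algorithm 1 concrete, here is a toy pure-Python sketch (all function names and the running example are our own illustration, not from the text). We use the $[7,4]$ Hamming code, whose dual, the simplex code, has the convenient property that all seven nonzero words have weight $4$, giving a natural $H_w$ with $w = 4$; the inverse $G_{\cdot I}^{-1}$ is applied by solving a linear system over $\mathrm{GF}(2)$.

```python
from itertools import combinations, product

def gf2_solve(A, b):
    """Solve A x = b over GF(2) for a square matrix A; None if singular."""
    k = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(k):
        piv = next((r for r in range(col, k) if M[r][col]), None)
        if piv is None:
            return None  # singular
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col and M[r][col]:
                M[r] = [a ^ b2 for a, b2 in zip(M[r], M[col])]
    return [M[r][k] for r in range(k)]

def statistical_decode(G, Hw, c):
    """Algorithm 1: v counts odd error detections per bit; an information
    set I is chosen among the positions with the smallest counts."""
    k, n = len(G), len(G[0])
    v = [0] * n
    for h in Hw:
        if sum(hi & ci for hi, ci in zip(h, c)) % 2:  # h c^T mod 2
            v = [vi + hi for vi, hi in zip(v, h)]
    order = sorted(range(n), key=lambda i: v[i])  # small v = likely error-free
    for I in combinations(order, k):              # prefers small-v positions
        A = [[G[r][i] for r in range(k)] for i in I]  # (G_{.I})^T
        m = gf2_solve(A, [c[i] for i in I])           # m G_{.I} = c_I
        if m is not None:
            return m
    return None

# [7,4] Hamming code in systematic form (hypothetical running example).
G = [[1,0,0,0,0,1,1],
     [0,1,0,0,1,0,1],
     [0,0,1,0,1,1,0],
     [0,0,0,1,1,1,1]]
# Parity-check matrix; its row space (the simplex code) supplies H_w, w = 4.
H = [[0,1,1,1,1,0,0],
     [1,0,1,1,0,1,0],
     [1,1,0,1,0,0,1]]
Hw = [[(b0 & H[0][j]) ^ (b1 & H[1][j]) ^ (b2 & H[2][j]) for j in range(7)]
      for b0, b1, b2 in product([0,1], repeat=3) if (b0, b1, b2) != (0,0,0)]

m = [1, 0, 1, 1]
x = [sum(m[r] * G[r][j] for r in range(4)) % 2 for j in range(7)]  # x = mG
c = x[:]
c[2] ^= 1                            # one flipped bit (the code corrects t = 1)
m_hat = statistical_decode(G, Hw, c)  # recovers [1, 0, 1, 1]
```

This toy example succeeds deterministically because every nonzero dual word has weight $4$ and a single error raises $v_i$ at the error position above all others; for realistic parameters the bias between $p_w^+$ and $q_w^+$ is tiny, so $H_w$ must be very large for the frequency estimates to separate error from non-error positions.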
Overbeck improved this algorithm in two ways. First, even error detections are used as well, making it possible to extract significantly more information from a given set