and so

g = x^{15841 · 22^{-1} (mod (2^{15} - 1))} = x^{26040} = (x^{217})^{120}.
Similarly, G(x)^4 = (x + 1)(x^3 + x^2 + 1) = x^{15 + 15345} = x^{15360} and so G(x) = x^{3840}.
Finally,

h = g^{-30} x^6 (x + 1)^4 G(x) = x^{-30 · 26040 + 6 + 4 · 15 + 3840} = x^{9114} = (x^{217})^{42}

and so h = g^{42 · 120^{-1} (mod 151)} = g^{23}.
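These exponent manipulations can be checked directly. The following Python sketch uses only the integers appearing in the example (the group order 2^{15} - 1 = 217 · 151, the subgroup of order 151 generated by x^{217}, and the logarithms 15 and 3840 of (x + 1) and G(x)); it does no arithmetic in F_{2^{15}} itself.

N = 2**15 - 1                      # 32767 = 217 * 151
r = 151                            # order of the subgroup generated by x^217

# g = x^(15841 * 22^{-1} mod N) = x^26040 = (x^217)^120
e_g = 15841 * pow(22, -1, N) % N
assert e_g == 26040 == 217 * 120

# h = g^{-30} x^6 (x+1)^4 G(x), with log(x+1) = 15 and log(G(x)) = 3840,
# so h = x^(-30*26040 + 6 + 4*15 + 3840) = x^9114 = (x^217)^42
e_h = (-30 * e_g + 6 + 4 * 15 + 3840) % N
assert e_h == 9114 == 217 * 42

# In the subgroup of order 151: h = g^(42 * 120^{-1} mod 151) = g^23
assert 42 * pow(120, -1, r) % r == 23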
Conjecture 15.5.21 Coppersmith's algorithm solves the DLP in F_q where q = 2^n in L_q(1/3, (32/9)^{1/3} + o(1)) bit operations as n → ∞.
Note that, to compare with Exercise 15.2.12, if q = 2^{1024} then L_q(1/3, (32/9)^{1/3}) ≈ 2^{67}.

This conjecture would hold if the probability that the polynomials C(x) and D(x) are smooth was the same as for independently random polynomials of the same degree. We now give a justification for the constant. Let b = c n^{1/3} log(n)^{2/3}. Note that 2^k ≈ (n/b)^{1/2} = (n/(c^{3/2} log(n)))^{1/3} and l ≈ (nb)^{1/2}. We need around 2^b/b relations, and note that log(2^b/b) ≈ b log(2) = c log(2) n^{1/3} log(n)^{2/3}. We have deg(C(x)) = d_A + l and deg(D(x)) ≈ 2^k d. The number of trials until C(x) is b-smooth is u_1^{u_1} where u_1 = (d_A + l)/b ≈ l/b ≈ (n/b)^{1/2} ≈ (1/c^{1/2})(n/log(n))^{1/3}. Hence, log(u_1^{u_1}) = u_1 log(u_1) ≈ (1/(3 c^{1/2})) n^{1/3} log(n)^{2/3}. Similarly, the number of trials until D(x) is b-smooth is approximately u_2^{u_2} where u_2 = (2^k d)/b ≈ 2^k ≈ (n/b)^{1/2} (taking d ≈ b), and the same argument applies. Since both events must occur, the expected number of trials to get a relation is exp((2/(3 c^{1/2})) n^{1/3} log(n)^{2/3}). Hence, the total expected time to generate enough relations is

exp((c log(2) + 2/(3 c^{1/2})) n^{1/3} log(n)^{2/3}).

This is optimised when c^{3/2} log(2) = 2/3 (one cannot take c smaller: the number of pairs (A(x), B(x)) available, roughly 2^{2d} ≈ 2^{2b}, must be at least the total number of trials, which forces c^{3/2} log(2) ≥ 2/3), which leads to the stated complexity for the first stage of the algorithm. The linear algebra is O((2^b/b)^{2+o(1)} M(log(r))) bit operations, which
is the same complexity, and the final stage of solving the DLP has lower complexity (it is
roughly the same as the cost of finding polynomially many smooth relations, rather than
finding 2^b/b of them). For more details about the complexity of Coppersmith's method we
refer to Section 2.4 of Thomé [546].
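A quick numerical check of the optimisation above (a sketch in Python; it simply evaluates the coefficient of n^{1/3} log(n)^{2/3} at the choice c^{3/2} log(2) = 2/3 and compares it with the constant stated in Conjecture 15.5.21, using log(q) = n log(2)):

import math

log2 = math.log(2)
c = (2 / (3 * log2)) ** (2 / 3)              # the choice c^(3/2) * log(2) = 2/3
coeff = c * log2 + 2 / (3 * math.sqrt(c))    # coefficient of n^(1/3) log(n)^(2/3)

# L_q(1/3, (32/9)^(1/3)) = exp((32/9)^(1/3) (log q)^(1/3) (log log q)^(2/3)) and
# log(q) = n log(2), so the corresponding coefficient is (32/9)^(1/3) * log(2)^(1/3).
target = (32 / 9) ** (1 / 3) * log2 ** (1 / 3)

print(c, coeff, target)                      # c ≈ 0.97, coeff and target agree (≈ 1.35)
assert abs(coeff - target) < 1e-12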
Since one can detect smoothness of polynomials in polynomial time, it is not necessary,
from a complexity theory point of view, to sieve. However, in practice sieving can be
worthwhile and a method to do this was given by Gordon and McCurley [240].
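To illustrate the first point, here is a sketch (in Python; this is not the method of Gordon and McCurley [240], just a standard polynomial-time test) of checking b-smoothness over F_2, based on the fact that every irreducible polynomial of degree d divides x^{2^d} + x.

# Polynomials over F_2 are encoded as Python integers (bit i = coefficient of x^i).

def deg(a):
    return a.bit_length() - 1

def pmod(a, b):
    # remainder of a modulo b over F_2
    while a and deg(a) >= deg(b):
        a ^= b << (deg(a) - deg(b))
    return a

def pdiv(a, b):
    # quotient of a by b over F_2
    q = 0
    while a and deg(a) >= deg(b):
        s = deg(a) - deg(b)
        q ^= 1 << s
        a ^= b << s
    return q

def pmulmod(a, b, f):
    # a * b modulo f over F_2
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return pmod(r, f)

def pgcd(a, b):
    while b:
        a, b = b, pmod(a, b)
    return a

def is_smooth(f, m):
    # True if every irreducible factor of f over F_2 has degree at most m
    x = 0b10
    for i in range(1, m + 1):
        while deg(f) > 0:
            h = pmod(x, f)
            for _ in range(i):
                h = pmulmod(h, h, f)        # h = x^(2^i) mod f
            g = pgcd(f, h ^ pmod(x, f))     # product of factors of f of degree dividing i
            if deg(g) <= 0:
                break
            f = pdiv(f, g)                  # strip one copy of each such factor
    return deg(f) <= 0

# Example: x^5 + x^4 + 1 = (x^2 + x + 1)(x^3 + x + 1) is 3-smooth but not 2-smooth.
print(is_smooth(0b110001, 3), is_smooth(0b110001, 2))   # True False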
Coppersmith's idea is a special case of a more general approach to index calculus
algorithms known as the function field sieve . Note that Coppersmith's algorithm only has
one factor base, whereas the function field sieve works using two factor bases.