is a specific input. The input size of an instance of a computational problem is the number
of bits required to represent the instance. The output size of an instance of a computational
problem is the number of bits necessary to represent the output. A decision problem is a
computational problem where the output is either “yes” or “no”. As an example, we give
one of the most important definitions in the topic.
Definition 2.1.1 Let $G$ be a group written in multiplicative notation. The discrete logarithm problem (DLP) is: given $g, h \in G$, find $a$, if it exists, such that $h = g^a$.
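To make the definition concrete, here is a minimal brute-force sketch in Python for the multiplicative group $(\mathbb{Z}/p\mathbb{Z})^*$; the function name, the toy parameters $p = 23$, $g = 5$, and the exhaustive search are illustrative choices for this example, not part of the definition.

    def dlp_bruteforce(g, h, r, p):
        """Solve h = g^a (mod p) by trying every a in {0, ..., r-1}.

        Returns a if it exists, or None (playing the role of the
        failure symbol); r is the order of g.  The search takes up
        to r steps, i.e. exponential time in the input size log2(r),
        so it is only usable for tiny groups.
        """
        h %= p                     # normalise h
        x = 1                      # invariant: x = g^a (mod p)
        for a in range(r):
            if x == h:
                return a
            x = (x * g) % p
        return None                # no such a exists

    # Toy instance: g = 5 generates the full group (Z/23Z)^*, of order r = 22.
    p, g, r = 23, 5, 22
    h = pow(g, 13, p)              # h = g^13 (mod p)
    print(dlp_bruteforce(g, h, r, p))   # prints 13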
In Definition 2.1.1 the input is a description of the group $G$ together with the group elements $g$ and $h$, and the output is $a$ or the failure symbol $\perp$ (to indicate that no such $a$ exists).
Typically, $G$ is an algebraic group over a finite field and the order of $g$ is assumed to be known. We stress that an instance of the DLP, according to Definition 2.1.1, includes the specification of $G$, $g$ and $h$, so one must understand that they are all allowed to vary (note that in many cryptographic applications one considers the group $G$ and element $g$ as being fixed; we discuss this in Exercise 21.1.2). As explained in Section 2.1.2, a computational problem should be defined with respect to an instance generator; in the absence of any further information it is usual to assume that the instances are chosen uniformly from the space of all possible inputs of a given size. In particular, for the DLP it is usual to denote
the order of $g$ by $r$ and to assume that $h \in \langle g \rangle$ (to indicate that $h = g^a$ where $a$ is chosen uniformly in $\mathbb{Z}/r\mathbb{Z}$). The output is the integer $a$ (e.g., written in binary). The input size depends on the specific group $G$ and the method used to represent it. If $h$ can take all values in $\langle g \rangle$ then one needs at least $\log_2(r)$ bits to specify $h$ from among the $r$ possibilities. Hence, the input size is at least $\log_2(r)$ bits. Similarly, if the output $a$ is uniformly distributed in $\mathbb{Z}/r\mathbb{Z}$ then the output size is at least $\log_2(r)$ bits.
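As a quick arithmetic sketch of these bounds (the function name min_bits and the sample orders $r$ are illustrative, not from the text), one can compute the minimum whole number of bits needed to single out one of $r$ possibilities:

    # Minimum whole number of bits to specify one of r possibilities is
    # ceil(log2(r)); for r >= 2 this equals (r - 1).bit_length(), which
    # avoids floating-point rounding errors for large r.
    def min_bits(r):
        return (r - 1).bit_length()

    for r in [22, 2**160 + 7, 2**255 - 19]:
        print(f"r = {r}: specifying h (or a) needs at least {min_bits(r)} bits")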
An algorithm to solve a computational problem is called deterministic if it does not
make use of any randomness. We will study the asymptotic complexity of deterministic
algorithms by counting the number of bit operations performed by the algorithm expressed
as a function of the input size. Upper bounds on the complexity are presented using “big
O ” notation. When giving complexity estimates using big O notation we implicitly assume
that there is a countably infinite number of possible inputs to the algorithm.
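To illustrate what counting bit operations means, here is a sketch under assumed conventions (integers represented as little-endian bit lists, and a flat charge of five bit operations per full-adder step, neither of which is prescribed by the text): schoolbook addition of two $n$-bit integers takes $O(n)$ bit operations.

    def add_with_cost(x_bits, y_bits):
        """Schoolbook binary addition on little-endian bit lists.

        Returns (sum_bits, ops) where ops counts single bit operations,
        charging 5 per full-adder step (2 XORs, 2 ANDs, 1 OR); the exact
        constant is irrelevant to the O(n) estimate.
        """
        n = max(len(x_bits), len(y_bits))
        x = x_bits + [0] * (n - len(x_bits))
        y = y_bits + [0] * (n - len(y_bits))
        out, carry, ops = [], 0, 0
        for i in range(n):
            s = x[i] ^ y[i]
            out.append(s ^ carry)                  # sum bit
            carry = (x[i] & y[i]) | (carry & s)    # carry-out
            ops += 5
        out.append(carry)
        return out, ops

    # The count grows linearly with the input size n, i.e. it is O(n).
    for n in [8, 16, 32, 64]:
        _, ops = add_with_cost([1] * n, [1] * n)
        print(n, ops)    # ops = 5 * n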
Definition 2.1.2 Let $f, g : \mathbb{N} \to \mathbb{R}_{>0}$. Write $f = O(g)$ if there are $c \in \mathbb{R}_{>0}$ and $N \in \mathbb{N}$ such that

$$f(n) \le c\, g(n)$$

for all $n \ge N$.

Similarly, if $f(n_1, \dots, n_m)$ and $g(n_1, \dots, n_m)$ are functions from $\mathbb{N}^m$ to $\mathbb{R}_{>0}$ then we write $f = O(g)$ if there are $c \in \mathbb{R}_{>0}$ and $N_1, \dots, N_m \in \mathbb{N}$ such that $f(n_1, \dots, n_m) \le c\, g(n_1, \dots, n_m)$ for all $(n_1, \dots, n_m) \in \mathbb{N}^m$ with $n_i \ge N_i$ for all $1 \le i \le m$.
Example 2.1.3 $3n^2 + 2n + 1 = O(n^2)$, $n + \sin(n) = O(n)$, $n^{100} + 2^n = O(2^n)$, $\log_{10}(n) = O(\log(n))$.