3.1 Computational indistinguishability
Indistinguishable things are identical (or should be considered as identical).
The Principle of Identity of Indiscernibles, G.W. Leibniz (1646-1714)
(Leibniz admits that counterexamples to this principle are conceivable, but will not occur in real life because God is much too benevolent.)
A central notion in modern cryptography is that of "effective similarity" (introduced by Goldwasser, Micali and Yao (80; 123)). The underlying thesis is that we do not care whether or not objects are equal; all we care about is whether or not a difference between the objects can be observed by a feasible computation. In case the answer is negative, the two objects are equivalent as far as any practical application is concerned. Indeed, in the sequel we will often interchange such (computationally indistinguishable) objects. Let $X = \{X_n\}_{n\in\mathbb{N}}$ and $Y = \{Y_n\}_{n\in\mathbb{N}}$ be probability ensembles such that each $X_n$ and $Y_n$ is a distribution that ranges over strings of length $n$ (or polynomial in $n$). We say that $X$ and $Y$ are computationally indistinguishable if for every feasible algorithm $A$ the difference

$$d_A(n) \stackrel{\text{def}}{=} \left|\Pr[A(X_n)=1] - \Pr[A(Y_n)=1]\right|$$

is a negligible function in $n$. That is:
Definition 3.1. (computational indistinguishability (80; 123)): We say that $X = \{X_n\}_{n\in\mathbb{N}}$ and $Y = \{Y_n\}_{n\in\mathbb{N}}$ are computationally indistinguishable if for every probabilistic polynomial-time algorithm $D$, every polynomial $p$, and all sufficiently large $n$,

$$\left|\Pr[D(X_n)=1] - \Pr[D(Y_n)=1]\right| < \frac{1}{p(n)}$$

where the probabilities are taken over the relevant distribution (i.e., either $X_n$ or $Y_n$) and over the internal coin tosses of algorithm $D$.
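The distinguishing gap in the definition can be estimated empirically by sampling: run the distinguisher on many samples from each ensemble and compare the observed acceptance rates. The following Python sketch illustrates this with a toy pair of ensembles; all function names here are illustrative, not from the text, and the distinguisher shown is deliberately trivial.

```python
import random

def empirical_advantage(sample_x, sample_y, distinguisher, trials=10_000):
    """Estimate |Pr[D(X_n)=1] - Pr[D(Y_n)=1]| by Monte Carlo sampling.

    sample_x, sample_y: zero-argument samplers for the two ensembles.
    distinguisher: function D mapping a sample to 0 or 1.
    """
    hits_x = sum(distinguisher(sample_x()) for _ in range(trials))
    hits_y = sum(distinguisher(sample_y()) for _ in range(trials))
    return abs(hits_x - hits_y) / trials

# Toy ensembles over 16-bit strings: uniform bits vs. a biased variant.
def uniform16():
    return [random.randint(0, 1) for _ in range(16)]

def biased16():
    bits = [random.randint(0, 1) for _ in range(16)]
    bits[0] = 1  # first bit is always 1, an easily observable difference
    return bits

# A distinguisher D that outputs the first bit of its sample.
first_bit = lambda s: s[0]

random.seed(0)
adv = empirical_advantage(uniform16, biased16, first_bit)
# adv is roughly 0.5 here: D accepts biased16 always and uniform16 about
# half the time, so these two ensembles are clearly distinguishable.
```

For computationally indistinguishable ensembles, the definition demands that no polynomial-time $D$ achieves an advantage exceeding $1/p(n)$ for any polynomial $p$ once $n$ is large enough; the estimate above can only exhibit a large gap, never certify a negligible one.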
That is, we can think of $D$ as somebody who wishes to distinguish two distributions (based on a sample given to it), and think of 1 as $D$'s