trace some of those influences. Generally speaking, those influences took the form of
setting intuitive goals, providing basic techniques, and suggesting potential solutions
that later served as a basis for constructive criticism (leading to robust approaches).
Classic Cryptography. Answering the fundamental question of classic cryptogra-
phy in a gloomy way (i.e., it is impossible to design a code that cannot be broken),
Shannon [200] also suggested a modification to the question: Rather than asking whether
or not it is possible to break a code, one should ask whether or not it is feasible to break
it. A code should be considered good if it cannot be broken by an investment of work
that is in reasonable proportion to the work required of the legitimate parties using the code.
Indeed, this is the approach followed by modern cryptography.
New Directions in Cryptography. The prospect for commercial applications was the
trigger for the beginning of civilian investigations of encryption schemes. The DES block
cipher [168], designed in the early 1970s, adopted the new paradigm: It is clearly
possible, but supposedly infeasible, to break it. Following the challenge of constructing
and analyzing new (private-key) encryption schemes came new questions, such as how
to exchange keys over an insecure channel [159]. New concepts were invented: digital
signatures (cf., Diffie and Hellman [63] and Rabin [186]), public-key cryptosystems
[63], and one-way functions [63]. First implementations of these concepts were sug-
gested by Merkle and Hellman [163], Rivest, Shamir, and Adleman [191], and Rabin
[187].
Cryptography was explicitly related to complexity theory in the late 1970s [39, 73,
147]: It was understood that problems related to breaking a 1-1 cryptographic mapping
could not be NP-complete and, more important, that NP-hardness of the breaking
task was poor evidence for cryptographic security. Techniques such as "n-out-of-2n
verification" [186] and secret sharing [196] were introduced (and indeed were used
extensively in subsequent research).
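The secret-sharing scheme of [196] is based on polynomial interpolation; as a minimal illustration of the underlying idea (that a secret can be split so that no single share reveals anything), the following sketch uses the simpler XOR-based 2-out-of-2 variant rather than the polynomial scheme itself. The function names are illustrative, not from any source.

```python
import secrets


def share(secret: bytes) -> tuple[bytes, bytes]:
    """Split a secret into two shares; each share alone is a uniformly
    random string and so reveals nothing about the secret."""
    s1 = secrets.token_bytes(len(secret))            # random pad
    s2 = bytes(a ^ b for a, b in zip(secret, s1))    # secret XOR pad
    return s1, s2


def reconstruct(s1: bytes, s2: bytes) -> bytes:
    """XOR the two shares back together to recover the secret."""
    return bytes(a ^ b for a, b in zip(s1, s2))


msg = b"attack at dawn"
a, b = share(msg)
assert reconstruct(a, b) == msg
```

Because `s1` is chosen uniformly at random, each share in isolation is statistically independent of the secret; only the pair together determines it. The polynomial scheme of [196] generalizes this to any t-out-of-n threshold.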
At the Dawn of a New Era. Early investigations of cryptographic protocols revealed
the inadequacy of imprecise notions of security, as well as the subtleties involved in
designing cryptographic protocols. In particular, problems such as coin tossing over the
telephone [31], exchange of secrets [30], and oblivious transfer [188] (cf. [70]) were
formulated. Doubts (raised by Lipton) concerning the security of the mental-poker
protocol of [199] led to the current notion of secure encryption, due to Goldwasser and
Micali [123], and to concepts such as computational indistinguishability [123, 210].
Doubts (raised by Fischer) concerning the oblivious-transfer protocol of [188] led to
the concept of zero-knowledge (suggested by Goldwasser, Micali, and Rackoff [124],
with early versions dating to March 1982).
A formal approach to the security of cryptographic protocols was suggested in [65].
That approach actually identified a subclass of insecure protocols (i.e., those that could
be broken via a syntactically restricted type of attack). Furthermore, it turned out that
it was much too difficult to test whether or not a given protocol was secure [69]. Recall
that, in contrast, our current approach is to construct secure protocols (along with their
proofs of security) and that this approach is complete (in the sense that it allows us to
solve any solvable problem).