recognition come to mind immediately. The geometric and statistical nature of this
pattern processing does not permit one to apply directly the well-developed techniques
of traditional pattern matching and their complexity analysis. Concerning steganography
and watermarking, we can only state that their complexity analysis seems to be feasible,
for the reasons just mentioned.
Pattern recognition of one kind or another, mainly statistical in spirit, seems to be
of considerable practical importance in various access protection problems. One
actively studied problem is the vulnerability analysis of networks. Several complexity
questions concern the analysis of attack graphs; they will be mentioned below.
Another problem for which an appropriate pattern analysis seems to be ad-
equate is the detection of malicious code; however, no complexity results, or at
least problem formulations, are known to me. Some considerations used in
static buffer overrun detection could prove to be relevant.
A different kind of access protection is protection by programming means,
such as typing, or by imposing certain policies and checks, such as stack inspection.
Of all the questions listed above, only the complexity of cryptography and its
applications in protocols has been studied for a long time.
3 Useful Notions from Cryptography
We mentioned above that results on the security of encryption, if we put aside
the information-theoretic argument that is applicable to encryptions not widely used
nowadays, are based on intractability hypotheses. What is of greater interest to
security is the system of notions developed in the theory of cryptography. An
excellent coverage of this topic can be found in [15] and in the drafts of Chapters 5-7
available at the web page of Oded Goldreich.
In practice, security questions contain some amount of uncertainty: we are
never sure that our system is secure, and never sure that its known or probable
insecurity will be exploited. So if we go sufficiently far towards building adequate
models of security, we have to take uncertainty issues into consideration. Quanti-
tative uncertainty can be described with probability theory, the most
developed quantitative theory of uncertainty. The hard question of how to find the initial
probabilities in the computer security context is far from being studied.
The notion of general interest for security is that of indistinguishability:
objects are considered computationally equivalent if they cannot be differ-
entiated by any efficient procedure from some class. The precise notion mostly
used in cryptography is that of probabilistic polytime indistinguishability, which
is a coarsening of classical statistical closeness: for example, two ensembles of random
Boolean variables X = {X_n}_{n∈N} and Y = {Y_n}_{n∈N} are statistically
close if their statistical difference, also called variation distance, is negligible.
Probabilistic polytime indistinguishability for X and Y demands that for any
probabilistic polytime algorithm D⁴ and every positive polynomial p(n), the differ-
ence between the probabilities that D determines the values of X_n and of Y_n as 1
is smaller than 1/p(n) for all sufficiently large n.

⁴ There are several notions of probabilistic algorithms; in our informal discussion there is
no need to make such precisions, see [15].
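To make the two notions concrete, here is a sketch of how they are usually written down,
following the standard textbook formulation (cf. [15]); the symbol Δ for the variation
distance and the domain {0,1}^n (assuming X_n, Y_n are vectors of n Boolean variables)
are our notational assumptions, not taken from the text above. Statistical closeness bounds

\[
\Delta(X_n, Y_n) \;=\; \tfrac{1}{2}\sum_{\alpha \in \{0,1\}^{n}}
  \bigl|\,\Pr[X_n=\alpha] - \Pr[Y_n=\alpha]\,\bigr|
  \;<\; \frac{1}{p(n)}
\]

for every positive polynomial p and all sufficiently large n, whereas probabilistic polytime
indistinguishability replaces the sum over all values α by a single efficient observer D:

\[
\bigl|\,\Pr[D(X_n)=1] - \Pr[D(Y_n)=1]\,\bigr| \;<\; \frac{1}{p(n)}
\]

for every probabilistic polytime algorithm D, every positive polynomial p, and all
sufficiently large n. Statistical closeness implies computational indistinguishability,
but not conversely, which is why the latter is called a coarsening of the former.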