side-channel distinguishers with the best-performing nonparametric estimators of mutual information.
Section 2 summarizes the fundamentals of information theory and introduces generalized mutual information. In Section 3, we study some of the most widely used statistical tests in the side-channel literature. Section 4 reviews classical methods for estimating mutual information, in particular nonparametric methods. In Section 5, we evaluate the different estimation techniques in the context of side-channel analysis on various setups. Section 6 concludes the article.
2 Information Theory Framework
Shannon [26] laid down the foundations of information theory in communication systems. The entropy of a signal corresponds to the quantity of information it contains. In the context of cryptanalysis, and more particularly of side-channel attacks, one is interested in how much information a cryptographic device gives away. If the device leaks information when it processes a secret, an attacker can recover this leakage through side-channel analysis and thereby obtain information, e.g. bits of the secret. Mutual information is a measure closely related to entropy. It is a special case of relative entropy, which captures something close to a distance between two distribution functions.
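Written out explicitly (standard definitions, stated here in notation of our choosing), the relative entropy, or Kullback-Leibler divergence, between two pdfs $f$ and $g$ is
\[
D(f \,\|\, g) = \sum_{x} f(x) \log\frac{f(x)}{g(x)},
\]
and the mutual information of $X$ and $Y$ is the relative entropy between the joint distribution and the product of the marginals,
\[
I(X; Y) = D\big(P_{X,Y} \,\|\, P_X P_Y\big) = H(X) - H(X \mid Y).
\]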
2.1 Basics of Probability Theory
Let $X$ be a random variable which takes on a finite set of values $\{x_1, x_2, \ldots, x_n\}$, and let $P(X = x_i)$ denote the probability distribution of $X$. The function $f : x \mapsto P(X = x)$ is often called the probability density function (pdf) of $X$. Similarly, we define the function $F : x \mapsto P(X \leq x)$ as the cumulative distribution function (cdf) of $X$.
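As a concrete illustration (a minimal sketch of our own, not code from the article; the sample data and function names are ours), the pdf and cdf of a discrete variable can be estimated from observed samples by simple counting:

```python
from collections import Counter

def empirical_pdf(samples):
    """Estimate f(x) = P(X = x) by relative frequencies."""
    n = len(samples)
    return {x: c / n for x, c in Counter(samples).items()}

def empirical_cdf(samples):
    """Estimate F(x) = P(X <= x) by accumulating the pdf over sorted values."""
    pdf = empirical_pdf(samples)
    cdf, acc = {}, 0.0
    for x in sorted(pdf):
        acc += pdf[x]
        cdf[x] = acc
    return cdf

# Toy example: eight observations of a 3-valued variable.
samples = [0, 1, 1, 2, 2, 2, 1, 0]
print(empirical_pdf(samples))  # {0: 0.25, 1: 0.375, 2: 0.375}
print(empirical_cdf(samples))  # {0: 0.25, 1: 0.625, 2: 1.0}
```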
The entropy of $X$ is defined as
\[
H(X) = -\sum_{x} f(x) \log(f(x)).
\]
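A direct plug-in evaluation of this formula might look as follows (a sketch, using the usual convention $0 \log 0 = 0$ and base-2 logarithms so that entropy is measured in bits):

```python
import math

def entropy(pdf):
    """Shannon entropy H(X) = -sum_x f(x) log2 f(x), in bits.

    `pdf` maps each value x to its probability f(x); zero-probability
    terms are skipped (the 0 log 0 = 0 convention).
    """
    return -sum(p * math.log2(p) for p in pdf.values() if p > 0)

# A uniform 4-valued variable attains the maximum log2(4) = 2 bits.
print(entropy({0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}))  # 2.0
```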
Let $H(X)$ and $H(Y)$ be the entropy of $X$ and $Y$, respectively. The joint entropy of $X$ and $Y$ is defined as
\[
H(X, Y) = -\sum_{x, y} P(X = x, Y = y) \log(P(X = x, Y = y)).
\]
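The same plug-in evaluation extends to the joint distribution; here the joint pdf is represented as a dictionary over pairs $(x, y)$ (a representation we choose for brevity):

```python
import math

def joint_entropy(joint_pdf):
    """H(X, Y) = -sum_{x,y} P(X=x, Y=y) log2 P(X=x, Y=y), in bits.

    `joint_pdf` maps pairs (x, y) to the joint probability P(X=x, Y=y).
    """
    return -sum(p * math.log2(p) for p in joint_pdf.values() if p > 0)

# Two independent fair bits: H(X, Y) = H(X) + H(Y) = 1 + 1 = 2 bits.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(joint))  # 2.0
```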
The conditional entropy of $X$ given $Y$, noted $H(X \mid Y)$, is defined as
\[
H(X \mid Y) = \sum_{y} P(Y = y)\, H(X \mid Y = y),
\quad \text{with} \quad
H(X \mid Y = y) = -\sum_{x} P(X = x \mid Y = y) \log(P(X = x \mid Y = y)).
\]
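Combining the definitions, the sketch below computes $H(X \mid Y)$ by marginalizing a joint pdf, and checks the chain rule $H(X, Y) = H(Y) + H(X \mid Y)$ that follows from them (the helper names are ours):

```python
import math
from collections import defaultdict

def conditional_entropy(joint_pdf):
    """H(X | Y) = sum_y P(Y=y) H(X | Y=y), in bits."""
    # Marginal P(Y = y), obtained by summing the joint pdf over x.
    p_y = defaultdict(float)
    for (x, y), p in joint_pdf.items():
        p_y[y] += p
    # Since P(Y=y) * P(X=x | Y=y) = P(X=x, Y=y), the double sum
    # collapses to -sum_{x,y} P(x, y) log2 P(x | y).
    return -sum(p * math.log2(p / p_y[y])
                for (x, y), p in joint_pdf.items() if p > 0)

# A correlated pair: check the chain rule H(X, Y) = H(Y) + H(X | Y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
h_xy = -sum(p * math.log2(p) for p in joint.values())
h_y = 1.0  # Y is a fair bit here: P(Y=0) = P(Y=1) = 0.5
print(abs(h_xy - (h_y + conditional_entropy(joint))) < 1e-12)  # True
```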