node (row) $i$ to the node (column) $j$. The sum $P_i = \sum_{j=1}^{n} C_{ij}$ is thus the probability of a connection from node $i$ to the other nodes in the network, and we can define a row-wise Shannon entropy

$$H(\mathrm{row}) = -\sum_{i=1}^{n} P_i \log P_i. \qquad (1)$$

We note that we could just as well consider a column-wise entropy $H(\mathrm{column})$, and that $H(\mathrm{row}) = H(\mathrm{column})$ since the matrix is assumed to be symmetric.
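As a concrete illustration of equation (1), the row entropy can be computed directly from a connectivity matrix. This is a sketch, not the authors' code; it assumes $C$ is symmetric and normalized so that its entries sum to 1, making each row sum $P_i$ a probability, and the helper name is mine.

```python
import numpy as np

def row_entropy(C):
    """H(row) = -sum_i P_i log P_i, where P_i is the i-th row sum of C."""
    P = C.sum(axis=1)
    P = P[P > 0]                      # drop zero rows: 0 log 0 := 0
    return -np.sum(P * np.log(P))

# Tiny 3-node example with uniform symmetric connectivity.
C = np.ones((3, 3)) / 9.0
print(row_entropy(C))                 # uniform P_i = 1/3 gives log 3
```

For uniform row sums the entropy attains its maximum value $\log n$, here $\log 3 \approx 1.0986$.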
The Shannon mutual information, or negative Shannon entropy, contained in the matrix $C$ is

$$I(C) = H(\mathrm{row}) + H(\mathrm{column}) - H(\mathrm{column}\,|\,\mathrm{row}) = \sum_{i,j=1}^{n} C_{ij} \log \frac{C_{ij}}{P_i P_j}. \qquad (2)$$

We note that $I(C)$ is independent of the labelling of the nodes of the network.
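Equation (2) can be evaluated numerically as follows. This is a sketch under the same assumptions as before (the matrix is normalized so its entries sum to 1, and the function name is mine); zero entries are skipped since they contribute nothing to the sum.

```python
import numpy as np

def mutual_information(C):
    """I(C) = sum_ij C_ij log(C_ij / (P_i P_j)), skipping zero entries."""
    P = C.sum(axis=1)
    PP = np.outer(P, P)               # product of marginals P_i P_j
    mask = C > 0
    return np.sum(C[mask] * np.log(C[mask] / PP[mask]))

# Uniform connectivity factorizes (C_ij = P_i P_j), so I(C) = 0.
C = np.ones((4, 4)) / 16.0
print(mutual_information(C))          # 0.0
```

Permuting the rows and columns of $C$ together, i.e. relabelling the nodes, leaves the value unchanged, matching the remark above.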
A more general Renyi entropy of kind $q$ [6] can be defined as follows:

$$H_q(\mathrm{row}) = \frac{1}{1-q} \log \sum_{i=1}^{n} P_i^{\,q}. \qquad (3)$$
The Renyi information of the first kind ($q = 1$, understood as the limit $q \to 1$) is identical to Shannon information [2,7]. One can in fact view Renyi information as a generalization of Shannon information. The Renyi formulas above follow as the only formulation of entropy/information consistent with the axioms set forth by Kolmogorov and Nagumo [4,5].
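Equation (3) and its Shannon limit can be sketched as follows (function name is mine; the matrix is again assumed normalized so the row sums $P_i$ form a probability distribution):

```python
import numpy as np

def renyi_row_entropy(C, q):
    """H_q(row) = (1/(1-q)) log sum_i P_i^q; q -> 1 recovers Shannon."""
    P = C.sum(axis=1)
    P = P[P > 0]
    if abs(q - 1.0) < 1e-12:          # Shannon limit at q = 1
        return -np.sum(P * np.log(P))
    return np.log(np.sum(P ** q)) / (1.0 - q)

# For uniform P_i = 1/n, H_q = log n for every q.
C = np.ones((3, 3)) / 9.0
for q in (0.5, 1.0, 2.0):
    print(renyi_row_entropy(C, q))    # log 3 each time
```

That $H_q$ is constant in $q$ for a uniform distribution is a quick sanity check on the implementation; for non-uniform distributions $H_q$ decreases as $q$ grows.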
Since Renyi entropy is a generalization of Shannon entropy, we can consider the entropy of equation (3) and the associated conditional entropy

$$H_q(\mathrm{column}\,|\,\mathrm{row}) = \frac{1}{1-q} \log \sum_{i,j} C_{ij}^{\,q}. \qquad (4)$$
From these we can compute the Renyi mutual information $I_q(C)$ for a connectivity matrix in a manner analogous to that for Shannon information.
One suggestion made by Gudkov et al. is that instead of computing the
entropy functions alone, we could compute the difference between the Renyi
entropies of the second and first kinds as a way of measuring the state of a
network.
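The suggested difference between the second- and first-kind Renyi entropies can be sketched as below (function name is mine, and the same normalization assumption applies). The gap $H_2 - H_1$ is zero for perfectly uniform connectivity and becomes increasingly negative as the row sums grow less uniform, which is what makes it a candidate measure of network state.

```python
import numpy as np

def renyi_gap(C):
    """H_2(row) - H_1(row) for normalized row sums P_i."""
    P = C.sum(axis=1)
    P = P[P > 0]
    H1 = -np.sum(P * np.log(P))       # Shannon entropy (q = 1)
    H2 = -np.log(np.sum(P ** 2))      # Renyi entropy of kind q = 2
    return H2 - H1

uniform = np.ones((3, 3)) / 9.0
skewed = np.array([[0.4, 0.1, 0.0],
                   [0.1, 0.2, 0.1],
                   [0.0, 0.1, 0.0]])  # symmetric, row sums 0.5, 0.4, 0.1
print(renyi_gap(uniform))             # 0 for uniform rows
print(renyi_gap(skewed))              # negative, since H_2 <= H_1
```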
3 Calibration
The proposal has been made to use entropy functions to measure the state of
a network. The work of Gudkov et al. has shown that a qualitative change in
the entropy function does arise from a change in the connectivity matrix derived