Specifically, if the initial probability vector is x = (1, 0, 0, …, 0), then Mx, the first column of the M matrix, gives the concentration of probability at each node after that infinitesimal time period. Thus the first column of M is the probability distribution, after an infinitesimal time, of the probability that began on node 1, and likewise for every other node and column, giving a probability interpretation to each of the columns of M. Each column of M can therefore be treated as a probability distribution associated with the topology connected to its node, supporting an associated entropy function that reflects the inherent disorder (or order) after a flow. The columns of M thus support a meaningful definition of Renyi entropies, which in turn reflect the Markov transformation to disorder of the topology near the node for that column; the Renyi entropy of a column can be said to summarize the disorder of the topology of the connections to that node. It follows that the spectrum of entropies over all nodes reflects, in some sense, the disorder of the entire network.
When sorted in descending order, these column entropies form a spectral curve that is independent of nodal ordering and thus independent of permutations of the nodal numbering. That spectral curve can be summarized by the total entropy over all columns (since entropy is additive and the column values are treated as independent). The structure of the spectrum can also be summarized by the entropy of the entropies in the spectrum, giving a second variable that summarizes the entire topology.
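The column construction can be sketched numerically. In the sketch below, the 4-node adjacency matrix, the value λ = 0.1, the default Renyi order of 2, and the convention of setting C's diagonal to the negative of the column sums (so that each column of M = I + λC sums to one) are all illustrative assumptions, not prescriptions from the text:

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy of order alpha for a probability vector p."""
    p = p[p > 0]                      # drop zero entries (0 log 0 = 0)
    if alpha == 1.0:                  # Shannon limit
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Hypothetical 4-node undirected network (adjacency matrix A).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)

# Assumption: set each diagonal entry of C to the negative of its column
# sum, so that every column of M = I + lam*C sums to one.
C = A - np.diag(A.sum(axis=0))
lam = 0.1
M = np.eye(4) + lam * C

col_entropies = np.array([renyi_entropy(M[:, j]) for j in range(4)])
spectrum = np.sort(col_entropies)[::-1]   # descending spectral curve
total = col_entropies.sum()               # total column entropy
# "Entropy of the entropies": normalize the entropy values so they can
# themselves be treated as a distribution (an interpretive assumption).
entropy_of_entropies = renyi_entropy(col_entropies / total)
```

For small enough λ every entry of M stays non-negative, so each column is a valid probability vector and each entropy is well defined.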
If the connection matrix is symmetric then the graph (network) is said to be undirected; if there is some asymmetry, then the graph is at least partially directed, with the flow from i to j less than or greater than the converse flow. If the connection matrix is not symmetrized, one can capture this asymmetry by resetting each diagonal value of C to the negative of the sum of the other row values in that row. Then, upon expansion of M = I + λC, the rows are automatically normalized probabilities that in turn support entropy functions for each row. These row entropy values form a spectrum, which can be sorted by the same nodal order used to sort the column values. This results in a different spectral curve that is not necessarily monotonically ordered in the row entropies. One can also compute the total row entropy and the entropy of these row entropies, as was done for the columns. If two columns have the same entropy, one can sometimes partially remove this degeneracy using the values of the associated row entropies.
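The row variant with the diagonal reset described above might look like the following; the 3-node asymmetric connection matrix and λ = 0.1 are again hypothetical choices for illustration:

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy of order alpha for a probability vector p."""
    p = p[p > 0]
    if alpha == 1.0:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Hypothetical directed (asymmetric) connection matrix: the flow from
# node 1 to node 2 is stronger than the converse flow.
C = np.array([[0, 2, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# Reset each diagonal value to the negative of the other row values in
# that row (the diagonal starts at zero, so each row sum here is just
# the sum of the off-diagonal entries).
np.fill_diagonal(C, -C.sum(axis=1))

lam = 0.1
M = np.eye(3) + lam * C   # rows are now automatically normalized

row_entropies = np.array([renyi_entropy(M[i, :]) for i in range(3)])
row_total = row_entropies.sum()
```

Because each row of C now sums to zero, each row of M sums to one for any λ, which is what makes the rows valid probability distributions.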
Thus we suggest that the column and row spectral entropy curves, together with the column and row total entropies and entropy-of-entropy values, distil the essential disorder and order from the network topology, reducing n^2 values to 2n (spectral) values and finally to 4 values for the entire network. These constitute a set of entropy metrics for the network, all of which are independent of the nodal ordering (numbering) and thus indicative of the underlying topology. This analysis is expansive in two ways: (1) the two spectra and four values can be computed to higher order in λ, thus including higher powers of the C matrix in the approximation for M and thereby incorporating connections of connections into the metric values. It is with higher powers of C, via larger values of λ, that more complex aspects of the network topology unfold. (2) One can also compute these metric values for each order of the Renyi entropy. Work by V. Gudkov [4] has found that the order of the Renyi entropy is equivalent to the Hausdorff dimensionality equation. This opens the possibility that higher-order entropy reveals connections of a 'higher dimensionality' in the network structure [4, 5].
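Both expansions can be sketched together: the code below truncates a series for M at second order in λ and evaluates the column spectrum at several Renyi orders α. The exponential-series form exp(λC), the example matrix, and the chosen α values are assumptions for illustration; the text itself specifies only the first-order form M = I + λC.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (Shannon in the alpha -> 1 limit)."""
    p = p[p > 0]
    if alpha == 1.0:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def markov_matrix(C, lam, order):
    """Truncate I + lam*C + (lam*C)^2/2! + ... at the given order.
    Assumption: the higher-order expansion is the exponential series
    exp(lam*C); the text specifies only the first-order form."""
    M = np.eye(C.shape[0])
    term = np.eye(C.shape[0])
    for k in range(1, order + 1):
        term = term @ C * (lam / k)
        M = M + term
    return M

# Hypothetical 4-node network, with C's diagonal set to the negative
# column sums so that columns of C sum to zero.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
C = A - np.diag(A.sum(axis=0))

# The second-order M keeps unit column sums: 1^T C = 0 implies
# 1^T C^2 = 0, so every added term leaves the column sums unchanged.
M2 = markov_matrix(C, lam=0.1, order=2)

# Column entropy spectra at several Renyi orders alpha; Renyi entropy
# is non-increasing in alpha, so the curves nest.
spectra = {alpha: np.array([renyi_entropy(M2[:, j], alpha)
                            for j in range(4)])
           for alpha in (1.0, 2.0, 3.0)}
```

The second-order term C^2 brings paths of length two (connections of connections) into each column distribution, which is what the higher powers of C contribute to the metric values.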