disruption is a reflection of how tolerant a web is to error, and a web's error tolerance
is not a fixed quantity. It is the intrinsic structure of the web, its topology, that often
determines its robustness, which is to say, the web's ability to absorb error and still func-
tion. Part of what we examine in this subsection is how the heterogeneity of complex
webs influences their robustness.
Figure 6.19 depicts two classes of networks. One has a uni-modal distribution of connections,
indicative of a random network, peaking at the average connectivity ⟨k⟩. The
other web has an inverse power-law distribution of connections, indicative of a scale-free
network. The error tolerance of these two kinds of webs is very different, since the
probability of a node in the random web having a connectivity much greater than the
average ⟨k⟩ decreases exponentially, whereas the probability of there being highly
connected nodes can be substantial in scale-free webs. It is the difference between these
two mathematical forms that enables us to quantify the robustness of the web. The
random web is like the world of Gauss, with all the elements in the web being well
represented by the mode of the distribution. In the scale-free web the connectivity of
various nodes can be significantly different, with some being the airport hubs of Dallas
and Atlanta, and others being the smaller airports in Buffalo and Cleveland. The for-
mer are the well-connected hubs of the web and the latter are members of the sparsely
connected majority.
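To see how stark the difference between the two tails is, the short calculation below compares the probability of finding a node with ten times the average connectivity under a Poisson (random-web) degree distribution and under an inverse power law. It is only a sketch: the average degree of 4 and the power-law exponent of 2.5 are illustrative assumptions, not values taken from the text.

```python
import math

k_avg = 4            # assumed average connectivity <k> (illustrative)
k_hub = 10 * k_avg   # a "hub": ten times the average degree

# Poisson tail P(K >= k_hub), the random-web case
poisson_tail = sum(
    math.exp(-k_avg) * k_avg**k / math.factorial(k) for k in range(k_hub, 400)
)

# Inverse power-law tail P(K >= k_hub), with P(k) proportional to k**(-alpha)
alpha = 2.5          # assumed scale-free exponent
norm = sum(k**-alpha for k in range(1, 100_000))
powerlaw_tail = sum(k**-alpha for k in range(k_hub, 100_000)) / norm

print(f"Poisson   P(K >= {k_hub}) = {poisson_tail:.2e}")   # vanishingly small (order 1e-26)
print(f"Power law P(K >= {k_hub}) = {powerlaw_tail:.2e}")  # appreciable (order 1e-3)
```

Under these assumed parameters the random web essentially never produces a hub, while the scale-free web produces one with a probability many orders of magnitude larger, which is the quantitative content of the contrast drawn above.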
Albert et al. [4] were among the first to discuss how to quantify the error or attack
tolerance of webs. They examined the changes in the web diameter, the diameter being
the average distance between any two nodes on the web, when a small fraction of
the elements is removed. A general result is that the diameter of the web
increases with the loss of nodes. The node loss translates into a loss in the num-
ber of paths traversing the web and it is the loss of paths that produces an increase
in the web's diameter. Erdös and Rényi [15] determined that a random network has
Poisson statistics and consequently the number of nodes with high connectivity is expo-
nentially small. As Albert et al. [4] explain, in a random web the diameter increases
monotonically with the fraction of nodes that fail as a consequence of the exponen-
tial nature of the tail. As pointed out above, the exponential has the property of equal
fractional decreases for equal incremental increases in the dynamical variable. This
property of the exponential is reflected in the reduced ability of the various parts of the
web to communicate with one another, since removing each node produces the same
amount of damage in the web. This is very different from the tolerance expressed in a
scale-free web.
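The result of Albert et al. can be reproduced in spirit with a small simulation: remove a fraction of nodes at random from a random (Erdös–Rényi) web and from a scale-free (Barabási–Albert) web, and track the average path length of the largest surviving component as a proxy for the diameter. The sketch below uses the networkx library; the web sizes, densities and removal fractions are assumptions chosen only to keep the run fast, not the values used in the original calculation.

```python
import random
import networkx as nx

def surviving_path_length(G, fraction_removed, seed=0):
    """Remove a random fraction of nodes and return the average shortest
    path length of the largest connected component that survives."""
    rng = random.Random(seed)
    G = G.copy()
    doomed = rng.sample(list(G.nodes()), int(fraction_removed * G.number_of_nodes()))
    G.remove_nodes_from(doomed)
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    return nx.average_shortest_path_length(giant)

n = 1000                                                  # assumed web size (illustrative)
random_web = nx.erdos_renyi_graph(n, 4 / n, seed=1)       # Poisson degrees, <k> ~ 4
scalefree_web = nx.barabasi_albert_graph(n, 2, seed=1)    # power-law degrees, <k> ~ 4

for frac in (0.0, 0.01, 0.05):
    print(f"{frac:>4.0%} removed: "
          f"random {surviving_path_length(random_web, frac):.2f}, "
          f"scale-free {surviving_path_length(scalefree_web, frac):.2f}")
```

Run repeatedly with different seeds, the random web's path length creeps upward with each increment of removed nodes, whereas the scale-free web's path length barely moves under random removal, which is the behavior described in the next paragraph.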
The calculations done by Albert et al. [4] indicate that as many as 5% of the nodes
can be removed from the web while leaving the transmission of information across it
unaffected. This is the robust behavior often discussed with regard to scale-free webs
and is a consequence of the inhomogeneous nature of the nodal connectivity. Recall
that in an inverse power-law distribution there are a few nodes with a great many con-
nections, corresponding to the small number of very wealthy in the Pareto distribution of
income, but the vast majority of nodes have very few links, corresponding to the much
lower income of the far greater number of people in the middle and lower classes.
Consequently, if nodes fail at random, or are attacked at random for that matter, the