The second eigenvector has three positive and three negative components. It makes the unsurprising suggestion that one group should be {1, 2, 3}, the nodes with positive components, and the other group should be {4, 5, 6}.
10.4.5 Alternative Partitioning Methods
The method of Section 10.4.4 gives us a good partition of the graph into two pieces that have a small cut between them. There are several ways we can use the same eigenvectors to suggest other good choices of partition. First, we are not constrained to put all the nodes with positive components in the eigenvector into one group and those with negative components in the other. We could set the threshold at some point other than zero.
For instance, suppose we modified Example 10.19 so that the threshold was not zero, but −1.5. Then the two nodes 4 and 6, with components −1 in the second eigenvector of Fig. 10.18, would join 1, 2, and 3, leaving five nodes in one component and only node 5 in the other. That partition would have a cut of size two, as did the choice based on the threshold of zero, but the two components have radically different sizes, so we would tend to prefer our original choice. However, there are other cases where the threshold zero gives unequally sized components, as would be the case if we used the third eigenvector in Fig. 10.18.
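As a concrete illustration, the following Python sketch partitions the graph by thresholding the second eigenvector and counts the resulting cut. The edge list for the graph of Fig. 10.16 and the integer scaling of the eigenvector are assumptions here, chosen to agree with the components and cut sizes quoted above; the book's figures are the authoritative source.

    import numpy as np

    # Assumed edge list for the graph of Fig. 10.16 (consistent with the
    # eigenvector components and cut sizes quoted in the text).
    edges = [(1, 2), (1, 3), (1, 4), (2, 3), (3, 6), (4, 5), (4, 6), (5, 6)]
    n = 6

    # Laplacian L = D - A.
    L = np.zeros((n, n))
    for i, j in edges:
        L[i - 1, j - 1] = L[j - 1, i - 1] = -1
        L[i - 1, i - 1] += 1
        L[j - 1, j - 1] += 1

    # Second eigenvector, with the integer scaling assumed from Fig. 10.18;
    # the check below confirms it really is an eigenvector of L.
    x2 = np.array([1, 2, 1, -1, -2, -1], dtype=float)
    lam = (x2 @ L @ x2) / (x2 @ x2)
    assert np.allclose(L @ x2, lam * x2)

    def partition(x, threshold=0.0):
        """Nodes whose component exceeds the threshold versus the rest."""
        above = {i + 1 for i, xi in enumerate(x) if xi > threshold}
        return above, set(range(1, len(x) + 1)) - above

    def cut_size(group):
        """Number of edges with exactly one endpoint in the given group."""
        return sum((i in group) != (j in group) for i, j in edges)

    for t in (0.0, -1.5):
        a, b = partition(x2, t)
        print(f"threshold {t}: {sorted(a)} | {sorted(b)}, cut = {cut_size(a)}")
    # threshold 0.0: [1, 2, 3] | [4, 5, 6], cut = 2
    # threshold -1.5: [1, 2, 3, 4, 6] | [5], cut = 2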
We may also want a partition into more than two components. One approach is to use the method described above to split the graph into two, and then use it repeatedly on the components to split them as far as desired. A second approach is to use several of the eigenvectors, not just the second, to partition the graph. If we use m eigenvectors, and set a threshold for each, we can get a partition into 2^m groups, each group consisting of the nodes that are above or below threshold for each of the eigenvectors, in a particular pattern.
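The sketch below illustrates the second approach for m = 2, combining the sign patterns of the second and third eigenvectors at threshold zero. The signs are read off from the partitions described in the text ({1, 2, 3} versus {4, 5, 6} for the second eigenvector, and {1, 4} versus {2, 3, 5, 6} for the third, as in Example 10.20 below); which side of each eigenvector is labeled "+" is arbitrary and does not change the grouping.

    from collections import defaultdict

    # Sign of each node's component in the second and third eigenvectors,
    # taken from the partitions described in the text.
    signs = {
        1: ('+', '-'), 2: ('+', '+'), 3: ('+', '+'),
        4: ('-', '-'), 5: ('-', '+'), 6: ('-', '+'),
    }

    # Nodes sharing the same pattern across the m = 2 eigenvectors form a
    # group, giving up to 2^m = 4 groups.
    groups = defaultdict(set)
    for node, pattern in signs.items():
        groups[pattern].add(node)

    print(dict(groups))
    # {('+', '-'): {1}, ('+', '+'): {2, 3}, ('-', '-'): {4}, ('-', '+'): {5, 6}}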
It is worth noting that each eigenvector except the first is the vector x that minimizes xᵀLx, subject to the constraint that it is orthogonal to all previous eigenvectors. This constraint generalizes the constraints we described for the second eigenvector in a natural way. As a result, while each eigenvector tries to produce a minimum-sized cut, the fact that successive eigenvectors have to satisfy more and more constraints generally causes the cuts they describe to be progressively worse.
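A quick numerical check of this claim, again under the edge list assumed above for Fig. 10.16: for each unit eigenvector, the value xᵀLx equals its eigenvalue, and these values can only grow as more orthogonality constraints accumulate.

    import numpy as np

    # Laplacian of the graph assumed for Fig. 10.16 (same edge list as above).
    edges = [(1, 2), (1, 3), (1, 4), (2, 3), (3, 6), (4, 5), (4, 6), (5, 6)]
    L = np.zeros((6, 6))
    for i, j in edges:
        L[i - 1, j - 1] = L[j - 1, i - 1] = -1
        L[i - 1, i - 1] += 1
        L[j - 1, j - 1] += 1

    # eigh returns eigenvalues in nondecreasing order, with orthonormal
    # eigenvectors as columns, so each successive column minimizes x^T L x
    # subject to orthogonality with all earlier ones.
    eigenvalues, eigenvectors = np.linalg.eigh(L)
    for k in range(6):
        x = eigenvectors[:, k]
        print(f"eigenvector {k + 1}: x^T L x = {x @ L @ x:.3f}"
              f" (eigenvalue {eigenvalues[k]:.3f})")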
EXAMPLE 10.20 Let us reconsider the graph of Fig. 10.16, for which the eigenvectors of its Laplacian matrix were tabulated in Fig. 10.18. The third eigenvector, with a threshold of 0, puts nodes 1 and 4 in one group and the other four nodes in the other. That is not a bad partition, but its cut size is four, compared with the cut of size two that we get from the second eigenvector.
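With the same assumed edge list, the two cut sizes in this example can be checked directly:

    # Cut sizes of the partitions discussed in Example 10.20, with the edge
    # list assumed above for Fig. 10.16.
    edges = [(1, 2), (1, 3), (1, 4), (2, 3), (3, 6), (4, 5), (4, 6), (5, 6)]

    def cut_size(group):
        return sum((i in group) != (j in group) for i, j in edges)

    print(cut_size({1, 4}))     # 4: third eigenvector, threshold 0
    print(cut_size({1, 2, 3}))  # 2: second eigenvector, threshold 0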