walk [8, 168], harmonic function [210], local and global consistency [198], manifold regularization [17, 158, 155], kernels from the graph Laplacian [35, 51, 95, 101, 163, 211], spectral graph transducer [90], local averaging [179, 187], density-based regularization [23, 36], alternating minimization [181], boosting [39, 117], and the tree-based Bayes model [98]. The graph construction itself is important [11, 83, 82, 29, 167, 196]. Some theoretical analysis of graph-based learning can be found in [91, 178, 195], and applications in [73, 78, 111, 102, 136, 138, 139].
Many of the graph-based semi-supervised learning algorithms have moderate to high computational complexity, often O(u^2) or more. Fast computation that can handle large amounts of unlabeled data is an important problem. Efforts toward this end include [6, 57, 69, 84, 123, 160, 172, 192, 212]. Alternatively, one can perform online semi-supervised learning [72], where the labeled and unlabeled instances arrive sequentially; each instance is processed and then discarded immediately, which keeps the computation and storage requirements low.
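To illustrate where the quadratic cost comes from, the harmonic-function solution can be approximated by iteratively averaging labels over graph neighbors with the labeled vertices clamped. The sketch below is our own minimal illustration (the function name and the fixed iteration count are assumptions, not part of any cited algorithm); each sweep over a dense n-by-n weight matrix costs O(n^2), matching the complexity noted above.

```python
import numpy as np

def harmonic_label_propagation(W, y_labeled, labeled_idx, n_iter=200):
    """Approximate the harmonic solution on a graph by label propagation.

    W           : (n, n) symmetric nonnegative similarity (weight) matrix
    y_labeled   : labels for the labeled vertices (e.g., 0/1)
    labeled_idx : indices of the labeled vertices in W
    Each sweep does a dense matrix-vector product, i.e., O(n^2) work.
    """
    n = W.shape[0]
    f = np.zeros(n)
    f[labeled_idx] = y_labeled
    d = W.sum(axis=1)                       # vertex degrees
    for _ in range(n_iter):
        f = (W @ f) / np.maximum(d, 1e-12)  # average over neighbors
        f[labeled_idx] = y_labeled          # clamp the labeled vertices
    return f
```

On a four-vertex chain with endpoints labeled 0 and 1, the interior vertices converge to the harmonic values 1/3 and 2/3, interpolating smoothly between the labels.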
Manifold regularization [17] formally assumes that the marginal distribution p(x) is supported on a Riemannian manifold (see Chapter 2 in [107] for a brief introduction). The labeled and unlabeled vertices, and hence the graph, are a random realization of the underlying manifold. For simplicity, we did not introduce this assumption during our discussion.
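Under this assumption, the learning objective combines a supervised loss with both an ambient and an intrinsic smoothness penalty; a common form (our notation, following the general shape of the framework in [17], with loss V, kernel norm, and graph Laplacian L serving as the empirical proxy for smoothness on the manifold) is:

```latex
f^{*} = \arg\min_{f}\; \frac{1}{l}\sum_{i=1}^{l} V\bigl(x_i, y_i, f(x_i)\bigr)
        + \gamma_A \|f\|_K^{2}
        + \gamma_I\, \mathbf{f}^{\top} L\, \mathbf{f},
\qquad \mathbf{f} = \bigl(f(x_1), \ldots, f(x_{l+u})\bigr)^{\top}
```

Here the graph-based term f^T L f estimates the integral of the squared gradient of f over the manifold, which is why the random graph realization stands in for the manifold itself.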
There are several extensions to the simple undirected graph that encodes similarity between vertices. In certain applications like the Web, the edges are naturally directed [27, 118, 200]. Some graph edges might encode dissimilarities instead [76, 171]. Edges can also be defined on more than two vertices to form hypergraphs [199]. The dataset can consist of multiple manifolds [74, 180, 197].