Algorithm 2 Gibbs sampling algorithm

Gibbs Sampling ICA (GS)

for each node Y_i ∈ Y do   {bootstrapping}
    {compute label using only observed nodes in N_i}
    compute a_i using only X ∩ N_i
    y_i ← f(a_i)
for n = 1 to B do   {burn-in}
    generate ordering O over nodes in Y
    for each node Y_i ∈ O do
        compute a_i using current assignments to N_i
        y_i ← f(a_i)
for each node Y_i ∈ Y do   {initialize sample counts}
    for label l ∈ L do
        c[i, l] = 0
for n = 1 to S do   {collect samples}
    generate ordering O over nodes in Y
    for each node Y_i ∈ O do
        compute a_i using current assignments to N_i
        y_i ← f(a_i)
        c[i, y_i] ← c[i, y_i] + 1
for each node Y_i ∈ Y do   {compute final labels}
    y_i ← argmax_{l ∈ L} c[i, l]
McDowell et al. report that such a “cautious” approach leads to improved
accuracies.
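To make the procedure concrete, the following is a minimal runnable sketch of Algorithm 2 in Python. The graph representation, the label set, and in particular the local classifier f are illustrative stand-ins, not the implementation described here: this f samples a label in proportion to smoothed neighbor votes, whereas in practice f would be a trained relational classifier that returns a sampled label given the feature vector a_i.

import random
from collections import Counter

def gibbs_sampling_ica(graph, observed, labels, B=100, S=1000, seed=0):
    # graph: dict node -> list of neighbors; observed: dict node -> label
    # for the observed nodes X; labels: the label set L.
    rng = random.Random(seed)
    unobserved = [v for v in graph if v not in observed]
    assignment = dict(observed)

    def sample_label(node, known):
        # Hypothetical local classifier f: sample a label with probability
        # proportional to 1 + (number of neighbors in `known` carrying it).
        weights = [1 + sum(known.get(u) == l for u in graph[node])
                   for l in labels]
        return rng.choices(labels, weights=weights)[0]

    # Bootstrapping: label each node using only observed neighbors (X ∩ N_i).
    for v in unobserved:
        assignment[v] = sample_label(v, observed)

    # Burn-in: B sweeps, each over a fresh random ordering O of the nodes in Y.
    for _ in range(B):
        for v in rng.sample(unobserved, len(unobserved)):
            assignment[v] = sample_label(v, assignment)

    # Initialize sample counts c[i, l] = 0 (Counter entries default to zero).
    counts = {v: Counter() for v in unobserved}

    # Collect S samples, incrementing c[i, y_i] after each resampling.
    for _ in range(S):
        for v in rng.sample(unobserved, len(unobserved)):
            assignment[v] = sample_label(v, assignment)
            counts[v][assignment[v]] += 1

    # Final labels: y_i = argmax over l of c[i, l].
    return {v: counts[v].most_common(1)[0][0] for v in unobserved}

For instance, on a small chain graph with two observed endpoints:

graph = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(gibbs_sampling_ica(graph, observed={"a": "+", "d": "-"}, labels=["+", "-"]))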
3.4 Approximate Inference Algorithms for Approaches Based on Global Formulations
An alternative approach to collective classification is to define a global objective function to optimize. In what follows, we describe one common way of defining such an objective function; doing so requires some additional notation.
We begin by defining a pairwise Markov random field (pairwise MRF) (34). Let G = (V, E) denote a graph of random variables as before, where V consists of two types of random variables: the unobserved variables, Y, which need to be assigned values from the label set L, and the observed variables, X, whose values we know. Let Ψ denote a set of clique potentials. Ψ contains three distinct types of functions:

- For each Y_i ∈ Y, ψ_i ∈ Ψ is a mapping ψ_i : L → ℝ≥0, where ℝ≥0 is the set of non-negative real numbers.
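For concreteness, a node potential of this first kind can be represented as a plain table of non-negative scores over L; the label set and scores below are invented for illustration.

# Illustrative node potential psi_i over a made-up label set L = {"+", "-"}.
# A potential assigns each label a non-negative real score; unlike a
# probability distribution, the scores need not sum to 1.
psi_i = {"+": 2.5, "-": 0.4}
assert all(score >= 0 for score in psi_i.values())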
 