We describe two approximate inference algorithms in this chapter, and both
adopt a similar approach to avoiding the computational complexity of
computing marginal probability distributions exactly. Instead of working
directly with the probability distribution associated with the pairwise MRF
(Definition 3.1), both use a simpler "trial" distribution. The idea is to
design the trial distribution so that, once it is fitted to the MRF
distribution, marginal probabilities are easy to extract (as easy as reading
them off the trial distribution). This general principle forms the basis of a
class of approximate inference algorithms known as variational methods (15).
We are now in a position to discuss loopy belief propagation (LBP) and
mean-field relaxation labeling (MF).
3.4.1 Loopy Belief Propagation
Loopy belief propagation (LBP) applied to a pairwise MRF $(G, \Psi)$ is a
message passing algorithm that can be concisely expressed as the following
set of equations:

$$m_{i \to j}(y_j) = \alpha \sum_{y_i \in \mathcal{L}} \psi_{ij}(y_i, y_j)\, \phi_i(y_i) \prod_{Y_k \in \mathcal{N}_i \cap \mathcal{Y} \setminus Y_j} m_{k \to i}(y_i), \qquad y_j \in \mathcal{L}, \tag{3.1}$$

$$b_i(y_i) = \alpha\, \phi_i(y_i) \prod_{Y_j \in \mathcal{N}_i \cap \mathcal{Y}} m_{j \to i}(y_i), \qquad y_i \in \mathcal{L}, \tag{3.2}$$
where $m_{i \to j}$ is a message sent by $Y_i$ to $Y_j$ and $\alpha$ denotes
a normalization constant that ensures that each message and each set of
marginal probabilities sums to 1; more precisely,
$\sum_{y_j \in \mathcal{L}} m_{i \to j}(y_j) = 1$ and
$\sum_{y_i \in \mathcal{L}} b_i(y_i) = 1$.
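To make the message passing concrete, the following is a minimal Python
sketch of Eqs. (3.1) and (3.2). The data layout is an assumption made for
illustration: `phi` maps each unobserved node to an array of node potentials
over $\mathcal{L}$, `psi` maps each ordered edge $(i, j)$ to a
$|\mathcal{L}| \times |\mathcal{L}|$ array with `psi[(j, i)]` the transpose
of `psi[(i, j)]`, and `neighbors[i]` lists the unobserved neighbors of $Y_i$
(observed neighbors are assumed to be folded into `phi`). The name
`loopy_bp` is likewise illustrative.

```python
import numpy as np

def loopy_bp(phi, psi, neighbors, n_iters=50):
    """Sketch of LBP per Eqs. (3.1)-(3.2); data layout is assumed."""
    n_labels = len(next(iter(phi.values())))
    # Initialize every message m_{i->j} to the uniform distribution.
    msgs = {(i, j): np.ones(n_labels) / n_labels
            for i in neighbors for j in neighbors[i]}
    for _ in range(n_iters):
        new_msgs = {}
        for (i, j) in msgs:
            # phi_i(y_i) times the incoming messages m_{k->i}, k != j (Eq. 3.1).
            prod = phi[i].copy()
            for k in neighbors[i]:
                if k != j:
                    prod = prod * msgs[(k, i)]
            # Sum over y_i of psi_ij(y_i, y_j) * prod(y_i).
            m = psi[(i, j)].T @ prod
            new_msgs[(i, j)] = m / m.sum()  # alpha: each message sums to 1
        msgs = new_msgs
    # Beliefs b_i(y_i) = alpha * phi_i(y_i) * product of incoming messages (Eq. 3.2).
    beliefs = {}
    for i in neighbors:
        b = phi[i].copy()
        for j in neighbors[i]:
            b = b * msgs[(j, i)]
        beliefs[i] = b / b.sum()  # alpha: beliefs sum to 1
    return beliefs
```

A fixed number of sweeps is used here for simplicity; in practice one would
also stop early once the messages change by less than some tolerance.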
Algorithm 4 Mean-Field Relaxation Labeling (MF)

for each $Y_i \in \mathcal{Y}$ do                    {initialize messages}
    for each $y_i \in \mathcal{L}$ do
        $b_i(y_i) \leftarrow 1$
repeat                                               {perform message passing}
    for each $Y_j \in \mathcal{Y}$ do
        for each $y_j \in \mathcal{L}$ do
            $b_j(y_j) \leftarrow \alpha\, \phi_j(y_j) \prod_{Y_i \in \mathcal{N}_j \cap \mathcal{Y}} \prod_{y_i \in \mathcal{L}} \psi_{ij}(y_i, y_j)^{b_i(y_i)}$
until all $b_j(y_j)$ stop changing
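Under the same assumed data layout as the LBP sketch above, here is a
minimal sketch of Algorithm 4. The update is computed in the log domain,
where the product of potentials raised to the beliefs $b_i(y_i)$ becomes a
belief-weighted sum of log-potentials; the beliefs are initialized to the
uniform distribution rather than to the literal constant 1, which does not
affect the fixed points of the update since each sweep renormalizes.

```python
import numpy as np

def mean_field(phi, psi, neighbors, tol=1e-6, max_iters=200):
    """Sketch of Algorithm 4; potentials are assumed strictly positive."""
    n_labels = len(next(iter(phi.values())))
    # Initialize beliefs b_i to the uniform distribution over labels.
    b = {i: np.ones(n_labels) / n_labels for i in neighbors}
    for _ in range(max_iters):
        max_delta = 0.0
        for j in neighbors:
            # Log-domain form of the update:
            # log b_j(y_j) = log phi_j(y_j)
            #              + sum_{Y_i in N_j} sum_{y_i} b_i(y_i) log psi_ij(y_i, y_j)
            log_b = np.log(phi[j])
            for i in neighbors[j]:
                log_b = log_b + b[i] @ np.log(psi[(i, j)])
            new_b = np.exp(log_b - log_b.max())  # shift for numerical stability
            new_b /= new_b.sum()                 # alpha: beliefs sum to 1
            max_delta = max(max_delta, np.abs(new_b - b[j]).max())
            b[j] = new_b
        if max_delta < tol:  # "until all b_j(y_j) stop changing"
            break
    return b
```

The convergence test implements the loop's exit condition approximately: the
sweep stops once no belief changes by more than the tolerance `tol`, an
illustrative parameter standing in for "stop changing".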