One iteration of mean field variational inference for LDA

1. For each topic k and term v:

   \lambda_{k,v}^{(t+1)} = \eta + \sum_{d=1}^{D} \sum_{n=1}^{N} 1(w_{d,n} = v)\, \phi_{d,n,k}^{(t)}.    (4.8)

2. For each document d:

   (a) Update \gamma_d:

       \gamma_{d,k}^{(t+1)} = \alpha_k + \sum_{n=1}^{N} \phi_{d,n,k}^{(t)}.    (4.9)

   (b) For each word n, update \phi_{d,n}:

       \phi_{d,n,k}^{(t+1)} \propto \exp\left\{ \Psi(\gamma_{d,k}^{(t+1)}) + \Psi(\lambda_{k,w_n}^{(t+1)}) - \Psi\left(\textstyle\sum_{v=1}^{V} \lambda_{k,v}^{(t+1)}\right) \right\},    (4.10)

   where \Psi is the digamma function, the first derivative of the \log \Gamma function.
FIGURE 4.5: One iteration of mean field variational inference for LDA.
This algorithm is repeated until the objective function in Eq. (4.6) converges.
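As a concrete sketch of the iteration in Figure 4.5, the three updates (Eqs. 4.8–4.10) can be written with NumPy and SciPy's digamma function. The function name, the data layout (a list of per-document term-index arrays), and the use of scalar hyperparameters α and η are illustrative assumptions, not from the chapter:

```python
import numpy as np
from scipy.special import digamma

def mean_field_iteration(docs, lam, gamma, phi, alpha, eta):
    """One sweep of the updates in Figure 4.5 (Eqs. 4.8-4.10).

    docs  : list of D integer arrays, docs[d][n] = w_{d,n} (term index)
    lam   : (K, V) topic-term variational parameters lambda
    gamma : (D, K) document-topic variational parameters
    phi   : list of D arrays, each (N_d, K), word-topic responsibilities
    alpha, eta : hyperparameters (scalars here, an illustrative assumption)
    """
    K, V = lam.shape
    # Eq. 4.8: lambda_{k,v} = eta + sum_d sum_n 1(w_{d,n}=v) phi_{d,n,k}
    new_lam = np.full((K, V), float(eta))
    for w_d, phi_d in zip(docs, phi):
        for n, v in enumerate(w_d):
            new_lam[:, v] += phi_d[n]
    for d, (w_d, phi_d) in enumerate(zip(docs, phi)):
        # Eq. 4.9: gamma_{d,k} = alpha_k + sum_n phi_{d,n,k} (old phi)
        gamma[d] = alpha + phi_d.sum(axis=0)
        # Eq. 4.10: phi_{d,n,k} proportional to
        #   exp{Psi(gamma_{d,k}) + Psi(lambda_{k,w_n}) - Psi(sum_v lambda_{k,v})}
        Elog_theta = digamma(gamma[d])                         # Psi(gamma_{d,k})
        Elog_beta = digamma(new_lam) - digamma(new_lam.sum(axis=1, keepdims=True))
        log_phi = Elog_theta[None, :] + Elog_beta[:, w_d].T    # shape (N_d, K)
        log_phi -= log_phi.max(axis=1, keepdims=True)          # numerical stability
        phi_d[:] = np.exp(log_phi)
        phi_d /= phi_d.sum(axis=1, keepdims=True)              # normalize over k
    return new_lam, gamma, phi
```

Note that λ and γ are updated from the old φ, and only then is φ recomputed from the new λ and γ, matching the superscripts (t) and (t+1) in the figure.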
posterior under the variational distribution—is applicable when the conditional distribution of each variable is in the exponential family. This has been
described by several authors (5; 41; 7) and is the backbone of the VIBES
framework (40).
Finally, we note that the quantities needed to explore and decompose the
corpus from Section 4.2.2 are readily computed from the variational distribution. The per-term topic probabilities are

\beta_{k,v} = \frac{\lambda_{k,v}}{\sum_{v'=1}^{V} \lambda_{k,v'}}.    (4.11)
The per-document topic proportions are
\theta_{d,k} = \frac{\gamma_{d,k}}{\sum_{k'=1}^{K} \gamma_{d,k'}}.    (4.12)
The per-word topic assignment expectation is
z_{d,n,k} = \phi_{d,n,k}.    (4.13)
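These three summaries are simple normalizations of the variational parameters. A minimal sketch in NumPy (the function name and array layout are illustrative assumptions):

```python
import numpy as np

def posterior_summaries(lam, gamma, phi):
    """Turn variational parameters into the quantities of Eqs. 4.11-4.13.

    lam   : (K, V) lambda  -> per-term topic probabilities beta  (Eq. 4.11)
    gamma : (D, K) gamma   -> per-document topic proportions theta (Eq. 4.12)
    phi   : list of (N_d, K) arrays -> expected assignments z (Eq. 4.13)
    """
    beta = lam / lam.sum(axis=1, keepdims=True)       # normalize each topic over terms
    theta = gamma / gamma.sum(axis=1, keepdims=True)  # normalize each doc over topics
    z = phi                                           # Eq. 4.13 is the identity
    return beta, theta, z
```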