[Figure: plate diagram of the per-slice LDA variables α, θ_d, Z_{d,n}, W_{d,n} (word plate N, document plate D), with topic parameters β_{k,1}, β_{k,2}, …, β_{k,T} chained across time slices (topic plate K).]

FIGURE 4.8: A graphical model representation of a dynamic topic model (for three time slices). Each topic's parameters β_{t,k} evolve over time.
topics use a diagonal covariance matrix. Modeling the full V × V covariance matrix is a computational expense that is not necessary for our goals.

By chaining each topic to its predecessor and successor, we have sequentially tied a collection of topic models. The generative process for slice t of a sequential corpus is:
1. Draw topics π_t | π_{t−1} ∼ N(π_{t−1}, σ²I).
2. For each document:
   (a) Draw θ_d ∼ Dir(α).
   (b) For each word:
      i. Draw Z ∼ Mult(θ_d).
      ii. Draw W_{t,d,n} ∼ Mult(f(π_{t,z})).
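A minimal NumPy sketch of this generative process for one time slice. The vocabulary size, topic count, number of slices, and σ² below are arbitrary illustrative choices, and f is taken to be the softmax mapping from natural parameters to the word simplex:

```python
import numpy as np

rng = np.random.default_rng(0)

V, K, T = 1000, 5, 3     # vocabulary size, topics, time slices (illustrative)
sigma2 = 0.01            # drift variance of the topic chain
alpha = np.full(K, 0.1)  # Dirichlet prior on per-document topic proportions

def f(pi):
    """Softmax: map natural parameters to a distribution over words."""
    e = np.exp(pi - pi.max())
    return e / e.sum()

# 1. Draw topics: pi_t | pi_{t-1} ~ N(pi_{t-1}, sigma^2 I), chained over slices.
pi = np.zeros((T, K, V))
pi[0] = rng.normal(0.0, 1.0, size=(K, V))  # initial slice
for t in range(1, T):
    pi[t] = pi[t - 1] + rng.normal(0.0, np.sqrt(sigma2), size=(K, V))

def generate_document(t, n_words=50):
    # 2(a). Draw theta_d ~ Dir(alpha).
    theta = rng.dirichlet(alpha)
    words = []
    for _ in range(n_words):
        # 2(b)i. Draw z ~ Mult(theta_d).
        z = rng.choice(K, p=theta)
        # 2(b)ii. Draw w ~ Mult(f(pi_{t,z})).
        words.append(rng.choice(V, p=f(pi[t, z])))
    return words

doc = generate_document(t=1)
```

Running `generate_document` for every document in a slice, for each slice in turn, yields a sequential corpus in which word distributions drift slowly from slice to slice.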
This is illustrated as a graphical model in Figure 4.8. Notice that each time slice is a separate LDA model, where the kth topic at slice t has smoothly evolved from the kth topic at slice t − 1.
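The Gaussian chain π_t | π_{t−1} ∼ N(π_{t−1}, σ²I) is what makes this evolution smooth. A small NumPy sketch of the effect (sizes and σ² are arbitrary, and softmax stands in for the mapping f): the total-variation distance between one topic's word distribution at slice 0 and at a later slice grows with the lag, so adjacent slices stay close.

```python
import numpy as np

rng = np.random.default_rng(1)
V, T, sigma2 = 500, 10, 0.01  # illustrative vocabulary size, slices, drift

def f(pi):
    """Softmax: natural parameters -> distribution over words."""
    e = np.exp(pi - pi.max())
    return e / e.sum()

# Chain one topic's natural parameters across T slices.
pi = [rng.normal(0.0, 1.0, size=V)]
for _ in range(T - 1):
    pi.append(pi[-1] + rng.normal(0.0, np.sqrt(sigma2), size=V))

# Total-variation distance from the slice-0 word distribution:
# drift accumulates, so the distance grows with the lag.
dists = [0.5 * np.abs(f(pi[0]) - f(pi[t])).sum() for t in range(T)]
```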
Again, we can approximate the posterior over the topic decomposition with variational methods (see (8) for details). Here, we focus on the new views of