Digital Signal Processing Reference
In-Depth Information
\alpha_s(t) = \left( \sum_{j=1}^{S} \alpha_j(t-1)\, a_{j,s} \right) b_s(x_t).   (7.60)
The corresponding backward probability represents the joint probability of the observations from time step t + 1 to T:

\beta_s(t) = P(x_{t+1}, \ldots, x_T \mid s_t = s, i).   (7.61)
It can be determined by the recursion:
\beta_j(t) = \sum_{s=1}^{S} a_{j,s}\, b_s(x_{t+1})\, \beta_s(t+1).   (7.62)
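The backward recursion of Eq. (7.62) can be sketched analogously. Again a hedged NumPy sketch with an invented toy HMM; as a sanity check, combining β at the first time step with the initial distribution and first emission must reproduce p(X | i):

```python
import numpy as np

def backward(A, B):
    """Backward recursion, Eq. (7.62):
    beta_j(t) = sum_s a_{j,s} * b_s(x_{t+1}) * beta_s(t+1)."""
    T, S = B.shape
    beta = np.ones((T, S))                      # beta_s(T) = 1 by convention
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[t + 1] * beta[t + 1])  # Eq. (7.62)
    return beta

# invented toy HMM: 2 states, 3 observation frames
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.2], [0.1, 0.8], [0.8, 0.3]])
pi = np.array([0.6, 0.4])
beta = backward(A, B)
# consistency check: sum_s pi_s * b_s(x_1) * beta_s(1) equals p(X | i)
p_X = (pi * B[0] * beta[0]).sum()
```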
To compute the probability of being in a given state at a given time step, one multiplies the forward and backward probabilities:
P(X, s_t = s \mid i) = \alpha_s(t) \cdot \beta_s(t).   (7.63)
By that, L_{st} can be determined by:

L_{st} = P(s_t = s \mid X, i) = \frac{P(X, s_t = s \mid i)}{p(X \mid i)} = \frac{1}{p(X \mid i)} \cdot \alpha_s(t) \cdot \beta_s(t).   (7.64)
Assuming the last state S needs to be occupied at the moment in time of the last observation x_T, the probability P(X | i) equals α_S(T). By that, the Baum-Welch estimation can be executed as described.
The Viterbi algorithm is usually applied in the recognition phase. It resembles the computation of the forward probability; however, the summation is replaced by a maximum search, leading to the following forward recursion:
\phi_s(t) = \max_j \left\{ \phi_j(t-1)\, a_{j,s} \right\} b_s(x_t),   (7.65)
where φ_s(t) is the ML probability of the observation of the vectors x_1 to x_t and being in state s at time step t for a given HMM representing class i. Thus, the estimated ML probability P(X | i) equals φ_S(T).
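The recursion of Eq. (7.65), extended by the usual backtracking step to recover the most likely state sequence, can be sketched as follows; the toy HMM parameters are again invented for illustration:

```python
import numpy as np

def viterbi(A, B, pi):
    """Viterbi recursion, Eq. (7.65):
    phi_s(t) = max_j { phi_j(t-1) * a_{j,s} } * b_s(x_t),
    with backtracking to recover the most likely state sequence."""
    T, S = B.shape
    phi = np.zeros((T, S))
    back = np.zeros((T, S), dtype=int)
    phi[0] = pi * B[0]
    for t in range(1, T):
        scores = phi[t - 1][:, None] * A    # scores[j, s] = phi_j(t-1) * a_{j,s}
        back[t] = scores.argmax(axis=0)     # best predecessor j for each state s
        phi[t] = scores.max(axis=0) * B[t]  # Eq. (7.65)
    # backtrack from the best final state
    path = [int(phi[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1], phi[-1].max()

# invented toy HMM: 2 states, 3 observation frames
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.2], [0.1, 0.8], [0.8, 0.3]])
pi = np.array([0.6, 0.4])
path, p = viterbi(A, B, pi)  # path: most likely state sequence; p: phi_S(T)
```

Replacing the sum of the forward recursion by `max` and `argmax` is the only algorithmic change; the backtracking pointers are needed because, unlike the forward probability, the Viterbi pass is used to read off the decoded state sequence.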
7.3.2 Hierarchical Decoding
HMMs are particularly well suited for decoding, i.e., segmenting and recognising continuous audio streams. In addition, their probabilistic formulation allows for elegant hierarchical analysis in order to unite knowledge at different levels as stated. Typical