observations X_{1:t} = \{x_1, \ldots, x_t\} up to time t. The posterior can be updated using the following expression:

P(y_{t+1} \mid X_{1:t}) \propto \sum_{y_t} P(y_t \mid X_{1:t-1}) \, P(x_t \mid y_t) \, P(y_{t+1} \mid y_t)    (1.11)
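For discrete states and observations, Eq. (1.11) reduces to a single matrix-vector recursion. The following is a minimal NumPy sketch; the transition matrix A and emission matrix B are hypothetical placeholders, not quantities defined in the text:

```python
import numpy as np

def update_posterior(p_pred, A, B, x_t):
    """One step of Eq. (1.11): combine the predicted posterior
    P(y_t | X_{1:t-1}) with the emission of x_t and the transition,
    yielding P(y_{t+1} | X_{1:t}).

    p_pred : (S,) array, P(y_t | X_{1:t-1}) over S discrete states
    A      : (S, S) array, A[i, j] = P(y_{t+1} = j | y_t = i)
    B      : (S, V) array, B[i, v] = P(x = v | y = i)
    x_t    : int, observed symbol at time t
    """
    weighted = p_pred * B[:, x_t]   # P(y_t | X_{1:t-1}) P(x_t | y_t)
    p_next = weighted @ A           # sum over y_t against P(y_{t+1} | y_t)
    return p_next / p_next.sum()    # normalize to resolve the proportionality
```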
The prediction of future events x_{t+1}, \ldots, x_{t+k}, k > 0, conditioned on X_{1:t}, is made through the posterior over y_{t+1}:

P(x_{t+1}, \ldots, x_{t+k} \mid X_{1:t}) \propto \sum_{y_{t+1}} P(y_{t+1} \mid X_{1:t}) \, P(x_{t+1}, \ldots, x_{t+k} \mid y_{t+1})    (1.12)
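Continuing the same sketch, one simple use of Eq. (1.12) is to compute the marginal predictive distributions P(x_{t+s} | X_{1:t}) for s = 1, ..., k by pushing the posterior through the dynamics. Note this yields per-step marginals only; the joint over the whole future sequence would additionally require summing over state trajectories:

```python
import numpy as np

def predict_marginals(p_post, A, B, k):
    """Marginal predictive distributions P(x_{t+s} | X_{1:t}), s = 1..k.

    p_post : (S,) array, P(y_{t+1} | X_{1:t}) from update_posterior
    Returns a (k, V) array; row s-1 is the predictive over x_{t+s}.
    """
    preds = []
    p_state = p_post
    for _ in range(k):
        preds.append(p_state @ B)   # marginalize the state over the emission
        p_state = p_state @ A       # propagate the state posterior one step
    return np.stack(preds)
```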
Thus, the information contained in the observations X_{1:t} can be captured by a relatively small hidden state y_{t+1}. Therefore, in order to predict the future, we do not need all previous observations X_{1:t}, only their state representation y_{t+1}. In principle, y_{t+1} may contain a finite history of length k+1, such as x_t, x_{t-1}, \ldots, x_{t-k}. In order to incorporate higher-order dependencies, a representation of the form Y_t = [y_t, y_{t-1}, \ldots, y_{t-k}] can be considered, as sketched below.
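As an illustration of the augmented representation Y_t, a second-order chain P(y_{t+1} | y_t, y_{t-1}) can be folded into an ordinary first-order chain over state pairs. This sketch assumes discrete states; the tensor A2 and the pair-indexing convention are hypothetical:

```python
import numpy as np

def augment_second_order(A2, S):
    """Fold a second-order transition P(y_{t+1} | y_t, y_{t-1}) into a
    first-order transition over the product space of pairs (y_t, y_{t-1}).

    A2 : (S, S, S) array, A2[i, j, l] = P(y_{t+1} = l | y_t = j, y_{t-1} = i)
    Returns an (S*S, S*S) transition matrix over augmented states.
    """
    A_aug = np.zeros((S * S, S * S))
    for i in range(S):                   # y_{t-1}
        for j in range(S):               # y_t
            for l in range(S):           # y_{t+1}
                # pair (j, i) -> pair (l, j); index = newest * S + older
                A_aug[j * S + i, l * S + j] = A2[i, j, l]
    return A_aug
```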
The dynamics of the system (transition and observation functions) can be modeled as linear or nonlinear. In the linear case, the parameters of the model can be estimated using techniques such as expectation-maximization (EM). Nevertheless, in many real problems the system dynamics cannot be approximated linearly. Nonlinearity can be incorporated into the model with different methods: using probabilistic priors plus parametric models [49]; particle filtering, which represents the posterior distribution by a finite number of samples that are updated as new observations arrive [50] (a minimal sketch follows this paragraph); and approximating the posterior P(y_t | X_{1:t}) by a mixture of distributions such as a mixture of Gaussians (MoG) [51]. Recently, the components of the mixture have been relaxed to be non-Gaussian using ICAMM [52] in order to model the nonlinear part of the system dynamics.
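A minimal sketch of the particle-filtering idea of [50] is given below. Here transition_sample and likelihood are hypothetical stand-ins for the (possibly nonlinear) transition and observation models, and multinomial resampling is one common choice among several:

```python
import numpy as np

def particle_filter_step(particles, weights, transition_sample, likelihood, x_t, rng):
    """One bootstrap particle-filter step: the posterior P(y_t | X_{1:t})
    is represented by weighted samples, updated on each new observation."""
    # Propagate each particle through the (nonlinear) transition model
    particles = transition_sample(particles, rng)
    # Reweight by the likelihood of the new observation x_t
    weights = weights * likelihood(x_t, particles)
    weights = weights / weights.sum()
    # Resample to counteract weight degeneracy (multinomial resampling)
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```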
HMMs have evolved through multiple variants and hybrid combinations with other techniques, such as neural networks, weighted transducers, and wavelets. Hidden state-based dynamic models have an extensive range of applications. The objective is to exploit the sequential dependence of the data, inherent in many real-world problems, within a scheme of sequential pattern recognition.
Among the applications of HMMs are the following: speech recognition [53, 54], video event classification and image segmentation [55], people recognition [56], human motion for robotics [57], and handwriting identification [58]. Of particular relevance is the application of HMMs to the event-related dynamics of brain oscillations and, in general, to the causality analysis of physiological phenomena [59]. One example of such analyses is sleep staging, which is approached in [60] using radial basis function (RBF) networks and HMMs to classify EEG recordings measured during afternoon naps. In Chap. 7, we address the sleep staging problem, focusing on the analysis of arousals provoked by apnea.