2011
DOI: 10.1007/978-3-642-23038-7_17

Speeding Up Bayesian HMM by the Four Russians Method

Abstract: Bayesian computations with Hidden Markov Models (HMMs) are often avoided in practice. Instead, due to reduced running time, point estimates, maximum likelihood (ML) or maximum a posteriori (MAP), are obtained and observation sequences are segmented based on the Viterbi path, even though the lack of accuracy and the dependency on the starting points of the local optimization are well known. We propose a method to speed up Bayesian computations which addresses this problem for regular and time-dependent HMMs with discret…
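For context on the segmentation baseline the abstract criticizes, here is a minimal sketch (not from the paper) of Viterbi decoding for a discrete-observation HMM; the function name, array layout, and log-space conventions are illustrative assumptions.

```python
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """Most likely state path for a discrete-observation HMM.

    log_pi: (K,)   log initial state probabilities
    log_A:  (K, K) log transitions, log_A[i, j] = log P(j | i)
    log_B:  (K, M) log emissions,   log_B[k, o] = log P(o | k)
    obs:    (T,)   integer observation sequence
    """
    T, K = len(obs), len(log_pi)
    delta = np.empty((T, K))           # best log-prob of a path ending in state k at t
    psi = np.empty((T, K), dtype=int)  # argmax backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # (K, K): from-state x to-state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):     # follow backpointers
        path[t] = psi[t + 1, path[t + 1]]
    return path
```

The segmentation is then read off as the runs of identical states in the returned path; this single point estimate is what the Bayesian treatment replaces with a posterior over paths.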

Cited by 3 publications (5 citation statements)
References 29 publications

Citation statements (ordered by relevance):
“…In recent years, compression has become an effective computational building block to accelerate algorithms in response to ever-increasing data set sizes across a wide range of fields. The main idea is often to avoid re-computation for identical patterns in the data, as in the case of Hidden Markov Models (HMMs) with discrete-valued observations, whether for the three classical frequentist algorithms [9] or Gibbs sampling for Bayesian HMMs [12].…”
Section: Discussion
confidence: 99%
“…Also for discrete observations, Mahmud et al. [12] used transition-emission operators to substantially accelerate Forward-Backward Gibbs (FBG) sampling [13], a rapidly converging sampler for computations with fully Bayesian HMMs, the fourth problem for HMMs. There, marginal state probabilities conditioned on the data provide a robust alternative to locally optimal ML estimation followed by Viterbi computation.…”
Section: Introduction
confidence: 99%
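The marginal state probabilities mentioned in the statement above are the posterior-decoding quantities P(q_t = k | observations). A minimal sketch, assuming standard scaled forward-backward recursions rather than the accelerated operators of [12]; all names are illustrative:

```python
import numpy as np

def posterior_marginals(pi, A, B, obs):
    """P(q_t = k | obs) via scaled forward-backward (discrete emissions)."""
    T, K = len(obs), len(pi)
    alpha = np.empty((T, K)); beta = np.empty((T, K)); c = np.empty(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):                       # forward pass with rescaling
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):              # backward pass, same scale factors
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1]) / c[t + 1]
    return alpha * beta                         # each row already sums to 1
```

Segmenting by argmax of each row of the result uses all paths' probability mass at every position, instead of committing to the single Viterbi path.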
“…Bayesian methods have acquired a reputation for requiring enormous computational effort and for being difficult to use, owing to the expert knowledge required for choosing prior distributions. It has also been recognized [60,62,73] that they are very powerful and accurate, leading to improved, high-quality results and providing, in the form of posterior distributions, an accurate measure of uncertainty in results. Nevertheless, it is not surprising that a hundred-fold larger computational effort alone prevented wide-spread use.…”
Section: Discussion
confidence: 99%
“…Though there are several schemes available to sample q, [58] has argued strongly in favor of Forward-Backward sampling [57], which yields the Forward-Backward Gibbs sampling (FBG) above. Variations of this have been implemented for segmentation of aCGH data before [60,62,78]. However, since in each iteration a quadratic number of terms has to be calculated at each position to obtain the forward variables, and a state has to be sampled at each position in the backward step, this method is still expensive for large data.…”
Section: Bayesian Hidden Markov Models
confidence: 99%
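To make the quoted cost argument concrete, here is a sketch of one forward-filtering backward-sampling sweep, the state-path update inside FBG. It shows the quadratic (K × K) work per position in the forward pass and the single state draw per position in the backward pass; names and structure are illustrative assumptions, not taken from [57] or [58].

```python
import numpy as np

def sample_state_path(pi, A, B, obs, rng):
    """Draw one path q ~ P(q | obs, parameters) for a discrete-emission HMM."""
    T, K = len(obs), len(pi)
    alpha = np.empty((T, K))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    # Forward pass: a quadratic number of terms (K x K) at each position.
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    # Backward pass: sample one state at each position.
    q = np.empty(T, dtype=int)
    q[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * A[:, q[t + 1]]  # proportional to P(q_t = k | q_{t+1}, obs)
        q[t] = rng.choice(K, p=w / w.sum())
    return q

# Usage with toy parameters: rng = np.random.default_rng(0)
# q = sample_state_path(pi, A, B, obs, rng)
```

Each Gibbs iteration pays this O(T·K²) forward cost plus T categorical draws, which is exactly the per-sweep expense the statement above identifies for large data.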