2010
DOI: 10.1016/j.patcog.2009.06.004
Efficient backward decoding of high-order hidden Markov models

Cited by 17 publications
(8 citation statements)
References 29 publications
“…In contrast to the broad usage of first-order HMMs in applied sciences [66]–[68], published applications of higher-order HMMs are relatively rare, but they have been demonstrated to be powerful extensions of first-order HMMs for several applications, including speech recognition [69]–[76], image segmentation [77]–[79], robotics [80], handwriting recognition [81], and DNA and protein sequence analysis [82]–[85]. Extensions of the mathematical theory of first-order HMMs to higher-order HMMs are comprehensively described in [86]–[89].…”
Section: Introduction
confidence: 99%
“…These models achieved interesting results in pattern recognition and knowledge extraction in areas such as: speech recognition [MHK97,PBW98,EdP10], hydrology [LABP12], biology [EAA + 09, ETD + 11] and agronomy [MLB06, LBBS + 06].…”
Section: HMM2 Properties
confidence: 99%
“…A higher-order HMM generalizes the first-order HMM by extending the dependency from the single immediately preceding state to the R previous states; it is defined by a tuple λ = {A, B} [18], [19], [20]. A is the state transition probability matrix with R + 1 dimensions,…”
Section: Higher-order Hidden Markov Model
confidence: 99%
“…o_T}, the state sequence that is most likely to have generated the input sequence o and the likelihood can be calculated using the following algorithm [18], [19], [20], which is denoted as the HO-HMM algorithm in this paper. 1) Initialization.…”
Section: Higher-order Hidden Markov Model
confidence: 99%
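The decoding step the snippet refers to can be sketched as a Viterbi-style dynamic program over state pairs. The function below is a hedged illustration for the second-order case (R = 2), not the cited HO-HMM algorithm verbatim; the parameter names (pi, A1, A2) are assumptions introduced here:

```python
import numpy as np

def viterbi_second_order(pi, A1, A2, B, obs):
    """Most-likely state path for a second-order HMM (illustrative sketch).

    pi  : (N,)       initial state probabilities
    A1  : (N, N)     first-order transitions, used only at t = 2
    A2  : (N, N, N)  A2[i, j, k] = P(s_t = k | s_{t-2} = i, s_{t-1} = j)
    B   : (N, M)     emission probabilities
    obs : list[int]  observation symbol indices
    """
    T = len(obs)
    if T == 1:
        return [int(np.argmax(pi * B[:, obs[0]]))]

    # 1) Initialization over state pairs (s_1, s_2).
    delta = (pi * B[:, obs[0]])[:, None] * A1 * B[:, obs[1]][None, :]

    # 2) Recursion: maximize over the state two steps back.
    psis = []
    for t in range(2, T):
        cand = delta[:, :, None] * A2           # cand[i, j, k]
        psis.append(cand.argmax(axis=0))        # best i for each (j, k)
        delta = cand.max(axis=0) * B[:, obs[t]][None, :]

    # 3) Termination and path backtracking over pairs.
    j, k = np.unravel_index(delta.argmax(), delta.shape)
    path = [int(j), int(k)]
    for psi in reversed(psis):
        path.insert(0, int(psi[path[0], path[1]]))
    return path
```

Because the recursion tracks pairs of states, each step costs O(N^3) for R = 2, compared with O(N^2) for a first-order Viterbi pass; this is the complexity trade-off that motivates efficient high-order decoding schemes such as the one in the indexed paper.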