2000
DOI: 10.1109/5.880081
Progress in dynamic programming search for LVCSR

Cited by 101 publications (61 citation statements)
References 55 publications
“…The number of insertion, deletion and substitution errors is computed using the best alignment between two token sequences: the manually aligned (reference) and the recognized (test). An alignment resulting from dynamic-programming search strategies is routinely used with success for a large number of speech recognition tasks (Ney & Ortmanns, 2000). Speech recognition toolkits such as HTK (Young et al., 2006) include tools to compute accuracy and related measures from the transcribed data and recognition outputs using this dynamic programming algorithm.…”
Section: Standard Evaluation Phone Recognition Metrics
confidence: 99%
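The alignment described in this statement is the classic minimum-edit-distance (Levenshtein) dynamic program. A minimal sketch (illustrative, not the HTK scoring tool itself) that returns the substitution, deletion, and insertion counts from which word/phone accuracy is derived:

```python
def align_counts(ref, hyp):
    """Best DP alignment of reference vs. hypothesis token sequences.
    Returns (substitutions, deletions, insertions) under unit costs."""
    n, m = len(ref), len(hyp)
    # dp[i][j] = (total_cost, subs, dels, ins) for ref[:i] vs hyp[:j]
    dp = [[None] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = (0, 0, 0, 0)
    for i in range(1, n + 1):                      # ref tokens unmatched -> deletions
        c = dp[i - 1][0]
        dp[i][0] = (c[0] + 1, c[1], c[2] + 1, c[3])
    for j in range(1, m + 1):                      # hyp tokens unmatched -> insertions
        c = dp[0][j - 1]
        dp[0][j] = (c[0] + 1, c[1], c[2], c[3] + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = ref[i - 1] == hyp[j - 1]
            d = dp[i - 1][j - 1]
            diag = (d[0] + (0 if match else 1),
                    d[1] + (0 if match else 1), d[2], d[3])
            u = dp[i - 1][j]
            up = (u[0] + 1, u[1], u[2] + 1, u[3])          # deletion
            l = dp[i][j - 1]
            left = (l[0] + 1, l[1], l[2], l[3] + 1)        # insertion
            dp[i][j] = min(diag, up, left)                 # lowest cost wins
    _, subs, dels, ins = dp[n][m]
    return subs, dels, ins
```

Error rate then follows as (S + D + I) / N, where N is the number of reference tokens.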
“…That is, a standard HMM for one sub-interaction and a dynamic Bayesian network for another sub-interaction can be linked together in a network to represent interactions which include those sub-interactions. We also proposed a one-pass DP search [36] based inference algorithm for NDPM. In the proposed NDPM, it is natural to build a cyclic model to represent repetitive sub-interactions in a complex interaction without increasing the computational cost and redesigning the interaction models.…”
Section: Conclusion and Future Research
confidence: 99%
“…The solution is based on the best model and state sequence that jointly maximize the likelihood along with the segmental sequence. In order to tackle this problem, we exploit the one-pass DP search [36].…”
Section: B. Inference In NDPM
confidence: 99%
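The one-pass DP search cited as [36] is, at its core, time-synchronous Viterbi decoding: one left-to-right pass that keeps, per state, the best partial-path score and a backpointer. A minimal sketch (the state space, transition scores, and observation log-likelihoods are illustrative assumptions, not values from the paper):

```python
def viterbi(obs_loglik, log_trans, log_init):
    """One-pass time-synchronous DP (Viterbi).
    obs_loglik[t][s]: observation log score of state s at time t.
    log_trans[p][s]: log score of transition p -> s.
    log_init[s]: log score of starting in state s.
    Returns (best state sequence, its total log score)."""
    S = len(log_init)
    score = [log_init[s] + obs_loglik[0][s] for s in range(S)]
    back = []                                   # backpointers, one list per time step
    for t in range(1, len(obs_loglik)):
        new, bp = [], []
        for s in range(S):
            # best predecessor for state s at time t (the DP recombination step)
            p = max(range(S), key=lambda q: score[q] + log_trans[q][s])
            new.append(score[p] + log_trans[p][s] + obs_loglik[t][s])
            bp.append(p)
        score, back = new, back + [bp]
    # backtrace from the best final state
    path = [max(range(S), key=lambda s: score[s])]
    for bp in reversed(back):
        path.append(bp[path[-1]])
    return list(reversed(path)), max(score)
```

The same recursion extends to the network-of-models setting by letting the transition scores encode links between sub-interaction models.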
“…erase the word graph and delete the dead paths [7]. After the UWGE process, the lattice is turned into a sausage-like form and keeps every decoding hypothesis.…”
Section: B. The Unconstrained Word Graph Expansion Algorithm
confidence: 99%