7th International Conference on Spoken Language Processing (ICSLP 2002)
DOI: 10.21437/icslp.2002-395
Integration of supra-lexical linguistic models with speech recognition using shallow parsing and finite state transducers

Cited by 1 publication (2 citation statements)
References 6 publications
“…This too proved to be impractical except for very small domains. A more promising approach was to select a subset of the categories in the NL grammar as classes in a class n-gram, and to expand those classes using RTNs, as exemplified by [4,11]. While this approach gives performance that compares favorably to that of a standard class n-gram, it too suffers computationally, in part because our FST technology for speech recognition has been optimized for performance on traditional class n-grams.…”
Section: Introduction (mentioning)
confidence: 99%
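The class-expansion idea in this statement can be made concrete with a small sketch: a class n-gram scores transitions between class labels, while each class rewrites to word sequences drawn from its own sub-network. The Python below is purely illustrative, with invented class names and probabilities, and it flattens the RTN to non-recursive expansions; it is not code from either cited system.

```python
# A minimal sketch of a class bigram whose classes expand to phrase
# sub-networks (a simplified, non-recursive stand-in for the RTN
# expansion described above). All names and probabilities here are
# illustrative assumptions, not taken from the cited papers.

import math

# Class bigram over class labels (plus sentence boundaries <s>, </s>).
class_bigram = {
    ("<s>", "SHOW"): 0.8,
    ("SHOW", "FLIGHT_PHRASE"): 0.7,
    ("FLIGHT_PHRASE", "</s>"): 0.9,
}

# Per-class expansions: each class rewrites to weighted word sequences.
# A true RTN would let classes call other classes recursively; here each
# expansion is a flat list of alternatives.
expansions = {
    "SHOW": [(("show", "me"), 0.6), (("list",), 0.4)],
    "FLIGHT_PHRASE": [(("flights", "to", "boston"), 0.5),
                      (("the", "fare"), 0.5)],
}

def log_prob(class_seq, word_choices):
    """Score one class sequence plus one expansion choice per class."""
    lp = 0.0
    prev = "<s>"
    for cls, words in zip(class_seq, word_choices):
        lp += math.log(class_bigram[(prev, cls)])  # class-level transition
        probs = dict(expansions[cls])
        lp += math.log(probs[words])               # class-internal expansion
        prev = cls
    lp += math.log(class_bigram[(prev, "</s>")])
    return lp

print(log_prob(["SHOW", "FLIGHT_PHRASE"],
               [("show", "me"), ("flights", "to", "boston")]))
```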
“…This paper details our work in developing a technique that satisfies all of the required constraints: computational efficiency, ease of maintenance, and high performance. It is similar to the approach in [4,11], except that the RTN is precompiled into an automatically generated set of multi-word units, such that the resulting n-gram is very similar to a standard class n-gram. An important difference is that there is the opportunity to define multiple classes for a single word.…”
Section: Introduction (mentioning)
confidence: 99%
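To illustrate the precompilation this statement describes, the sketch below enumerates each class's expansions as single multi-word units and inverts the mapping so one unit can carry several class memberships, mirroring the "multiple classes for a single word" point. Class names, units, and weights are invented for illustration; the actual systems compile these structures into FSTs rather than Python dictionaries.

```python
# A minimal sketch of precompiling class expansions into multi-word
# units, so the model reduces to something very close to a standard
# class n-gram. All names and weights are illustrative assumptions.

# Each class enumerates its (finite) expansions as single multi-word units.
compiled_units = {
    "SHOW": {"show_me": 0.6, "list": 0.4},
    "POLITE": {"please": 1.0},
}

# Unlike a classic class n-gram, a unit may belong to several classes,
# e.g. "list" as a verb (SHOW) and as a noun (LIST_NOUN).
compiled_units["LIST_NOUN"] = {"list": 1.0}

# Invert to a unit-to-classes map; ambiguity would be resolved by
# whichever (class, unit) path scores best during decoding.
unit_to_classes = {}
for cls, units in compiled_units.items():
    for unit, p in units.items():
        unit_to_classes.setdefault(unit, []).append((cls, p))

print(unit_to_classes["list"])   # [('SHOW', 0.4), ('LIST_NOUN', 1.0)]
```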