2009 IEEE International Conference on Acoustics, Speech and Signal Processing
DOI: 10.1109/icassp.2009.4960552
Combining mixture weight pruning and quantization for small-footprint speech recognition

Abstract: Semi-continuous acoustic models, where the output distributions for all Hidden Markov Model states share a common codebook of Gaussian density functions, are a well-known and proven technique for reducing computation in automatic speech recognition. However, the size of the parameter files, and thus their memory footprint at runtime, can be very large. We demonstrate how non-linear quantization can be combined with a mixture weight distribution pruning technique to halve the size of the models with minimal per…
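The abstract only outlines the recipe, so the Python sketch below is a rough illustration of the general idea rather than the paper's actual scheme: discard mixture weights that are negligible relative to each state's largest weight, renormalize, and store the survivors through a small log-domain (non-linear) codebook. The pruning criterion, the 16-level codebook, and the function name are assumptions introduced here for illustration.

```python
import numpy as np

def prune_and_quantize(mixture_weights, prune_ratio=1e-3, n_levels=16):
    """Illustrative sketch: prune small mixture weights, then quantize the rest.

    mixture_weights: (n_states, n_densities) array, each row summing to 1.
    prune_ratio and n_levels are hypothetical settings, not values from the paper.
    """
    w = np.asarray(mixture_weights, dtype=float).copy()

    # Prune: zero out weights far below each state's largest weight,
    # then renormalize so each row remains a distribution.
    threshold = prune_ratio * w.max(axis=1, keepdims=True)
    w[w < threshold] = 0.0
    w /= w.sum(axis=1, keepdims=True)

    # Non-linear (log-domain) quantization: map each surviving weight
    # to one of n_levels codebook values spaced uniformly in log space.
    nz = w > 0
    logw = np.log(w[nz])
    edges = np.linspace(logw.min(), logw.max(), n_levels + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    idx = np.clip(np.digitize(logw, edges) - 1, 0, n_levels - 1)

    quantized = np.zeros_like(w)
    quantized[nz] = np.exp(centers[idx])   # at runtime only the small indices need be stored
    quantized /= quantized.sum(axis=1, keepdims=True)
    return quantized
```

With 16 levels, each surviving weight can be stored as a 4-bit index into the codebook, which is where the footprint reduction in this kind of scheme comes from.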

Cited by 1 publication (4 citation statements)
References 7 publications
“…In this framework, speech recognition is the process of selecting between a (potentially infinite) set of alternate transcriptions for a sentence. Equivalently, we can think of this as a classification of the acoustic observation, where each class contains all the realizations of a particular sentence [27].…”
Section: Language Models
confidence: 99%
“…maximum likelihood estimation is used, the variance of the resulting probability estimates will be quite large, and the model will also assign zero probability to many plausible word sequences [27].…”
Section: Search Engine
confidence: 99%
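The quoted remark concerns data sparsity: an unsmoothed maximum-likelihood n-gram estimate assigns zero probability to any word sequence absent from the training data. The toy Python sketch below illustrates that point only; the six-word corpus, the bigram model, and add-one smoothing are assumptions chosen for illustration, not the method of the cited paper.

```python
from collections import Counter

# Toy corpus and bigram counts (hypothetical, for illustration only).
corpus = "the cat sat on the mat".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
vocab = set(corpus)

def p_mle(w1, w2):
    # Maximum-likelihood bigram estimate: relative frequency of (w1, w2).
    return bigrams[(w1, w2)] / unigrams[w1]

def p_add_one(w1, w2):
    # Add-one (Laplace) smoothing: every bigram gets a nonzero count.
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + len(vocab))

print(p_mle("cat", "mat"))      # 0.0 -- plausible but unseen, so MLE rules it out
print(p_add_one("cat", "mat"))  # nonzero after smoothing
```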