5th International Conference on Spoken Language Processing (ICSLP 1998) 1998
DOI: 10.21437/icslp.1998-667
Nonlinear interpolation of topic models for language model adaptation

Abstract: Topic adaptation for language modeling is concerned with adjusting the probabilities in a language model to better reflect the expected frequencies of topical words for a new document. The language model to be adapted is usually built from large amounts of training text and is considered representative of the current domain. In order to adapt this model for a new document, the topic (or topics) of the new document are identified. Then, the probabilities of words that are more likely to occur in the identified …
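As a rough illustration of the interpolation idea the abstract describes, the sketch below contrasts linear interpolation of a background and a topic unigram model with a log-linear (nonlinear) combination. The vocabulary, probabilities, and weight `lam` are invented for illustration; this is not the paper's actual formulation, only a minimal sketch of the general technique.

```python
# Hypothetical sketch: linear vs. log-linear (nonlinear) interpolation
# of a background unigram model with a topic unigram model.
# All numbers and the weight `lam` are illustrative assumptions.

background = {"the": 0.5, "stock": 0.2, "market": 0.3}  # domain model
topic      = {"the": 0.3, "stock": 0.4, "market": 0.3}  # topic model
lam = 0.6  # interpolation weight (assumed)

def linear(w):
    # Convex combination of the two distributions; stays normalized.
    return lam * background[w] + (1 - lam) * topic[w]

def log_linear(w):
    # p(w) proportional to p_bg(w)^lam * p_topic(w)^(1-lam),
    # renormalized over the vocabulary (the nonlinear variant).
    unnorm = {v: background[v] ** lam * topic[v] ** (1 - lam)
              for v in background}
    return unnorm[w] / sum(unnorm.values())
```

Both combinations yield proper distributions, but the log-linear form boosts words favored by both models more sharply than the linear mixture does.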


Cited by 11 publications (1 citation statement) · References 11 publications
“…Inspired by the concept of mixture models, the application of topic mixtures in language modeling tasks can be found decades ago. In their early efforts, manually labeled data is used for training and naive Bayesian text classification method for decoding [6]. An attempt with unsupervised method was made by [7] in which topic decomposition were achieved by vector clustering and Expectation Maximization (EM) algorithms.…”
Section: Introduction
Confidence: 99%