2016
DOI: 10.1016/j.csl.2015.08.007
Getting more from automatic transcripts for semi-supervised language modeling

Cited by 5 publications (4 citation statements). References 23 publications.
“…Latent Semantic Analysis (LSA) [7] and Latent Dirichlet Allocation (LDA) [8] are two widely used techniques for document retrieval, full-text retrieval [9], information retrieval [10] and topic modeling [11]. Similar techniques have been explored for topic identification and dynamic language model adaptation using the vector space model [12], LSA [13], the relevance language model [14], semi-supervised language models [15] and the topic tracking language model [16]. The LDA technique has been widely explored to form unsupervised adapted language models [17] and topic-specific language models for inflectional languages [18].…”
Section: Related Work
confidence: 99%
“…This shows a decrease of 1.5% in the WER on the MIT corpus and a 1.3% decrease in WER on the corpus of spontaneous Japanese. Novotney et al. [9] have explained the benefits of semi-supervised LMs for under-resourced languages over a range of low-resource conditions. They have noted the limitations of back-off LMs and have motivated the robust use of automatic counts as priors for the estimated parameters of a log-linear LM [9].…”
Section: Related Work
confidence: 99%
“…Oger and Linarès [10] have used possibility theory for adapting the LM.…”
Section: Related Work
confidence: 99%