Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) 2014
DOI: 10.3115/v1/p14-2100

Enriching Cold Start Personalized Language Model Using Social Network Information

Abstract: We introduce a generalized framework to enrich the personalized language models for cold start users. The cold start problem is solved with content written by friends on social network services. Our framework consists of a mixture language model, whose mixture weights are estimated with a factor graph. The factor graph is used to incorporate prior knowledge and heuristics to identify the most appropriate weights. The intrinsic and extrinsic experiments show significant improvement on cold start users.
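
The abstract describes a mixture language model whose component weights are estimated with a factor graph. The following is a minimal sketch of the mixture part only, assuming simple add-alpha unigram components; the factor-graph estimation itself is not reproduced, the mixture weights are hypothetical placeholders, and the names `unigram_lm` and `mixture_lm` are invented for illustration.

```python
# Minimal sketch (not the authors' implementation): a cold-start user's
# language model is an interpolation of friends' unigram models. The
# mixture weights would come from the paper's factor graph; here they are
# hypothetical placeholders supplied by the caller.
from collections import Counter


def unigram_lm(tokens, vocab, alpha=1.0):
    """Add-alpha smoothed unigram probabilities over a fixed vocabulary."""
    counts = Counter(tokens)
    total = sum(counts.values()) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}


def mixture_lm(corpora, weights, vocab):
    """Interpolate per-friend unigram LMs with the given mixture weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "mixture weights must sum to 1"
    lms = {u: unigram_lm(toks, vocab) for u, toks in corpora.items()}
    return {w: sum(weights[u] * lms[u][w] for u in lms) for w in vocab}


if __name__ == "__main__":
    corpora = {
        "friend_a": "deep learning for language modeling".split(),
        "friend_b": "social network analysis of language use".split(),
    }
    vocab = sorted({w for toks in corpora.values() for w in toks})
    # Hypothetical weights; the paper estimates these with a factor graph.
    weights = {"friend_a": 0.6, "friend_b": 0.4}
    lm = mixture_lm(corpora, weights, vocab)
    print(sorted(lm.items(), key=lambda kv: -kv[1])[:5])
```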

Cited by 12 publications (15 citation statements)
References: 19 publications
“…Language modeling is a fundamental natural language processing problem and has been well studied for many years [Jurafsky and Martin, 2008]. It is natural to extend language models to be personalized or socialized, since every person has his/her own speech or writing patterns and can be affected by others.…”
Section: Personalized and Socialized Language Models (mentioning)
confidence: 99%
“…Personalized language models were mainly applied to Web search [Croft et al., 2001; Song et al., 2010; Sontag et al., 2012] or collaborative search (where user groups are clustered based on user behaviors instead of explicit connections) [Sun et al., 2005; Teevan et al., 2009; Xue et al., 2009]. Socialized language models have recently been developed; they were also applied to search problems, but to social media text search [Vosecky et al., 2014; Huang et al., 2014; …]. The socialization of language models can incorporate social friends' information as a smoothing factor to alleviate the language sparsity problem.…”
Section: Personalized and Socialized Language Models (mentioning)
confidence: 99%
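
As a rough illustration of the "friends' information as a smoothing factor" idea in the statement above (not the cited papers' exact formulations): a cold-start user's sparse maximum-likelihood model can be interpolated with an aggregate friend model, so words the user never wrote still receive nonzero probability. The function names and the interpolation constant `mu` below are hypothetical.

```python
# Sketch of social smoothing under the above assumptions.
from collections import Counter


def ml_lm(tokens, vocab):
    """Maximum-likelihood unigram probabilities; zero for unseen words."""
    counts = Counter(tokens)
    total = max(sum(counts.values()), 1)
    return {w: counts[w] / total for w in vocab}


def socially_smoothed_lm(user_tokens, friends_tokens, vocab, mu=0.3):
    """Back off the user's sparse LM toward an aggregate friend LM."""
    p_user = ml_lm(user_tokens, vocab)
    p_friends = ml_lm(friends_tokens, vocab)
    # Words unseen in the user's own text get probability mass via friends.
    return {w: (1 - mu) * p_user[w] + mu * p_friends[w] for w in vocab}
```
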
“…It is therefore possible to train personalized LMs because of the reasonable assumption that users with close friend relationships may share common subject topics, wording habits, and linguistic patterns. Personalized LMs are useful in many respects [19,20,21]. In the area of speech recognition, personalization of LMs has been proposed and investigated for both N-gram-based LMs [22] and recurrent neural network LMs (RNNLMs) [23] in the limited prior work.…”
Section: Introduction (mentioning)
confidence: 99%
“…on the specific tracking task. Social information is incorporated into a factor graph model as features (Huang et al., 2014; Yan et al., 2015). These factor-graph-based methods are less efficient…”
Section: Related Work (mentioning)
confidence: 99%