2016
DOI: 10.48550/arxiv.1601.01356
Preprint

From Word Embeddings to Item Recommendation

Abstract: Social network platforms can use the data produced by their users to serve them better. One of the services these platforms provide is a recommendation service. Recommendation systems predict the future preferences of users from their past preferences. The recommendation-systems literature offers various techniques, such as neighborhood-based, machine-learning-based, and matrix-factorization-based methods. In this work, a set of well-known methods from the natural language processing domain…

Year Published: 2017–2023


Cited by 10 publications (18 citation statements)
References 12 publications
“…There are venue recommendation systems in the literature which use word/document embedding techniques. Ozsoy et al. [21] drew an analogy between "sentences and all check-ins per user" and "words and individual check-ins" and employed Doc2Vec to make venue recommendations. Manotumruksa et al. [15] made venue recommendations by inferring vector-space representations of venues.…”
Section: Related Work
confidence: 99%
“…The embedding techniques Word2Vec [18] and Doc2Vec [10] are frequently used for making recommendations, e.g. [21], [15], [35], [30]. Although these techniques are powerful at learning the semantic relations among venues and users, a newer method named FastText [2] can be more performant at representing venues.…”
Section: Introduction
confidence: 99%
“…Word embeddings have been successfully employed in many NLP and Information Retrieval (IR) tasks. For example, word embedding vectors are applied in information retrieval (Ganguly et al. 2015), recommendation systems (Ozsoy 2016), text classification (Ge and Moh 2017), etc.…”
Section: Word Embedding
confidence: 99%
“…Barkan et al. [2] demonstrated that learning latent embeddings for items with W2V can enable inferring item-to-item relations even in the absence of user information. A similar approach is also used by Ozsoy et al. [22]. However, these models use W2V to learn the item embeddings, which treats each user session separately, so there is no connection between the contexts in different sessions.…”
Section: Related Work
confidence: 99%