2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9206871

Using Word2Vec Recommendation for Improved Purchase Prediction

Abstract: Purchase prediction can help e-commerce planners manage their stock and personalise offers. Word2Vec is a well-known method for exploring word relations in sentences for sentiment analysis by creating vector representations of words, and Word2Vec models are used in many works for product recommendation. In this paper, we analyse the effect of item similarities within sessions on purchase prediction performance. We choose items from different positions of the session, and we derive recommendations from the selected ite…
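Based only on the visible part of the abstract (the full text is truncated above), the approach selects items from different positions of a session as the query for a Word2Vec-based recommender. Below is a hedged sketch of that position-based selection step; the specific positions (first, middle, last) and the helper name are hypothetical illustrations, not the paper's exact procedure.

```python
# Hypothetical sketch of the position-based item selection the abstract
# describes: pick items from different positions of a session to use as
# the query for a Word2Vec-style recommender. The positions chosen here
# (first, middle, last) are an illustrative assumption.
def select_query_items(session):
    """Pick items from different positions of one session."""
    if not session:
        return []
    positions = {0, len(session) // 2, len(session) - 1}
    return [session[p] for p in sorted(positions)]

session = ["item_4", "item_8", "item_2", "item_6", "item_5"]
print(select_query_items(session))  # ['item_4', 'item_2', 'item_5']
```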

Cited by 21 publications (10 citation statements). References 40 publications.
“…The word embedding learns linguistic regularities and semantics from the sentences and represents the words by vectorized representations [24]. Recently, some of the recommendation methods [24,26,4] used techniques from Word2Vec to represent text-based features, and some of the recommendation algorithms [25,13] applied the techniques to represent items.…”
Section: Recommendation With Word Embedding (mentioning)
confidence: 99%
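As this statement notes, several citing works apply Word2Vec directly to interaction data rather than text, treating each session as a "sentence" of item IDs. A minimal sketch of that idea using gensim follows; the session data, item IDs, and hyperparameters are illustrative assumptions, not values from the paper.

```python
# Minimal item2vec-style sketch: treat sessions as sentences and item IDs
# as words, so Word2Vec learns item vectors from co-occurrence in sessions.
# Sessions, item IDs, and hyperparameters are illustrative assumptions.
from gensim.models import Word2Vec

# Toy click sessions: each inner list is one user session of item IDs.
sessions = [
    ["item_1", "item_7", "item_3"],
    ["item_7", "item_3", "item_9"],
    ["item_1", "item_9", "item_7"],
]

model = Word2Vec(
    sentences=sessions,
    vector_size=32,   # dimensionality of the item embeddings
    window=3,         # co-occurrence window within a session
    min_count=1,      # keep rare items in this toy corpus
    sg=1,             # skip-gram, a common choice for item embeddings
    epochs=50,
)

# Items that co-occur in similar session contexts end up close in the space.
print(model.wv.most_similar("item_7", topn=2))
```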
“…There are various embedding models that learn item representations from words, sentences, or paragraphs. Among them, word embedding models such as Word2Vec [22], GloVe [25] and FastText [14] are widely used for many recommendation tasks [8], but they have limitations: the order of words is ignored, which leads to the loss of the syntactic and semantic meaning in sentences [41]. We therefore considered two state-of-the-art sentence embedding models instead, specifically Embeddings from Language Models (ELMo) and Sentence-BERT.…”
Section: Item Content Representation (mentioning)
confidence: 99%
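The statement above contrasts word-level embeddings with order-aware sentence embeddings such as Sentence-BERT. A minimal sketch of encoding item descriptions with the sentence-transformers library follows; the checkpoint name and item texts are illustrative assumptions, not choices from the cited work.

```python
# Minimal sketch of item content representation with Sentence-BERT.
# The checkpoint name and item descriptions are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

item_texts = [
    "Wireless noise-cancelling over-ear headphones",
    "Bluetooth in-ear earbuds with charging case",
    "Stainless steel kitchen knife set",
]

# Unlike bag-of-words averaging, the transformer encoder is order-aware,
# so word order contributes to the sentence-level item vector.
embeddings = model.encode(item_texts, convert_to_tensor=True)

# Pairwise cosine similarities between item descriptions.
print(util.cos_sim(embeddings, embeddings))
```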
“…It can perform effectively regardless of how many words are included in the input, but it is constrained to the vector space defined by its training corpus. By incorporating generalizable contexts into the model, Word2Vec has been proven to be more accurate than other models [9,34,35].…”
Section: Recommendation Models (mentioning)
confidence: 99%
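The claim that Word2Vec performs effectively regardless of input length reflects a common usage pattern: average any number of word or item vectors into one query vector, then rank candidates by cosine similarity within the corpus-defined vector space. A self-contained numpy sketch follows, with a toy embedding matrix standing in for trained Word2Vec vectors.

```python
# Sketch of composing a query from a variable number of item vectors and
# ranking the vocabulary by cosine similarity. The embedding matrix is a
# toy stand-in for trained Word2Vec vectors (illustrative assumption).
import numpy as np

rng = np.random.default_rng(0)
vocab = ["item_1", "item_3", "item_7", "item_9"]
E = rng.normal(size=(len(vocab), 32))          # one row per item
E /= np.linalg.norm(E, axis=1, keepdims=True)  # unit-normalize rows

def recommend(query_items, topn=2):
    """Average the query items' vectors and rank all other items by cosine."""
    idx = [vocab.index(i) for i in query_items]
    q = E[idx].mean(axis=0)
    q /= np.linalg.norm(q)
    scores = E @ q  # cosine similarity, since rows and q are unit-length
    order = np.argsort(-scores)
    return [(vocab[i], float(scores[i])) for i in order
            if vocab[i] not in query_items][:topn]

# Works for any query length: one item or a whole session.
print(recommend(["item_7"]))
print(recommend(["item_1", "item_3", "item_9"]))
```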