2018 IEEE International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm.2018.00035

Self-Attentive Sequential Recommendation

Abstract: Sequential dynamics are a key feature of many modern recommender systems, which seek to capture the 'context' of users' activities on the basis of actions they have performed recently. To capture such patterns, two approaches have proliferated: Markov Chains (MCs) and Recurrent Neural Networks (RNNs). Markov Chains assume that a user's next action can be predicted on the basis of just their last (or last few) actions, while RNNs in principle allow for longer-term semantics to be uncovered. Generally speaking, …
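For intuition, here is a minimal, hedged sketch of the kind of self-attentive next-item model the abstract describes; the class and parameter names are ours, and this is not the authors' exact SASRec implementation (assumes PyTorch):

```python
# Minimal sketch of a self-attentive next-item recommender (illustrative,
# not the authors' exact SASRec architecture).
import torch
import torch.nn as nn

class TinySelfAttentiveRec(nn.Module):
    def __init__(self, num_items: int, d: int = 64, max_len: int = 50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d, padding_idx=0)  # id 0 = padding
        self.pos_emb = nn.Embedding(max_len, d)
        self.attn = nn.MultiheadAttention(d, num_heads=1, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))
        self.norm1 = nn.LayerNorm(d)
        self.norm2 = nn.LayerNorm(d)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, seq_len) of item ids, left-padded with 0
        B, L = seq.shape
        pos = torch.arange(L, device=seq.device).unsqueeze(0).expand(B, L)
        h = self.item_emb(seq) + self.pos_emb(pos)
        # Causal mask: position t may attend only to positions <= t
        causal = torch.triu(torch.ones(L, L, dtype=torch.bool, device=seq.device), 1)
        a, _ = self.attn(h, h, h, attn_mask=causal)
        h = self.norm1(h + a)
        h = self.norm2(h + self.ffn(h))
        # Score all items by reusing (tying) the input embedding table
        return h @ self.item_emb.weight.T  # (batch, seq_len, num_items + 1)
```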

Cited by 1,758 publications (1,712 citation statements) | References 35 publications

Citation statements (Order By: Relevance):
“…We test the model with a separate embedding space for input and output sequence representations, and the model's performance drops by a large margin. Prior work in sequential recommendation, e.g., [19], reports similar observations. Although separate embedding spaces are a popular choice in neural language models [30], the item corpus appears to be too sparse to afford two distinct embedding spaces.…”
Section: Performance Analysis (supporting)
confidence: 66%
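The observation above, that tying input and output item embeddings helps on sparse item corpora, corresponds to a one-line design choice. A minimal sketch of the two options (variable names are ours, not from the cited papers; assumes PyTorch):

```python
# Tied vs. separate input/output item embeddings (illustrative).
import torch.nn as nn

d, num_items = 64, 10_000
in_emb = nn.Embedding(num_items + 1, d)

# Tied: reuse the input table as the output projection. Fewer parameters,
# which the observation above suggests suits sparse item corpora.
tied_scores = lambda h: h @ in_emb.weight.T

# Separate: an independent output table, common in language models, but it
# doubles the item-parameter count and can underfit when items are sparse.
out_emb = nn.Embedding(num_items + 1, d)
separate_scores = lambda h: h @ out_emb.weight.T
```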
“…• Self-Attentive Sequential Recommendation (SASRec). Kang and McAuley [19] applied a self-attention-based model to sequential recommendation. It uses the hidden state of the encoder's last layer at the final input position to predict the user's next item.…”
Section: First Order Markov Model (Markov) (mentioning)
confidence: 99%
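A hedged sketch of this prediction rule, reusing the hypothetical TinySelfAttentiveRec from the sketch after the abstract (item ids and counts are illustrative):

```python
# Next-item prediction from the final position's hidden state (illustrative).
import torch

model = TinySelfAttentiveRec(num_items=3416)
seq = torch.tensor([[0, 0, 12, 7, 301]])        # left-padded action history
scores = model(seq)[:, -1, :]                   # last position: (batch, num_items + 1)
next_item = scores[:, 1:].argmax(dim=-1) + 1    # skip the padding id 0
```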
“…The dataset includes 6,040 users and 3,416 items, with a sparsity of 94.44%. As in [8,10], we treat all ratings as observed implicit-feedback instances and sort the feedback by timestamp. For each user, we withhold their last two actions and place them in the validation set and test set, respectively.…”
Section: Experiments 3.1 Datasets (mentioning)
confidence: 99%
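A minimal sketch of this split protocol, assuming a `user_seqs` dict mapping each user to a time-ordered item list (our name, not from the cited paper):

```python
# Leave-last-two-out split: last action -> test, second-to-last -> validation,
# everything earlier -> training (illustrative helper).
def split_leave_two_out(user_seqs):
    train, valid, test = {}, {}, {}
    for u, items in user_seqs.items():
        if len(items) < 3:          # too short to split; keep all for training
            train[u] = items
            continue
        train[u] = items[:-2]
        valid[u] = items[-2]
        test[u] = items[-1]
    return train, valid, test
```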
“…As quantized embedding is a generic method that can directly replace the embedding layer in existing gradient-descent-based recommendation models, we include three representative recommendation models as backbones to test our hypothesis: Generalized Matrix Factorization (GMF) [9] extends conventional matrix factorization by introducing a learned linear layer that weights the latent dimensions; Neural Matrix Factorization (NeuMF) [9] models non-linear interactions between user and item embeddings via multi-layer perceptrons (MLPs); Self-Attentive Sequential Recommendation (SASRec) [10] is the state-of-the-art method on the sequential recommendation task, adopting multiple self-attention blocks to capture sequential dynamics in users' action histories and predicting the next item at each time step. The embedding dimensionality d is set to 64 for all methods.…”
Section: Backbone Recommendation Models (mentioning)
confidence: 99%
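For concreteness, a minimal sketch of GMF as characterized above: a learned linear layer weighting the elementwise product of user and item embeddings (illustrative, not the reference implementation):

```python
# GMF sketch: score(u, i) = w^T (p_u * q_i), with w learned (illustrative).
import torch
import torch.nn as nn

class GMF(nn.Module):
    def __init__(self, num_users: int, num_items: int, d: int = 64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, d)
        self.item_emb = nn.Embedding(num_items, d)
        self.out = nn.Linear(d, 1)  # learned weights over latent dimensions

    def forward(self, u: torch.Tensor, i: torch.Tensor) -> torch.Tensor:
        # Elementwise product of user/item factors, then a learned weighting
        return self.out(self.user_emb(u) * self.item_emb(i)).squeeze(-1)
```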
“…For example, DeepCoNN [27] uses convolutional neural networks to process reviews, jointly modeling users and items from textual reviews with deep learning. Recently, several works have incorporated attention mechanisms into recommender systems [3]-[5], [11], [14], [20]. However, existing deep-learning-based works often focus only on latent feature learning for users and items, ignoring the explainability of recommendations.…”
Section: B Deep Learning For Recommendation (mentioning)
confidence: 99%