Proceedings of the 16th ACM Conference on Recommender Systems 2022
DOI: 10.1145/3523227.3546788
Denoising Self-Attentive Sequential Recommendation

Cited by 39 publications (15 citation statements).
References 29 publications.
“…To optimize the GRU model, we applied a similar approach as for the LSTM model, wherein we replaced the LSTM cells in the RNN architecture with GRU cells. • Transformer (TF): The transformer is an alternative to the RNNs for sequence modeling [5,19,20,35,45]. To learn the hidden representation for the input time series, we utilized the transformer encoder proposed in [35].…”
Section: Methods (mentioning)
confidence: 99%
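The excerpt above describes two sequence encoders: an RNN whose LSTM cells are swapped for GRU cells, and a transformer encoder over the input time series. A minimal sketch of both, assuming PyTorch; all hyperparameters and names are placeholders, not values from the cited paper:

```python
import torch
import torch.nn as nn

# Illustrative only: swapping nn.LSTM for nn.GRU is the entire change
# needed for the GRU variant described in the excerpt.
class RNNSeriesEncoder(nn.Module):
    def __init__(self, input_dim=8, hidden_dim=64, cell="gru"):
        super().__init__()
        rnn_cls = nn.GRU if cell == "gru" else nn.LSTM
        self.rnn = rnn_cls(input_dim, hidden_dim, batch_first=True)

    def forward(self, x):          # x: (batch, seq_len, input_dim)
        out, _ = self.rnn(x)
        return out[:, -1]          # last hidden state as the series representation

# Transformer alternative: a stack of standard encoder layers, then pooling.
layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)
x = torch.randn(32, 50, 64)       # (batch, seq_len, d_model), dummy input
hidden = encoder(x).mean(dim=1)   # pooled hidden representation of the series
```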
“…Another family of methods that can be used in our problem is neural networks. For example, long short-term memory networks [15,17,18,20,45], gated recurrent unit networks [7,20,45], transformers [5,19,20,35,45], and convolutional neural networks [16,29,39] have shown effectiveness in tasks such as time series classification, forecasting, and anomaly detection. Thus, we also include these neural network models in our experiments.…”
Section: Related Work (mentioning)
confidence: 99%
“…For the Amazon product review dataset, we followed a similar approach as described in [6,22,27]. For each user, we included the last 10 reviews in the test dataset.…”
Section: Methods (mentioning)
confidence: 99%
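The split described above is a per-user leave-last-n-out protocol. A sketch, assuming a pandas DataFrame; the file path and the "user_id" and "timestamp" column names are assumptions, not from the paper:

```python
import pandas as pd

# Hypothetical path and schema; the cited work uses the Amazon product
# review dataset but does not prescribe this exact layout.
reviews = pd.read_json("amazon_reviews.json", lines=True)
reviews = reviews.sort_values(["user_id", "timestamp"])

# For each user, the 10 most recent reviews form the test set;
# everything earlier stays in the training set.
test = reviews.groupby("user_id").tail(10)
train = reviews.drop(test.index)
```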
“…TLSRec [7] simultaneously models the global stability and local fluctuation of a user's preference with a hierarchical attention network. Rec-Denoiser [6] adaptively eliminates the noisy items during the training process to remove irrelevant information in a user's behavior sequence. Transformers4Rec [10] performs an empirical analysis with broad experiments of various transformer architectures for the task of sequential recommendation.…”
Section: Related Work (mentioning)
confidence: 99%
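Since Rec-Denoiser is the subject of this report, a rough sketch of the idea the excerpt attributes to it may help: attach trainable masks to the self-attention map so noisy item-to-item attentions can be pruned during training. The sketch below uses a plain sigmoid relaxation as a stand-in for the paper's differentiable binary masks; the actual parameterization, sparsity regularization, and training procedure differ:

```python
import torch
import torch.nn as nn

class DenoisedSelfAttention(nn.Module):
    """Simplified single-head self-attention with a learnable keep-mask."""

    def __init__(self, d_model=64, max_len=50):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One logit per attention entry; sigmoid gives a soft keep-probability.
        self.mask_logits = nn.Parameter(torch.zeros(max_len, max_len))
        self.scale = d_model ** -0.5

    def forward(self, x):             # x: (batch, seq_len <= max_len, d_model)
        L = x.size(1)
        attn = torch.softmax(
            self.q(x) @ self.k(x).transpose(1, 2) * self.scale, dim=-1
        )
        keep = torch.sigmoid(self.mask_logits[:L, :L])
        attn = attn * keep            # down-weight noisy attentions
        attn = attn / attn.sum(-1, keepdim=True).clamp_min(1e-9)  # renormalize
        return attn @ self.v(x)
```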
“…In recent years, numerous deep learning-based recommender system models have been proposed, as documented in [21]- [24]. A particular strand of research aims to substitute the conventional inner-product operation found in Matrix Factorization (MF) models with deep neural networks.…”
Section: B. Deep Learning For Collaborative Filtering (mentioning)
confidence: 99%
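The strand of research described above is exemplified by NCF-style models. As an illustrative sketch (a generic simplification, not any specific cited implementation), replacing the MF inner product with a learned MLP interaction function looks like:

```python
import torch
import torch.nn as nn

class NeuralCF(nn.Module):
    """Scores a user-item pair with an MLP instead of a dot product."""

    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(),
            nn.Linear(dim, 1),
        )

    def forward(self, users, items):
        pair = torch.cat([self.user_emb(users), self.item_emb(items)], dim=-1)
        return self.mlp(pair).squeeze(-1)   # predicted preference score
```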