Proceedings of the 17th ACM Conference on Recommender Systems 2023
DOI: 10.1145/3604915.3610644

Turning Dross Into Gold Loss: is BERT4Rec really better than SASRec?

Anton Klenitskiy, Alexey Vasilev

Abstract: Recently, sequential recommendation and the next-item prediction task have become increasingly popular in the field of recommender systems. Currently, two state-of-the-art baselines are the Transformer-based models SASRec and BERT4Rec. Over the past few years, there have been quite a few publications comparing these two algorithms and proposing new state-of-the-art models. In most of these publications, BERT4Rec achieves better performance than SASRec. But BERT4Rec uses cross-entropy over softmax for all items, while SASR…
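
The loss distinction in the abstract is the crux of the paper. The sketch below (not the authors' code) contrasts the two objectives: SASRec's original binary cross-entropy with sampled negatives versus the full softmax cross-entropy used by BERT4Rec, which the paper applies to the SASRec architecture (SASRec+). It assumes a PyTorch-style setup where `seq_emb` holds the transformer output per sequence position and `item_emb` is the shared item embedding table; all names are hypothetical.

```python
# A minimal sketch of the two training objectives, assuming a PyTorch
# sequential recommender; `seq_emb`, `item_emb`, and the function names
# are illustrative, not taken from the paper's code.
import torch
import torch.nn.functional as F

def bce_loss_sampled(seq_emb: torch.Tensor, item_emb: torch.Tensor,
                     pos_items: torch.Tensor, num_items: int) -> torch.Tensor:
    # Original SASRec objective: binary cross-entropy with one positive
    # and one uniformly sampled negative item per sequence position.
    neg_items = torch.randint(0, num_items, pos_items.shape,
                              device=pos_items.device)
    pos_logits = (seq_emb * item_emb[pos_items]).sum(dim=-1)
    neg_logits = (seq_emb * item_emb[neg_items]).sum(dim=-1)
    pos_loss = F.binary_cross_entropy_with_logits(
        pos_logits, torch.ones_like(pos_logits))
    neg_loss = F.binary_cross_entropy_with_logits(
        neg_logits, torch.zeros_like(neg_logits))
    return pos_loss + neg_loss

def ce_loss_full(seq_emb: torch.Tensor, item_emb: torch.Tensor,
                 pos_items: torch.Tensor) -> torch.Tensor:
    # BERT4Rec-style objective (and the SASRec+ variant): softmax
    # cross-entropy over logits for the entire item catalogue.
    logits = seq_emb @ item_emb.T  # shape: (batch * seq_len, num_items)
    return F.cross_entropy(logits, pos_items)
```

Computing logits over the whole catalogue is more expensive per step than scoring one sampled negative, but it yields a denser training signal; the paper's question is whether BERT4Rec's reported advantage comes from its architecture or simply from this loss.
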

Cited by 15 publications (1 citation statement)
References 28 publications
“…Examples include SASRec [6] and BERT4Rec [9] demonstrating state-of-the-art performance. The effectiveness of these attention-based architectures in modeling sequential data sparked many further improvements and adaptations aimed at enhancing recommendation quality [7,8,13]. Notably, the usage of cross-entropy loss for the SASRec architecture (SASRec+) [7] has demonstrated superior performance compared to other techniques.…”
Section: Related Work
confidence: 99%