Proceedings of the Web Conference 2020 2020
DOI: 10.1145/3366423.3380116
Future Data Helps Training: Modeling Future Contexts for Session-based Recommendation

Abstract: Long session-based recommender systems have attracted much attention recently. Each user may generate hundreds of click behaviors in a short time. To learn item dependencies in long sessions, previous sequential recommendation models resort either to data augmentation or to a left-to-right autoregressive training approach. While effective, an obvious drawback is that future user behaviors are always missing during training. In this paper, we claim that users' future action signals can be exploited to boost the re…

Cited by 93 publications (59 citation statements) | References 31 publications
“…It randomly masks some tokens in a sequence, and then predicts them from the remaining context. Some models [2,21,26,38,39] adopt the MLM task in sequence-based recommendation, indirectly considering the future behaviors after the current position via MLM during training. However, there are optimization biases between the MLM and next-item prediction tasks.…”
Section: Related Work
confidence: 99%
“…Recently, with the rise of pre-training, some works introduce the masked language model (MLM) pre-training task from NLP [7] into sequence-based recommendation, considering bidirectional information to learn better sequential models via self-supervised learning (SSL) [26,40]. The MLM task randomly masks some items in the user behavior sequence during training, and then attempts to predict them from the remaining unmasked context [38,39]. These models can be viewed as indirectly using a specific future feature (i.e., behaviors after the masked item) under SSL in sequence-based recommendation.…”
confidence: 99%
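The masking scheme the citation statements describe can be sketched in a few lines. This is a minimal illustration of MLM-style item masking, not the authors' implementation; `MASK_ID`, `mlm_mask`, and the 15% masking rate are illustrative assumptions:

```python
import random

MASK_ID = 0  # hypothetical id reserved for the [mask] token

def mlm_mask(sequence, mask_prob=0.15, rng=random.Random(42)):
    """Randomly replace items with MASK_ID; return masked sequence and labels.

    Labels hold the original item at masked positions and None elsewhere,
    so the model is trained to predict only the masked items from the
    remaining (bidirectional) context.
    """
    masked, labels = [], []
    for item in sequence:
        if rng.random() < mask_prob:
            masked.append(MASK_ID)
            labels.append(item)   # loss is computed only here
        else:
            masked.append(item)
            labels.append(None)   # no loss at unmasked positions
    return masked, labels

seq = [12, 7, 33, 5, 21, 9, 14, 2]
masked, labels = mlm_mask(seq)
```

Because unmasked items on both sides of a masked position feed the prediction, behaviors *after* a masked item are used indirectly — the "optimization bias" the first statement notes is that this objective differs from the left-to-right next-item prediction used at inference time.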
“…We used the adaptive fusing method for SLi-Rec, which shows the best performance in [28]. • GRec [29]: GRec is a state-of-the-art sequential recommender system without time information; it also leverages future data in a sequence via dilated convolutional neural networks for richer information. • TiSASRec [17]: TiSASRec is a state-of-the-art recommender system that utilizes time information, modeling the time intervals between consecutive interactions in a user's interaction sequence with a self-attention network.…”
Section: Baselines
confidence: 99%
“…• CSRM [37] utilizes memory networks to incorporate the neighbor sessions of the input session. • GRec [42] also leverages future data in a session when learning the session's preference, using dilated convolutional neural networks for richer information. • GCE-GNN [38] is the state-of-the-art SRS that constructs, in addition to the session graphs, a global graph modeling pairwise item transitions over all sessions.…”
Section: Baselines
confidence: 99%
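Both baseline descriptions characterize GRec by its dilated convolutional backbone. The sketch below is a hedged, pure-Python illustration of the core building block — a causal dilated 1-D convolution whose receptive field grows exponentially when layers with dilations 1, 2, 4 are stacked — not GRec's actual encoder-decoder; the scalar sequence and averaging kernel are illustrative assumptions:

```python
def dilated_conv1d(seq, kernel, dilation):
    """Causal dilated 1-D convolution over a scalar sequence (pure Python).

    out[t] = sum_k kernel[k] * seq[t - k*dilation], with implicit zero
    padding on the left so the output has the same length as the input
    and position t never sees positions after t.
    """
    out = []
    for t in range(len(seq)):
        acc = 0.0
        for k, w in enumerate(kernel):
            idx = t - k * dilation
            acc += w * (seq[idx] if idx >= 0 else 0.0)
        out.append(acc)
    return out

# Stacking kernel-size-2 layers with dilations 1, 2, 4 gives a
# receptive field of 1 + (2-1)*(1+2+4) = 8 positions.
x = [1.0] * 8
h = x
for d in (1, 2, 4):
    h = dilated_conv1d(h, [0.5, 0.5], d)
```

With an all-ones input and an averaging kernel, the last output position aggregates the full 8-item window, which is why a few dilated layers suffice to cover sessions with hundreds of clicks.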