Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval 2021
DOI: 10.1145/3404835.3463089

Sequential Recommendation for Cold-start Users with Meta Transitional Learning

Abstract: A fundamental challenge for sequential recommenders is to capture users' sequential patterns, i.e., to model how users transition among items. In many practical scenarios, however, there are a great number of cold-start users with only minimal logged interactions. As a result, existing sequential recommendation models lose their predictive power, since sequential patterns are difficult to learn for users with only limited interactions. In this work, we aim to improve sequential recommendation fo…
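The abstract's notion of modeling how users transition among items can be illustrated with a translation-style score in the spirit of TransRec, where a transition is a vector added to the previous item's embedding. This is a minimal sketch under that assumption, not the paper's actual model; all sizes, names, and embeddings below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 5 items with 4-dim embeddings (sizes are arbitrary).
n_items, dim = 5, 4
item_emb = rng.normal(scale=0.1, size=(n_items, dim))

def transition_score(prev_item, next_item, trans_vec):
    """Score a (prev -> next) transition: higher when emb[prev] + trans_vec
    lands near emb[next] (negative L2 distance)."""
    return -np.linalg.norm(item_emb[prev_item] + trans_vec - item_emb[next_item])

def rank_next(prev_item, trans_vec):
    """Rank all items as candidate next items after prev_item."""
    scores = np.array([transition_score(prev_item, j, trans_vec)
                       for j in range(n_items)])
    return np.argsort(-scores)

# With a zero transition vector, ranking degenerates to nearest-neighbor
# search around the previous item's embedding.
ranking = rank_next(0, np.zeros(dim))
```

A learned (or, for cold-start users, meta-learned) transition vector shifts the query point away from the previous item and toward the region where the likely next item lives.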

Cited by 37 publications (12 citation statements); references 26 publications.

Citation statements (ordered by relevance):
“…The historical sessions are completely different from the dialogue/conversation history used in previous CRS works [2,39,41], as their "historical" information is actually the historical turns of the current dialogue session. The use of historical sessions in CRS also differs largely from the historical user-item interactions [1,31] in traditional recommendation, since recommendation in CRS is strongly constrained by the current user intentions, while traditional recommendation is not. Thus, the main goal of the historical-sessions learner is to learn current-related and regular information from historical sessions without impeding the current-session information.…”
Section: Motivations and Discussion on UCCR
confidence: 99%
“…Building on the concept of meta-learning, which aims to make learning faster, Vartak et al. [2017], Lee et al. [2019], Wei et al. [2020], and Wang, Ding, and Caverlee [2021] adopted meta-learning techniques to alleviate the cold-start problem, where a model has not yet gathered sufficient information to draw any inferences for new users or new items. Liu et al. [2020] applied meta-learning to session-based recommendation [Hidasi et al., 2016].…”
Section: Meta-learning in Recommender Systems
confidence: 99%
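The meta-learning idea described above can be sketched with a first-order MAML (FOMAML) loop: meta-learn an initialization from which a single gradient step on a cold-start user's few support transitions already adapts well. This is an illustrative toy, not MetaTL's actual training procedure; the synthetic data is constructed so that all users share one true transition vector:

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 4

# Toy catalog: item j+4 is item j shifted by one shared "true" transition,
# so consistent (prev -> next) pairs are (j, j+4).  Entirely synthetic.
base = rng.normal(size=(4, dim))
shared_t = np.array([1.0, 0.0, -1.0, 0.5])
item_emb = np.vstack([base, base + shared_t])

def loss_and_grad(t, pairs):
    """Mean squared translation error over (prev, next) pairs and its
    gradient with respect to the transition vector t."""
    diffs = np.array([item_emb[p] + t - item_emb[n] for p, n in pairs])
    return np.mean(np.sum(diffs ** 2, axis=1)), 2 * diffs.mean(axis=0)

def fomaml(tasks, inner_lr=0.1, outer_lr=0.05, steps=200):
    """First-order MAML: meta-learn an initial transition vector t0 that
    one inner gradient step on a user's support pairs adapts well."""
    t0 = np.zeros(dim)
    for _ in range(steps):
        outer_grad = np.zeros(dim)
        for support, query in tasks:
            _, g_support = loss_and_grad(t0, support)
            t_adapted = t0 - inner_lr * g_support        # per-user inner step
            _, g_query = loss_and_grad(t_adapted, query)
            outer_grad += g_query                        # first-order outer grad
        t0 -= outer_lr * outer_grad / len(tasks)
    return t0

# Two "users", each with a few support transitions and one query transition.
tasks = [([(0, 4), (1, 5)], [(2, 6)]),
         ([(1, 5), (2, 6)], [(3, 7)])]
t0 = fomaml(tasks)
```

Because every toy user shares the same true transition, the meta-learned initialization converges to it; with heterogeneous users, t0 would instead land at a point from which each user's own transition vector is reachable in one adaptation step.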
“…Sequential models are partially aware of the dynamics in user behaviors based on changes in the input sequences. These include RNN-based approaches (GRU4Rec [Hidasi et al., 2016] and NARM [Li et al., 2017]), a memory-network-based approach (SUM [Lian et al., 2021]), a self-attention-based model (SASRec [Kang and McAuley, 2018]), and a meta-learning-based method (MetaTL [Wang, Ding, and Caverlee, 2021]).…”
Section: Performance Comparison (RQ1)
confidence: 99%
“…Alleviating the issue mentioned above under the generative task is challenging because it requires more meaningful training data, while recommender systems often face cold-start and data-sparsity issues [14,25,28,32]. S3-Rec [31] and CL4SRec [29] propose to randomly "reorder", "mask", and "crop" sequences as augmented user behavior sequences to enrich the training set.…”
confidence: 99%
“…ASReP [15] and BiCAT [8] propose to train unidirectional and bidirectional Transformers, respectively, with reversed sequences to generate pseudo-prior items. Although they alleviate the user cold-start issue [25,32], the aforementioned issue can still be present: the pseudo-prior items generated by the Transformer most likely do not include sporting goods, because the Transformer is also trained via the NIP task. Besides, training via NIP often requires more iterations to learn accurate user preferences over items, which incurs a substantial computation cost.…”
confidence: 99%