2022
DOI: 10.48550/arxiv.2204.01839
Preprint

Coarse-to-Fine Sparse Sequential Recommendation

Abstract: Sequential recommendation aims to model dynamic user behavior from historical interactions. Self-attentive methods have proven effective at capturing short-term dynamics and long-term preferences. Despite their success, these approaches still struggle with sparse data, from which it is difficult to learn high-quality item representations. We propose to model user dynamics from shopping intents and interacted items simultaneously. The learned intents are coarse-grained and work as prior knowledge for item recom…
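The abstract's coarse-to-fine idea (a coarse intent representation acting as prior knowledge that is combined with fine-grained item representations) can be sketched roughly as follows. Every detail here is an illustrative assumption, not the paper's actual architecture: the table sizes, the additive fusion of intent and item embeddings, and the dot-product scoring are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 4 coarse intents, 10 items, dim 8.
n_intents, n_items, d = 4, 10, 8
intent_emb = rng.normal(size=(n_intents, d))  # coarse-grained intent table
item_emb = rng.normal(size=(n_items, d))      # fine-grained item table

# A user's interaction sequence and an (assumed) intent label per step.
seq_items = np.array([2, 5, 7])
seq_intents = np.array([1, 1, 3])

# Coarse-to-fine input: the intent embedding acts as a prior added to each item.
seq_repr = item_emb[seq_items] + intent_emb[seq_intents]  # shape (3, d)

# Score all candidate items against the last position's representation.
scores = item_emb @ seq_repr[-1]
top_item = int(np.argmax(scores))
```

In a real model the sequence representation would come from a self-attention encoder rather than a raw embedding sum; the sketch only shows where the coarse intent signal enters.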

Cited by 1 publication (2 citation statements)
References 14 publications
“…Early work usually models the sequential dependencies with the Markov Chain assumption [24]. With the advance in deep learning, Recurrent Neural Networks (RNN) based [10,11,20,35], Convolutional Neural Networks (CNN) based [28], Graph Neural Networks (GNN) based [26,37,51], and attention based [13,16,27,36] models have been adopted to explore dynamic user interests that are hidden underneath behavior sequences. Recently, contrastive learning based models [19,23,41] are introduced to extract meaningful user patterns by deriving self-supervision signals.…”
Section: Related Work
confidence: 99%
“…), position-enhanced behavior-aware fusion (lines 10-12), attention score computation (lines 13-14), and attentive aggregation (lines 16-20).…”
confidence: 99%
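The fusion, attention-score, and attentive-aggregation steps named in the quote above follow the standard scaled dot-product attention pattern; a minimal NumPy sketch under that assumption (the function name, sequence length, and dimensions are illustrative, not taken from the cited algorithm):

```python
import numpy as np

def attentive_aggregate(queries, keys, values):
    """Scaled dot-product attention: score keys against queries,
    softmax-normalize the scores, then aggregate values by the weights."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)             # attention scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ values                            # attentive aggregation

rng = np.random.default_rng(1)
seq = rng.normal(size=(5, 16))            # fused behavior sequence: 5 steps, dim 16
out = attentive_aggregate(seq, seq, seq)  # self-attention over the sequence
```

Using the same tensor as queries, keys, and values makes this self-attention; a position-enhanced fusion step would typically add position embeddings to `seq` before this call.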