2022
DOI: 10.1007/978-3-031-00126-0_15

Diffusion-Based Graph Contrastive Learning for Recommendation with Implicit Feedback

Cited by 17 publications (9 citation statements)
References 17 publications
“…For example, GRU4Rec [Hidasi et al, 2016] treats users' behavior sequences as time series data and uses a multi-layer GRU structure to capture the sequential patterns. Moreover, some works, e.g., NARM [Li et al, 2017] and DREAM [Yu et al, 2016], combine attention mechanisms with GRU structures to learn users' dynamic representations. Simultaneously, Convolutional Neural Networks (CNN) have also been explored for sequential recommendation.…”
Section: Sequential Recommendation
confidence: 99%
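The GRU-based sequence modeling described in this statement can be sketched as follows. This is a minimal, untrained illustration of the general idea (embed the clicked items, roll a GRU over the sequence, score candidates against the final hidden state); the dimensions, single-layer setup, and random weights are assumptions for illustration, not details of GRU4Rec itself, which is trained with ranking losses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, d = 50, 8                    # hypothetical catalogue size / hidden dim

E = rng.normal(scale=0.1, size=(n_items, d))   # item embeddings
Wz, Uz = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wr, Ur = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wh, Uh = rng.normal(size=(d, d)), rng.normal(size=(d, d))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h, x):
    """One GRU update: gates decide how much history to keep vs. overwrite."""
    z = sigmoid(x @ Wz + h @ Uz)      # update gate
    r = sigmoid(x @ Wr + h @ Ur)      # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)
    return (1 - z) * h + z * h_tilde

# Treat the user's clicked-item sequence as time-series input.
session = [3, 17, 42, 8]              # hypothetical item ids
h = np.zeros(d)
for item in session:
    h = gru_step(h, E[item])

scores = E @ h                        # score every item as the next click
top5 = np.argsort(-scores)[:5]        # recommend the highest-scoring items
```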
“…For instance, SR-GNN converts sequences into graph-structured data and employs a gated graph neural network to propagate information on the graph. GC-SAN [Xu et al, 2019] dynamically builds a graph for each sequence and models both local and long-range dependencies between items by combining a GNN with a self-attention mechanism. In addition, GCE-GNN builds a global graph and a local graph to model global and local item-transition patterns, respectively.…”
Section: Sequential Recommendation
confidence: 99%
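The session-to-graph conversion that this statement attributes to SR-GNN-style models can be sketched as follows: unique items become nodes, consecutive clicks become directed edges, and normalized in/out adjacency matrices then drive gated-GNN propagation. Variable names and the normalization details here are illustrative assumptions, not taken from the papers.

```python
import numpy as np

def session_to_graph(session):
    """Build normalized out/in adjacency matrices from a click sequence."""
    nodes = list(dict.fromkeys(session))          # unique items, order kept
    idx = {item: i for i, item in enumerate(nodes)}
    n = len(nodes)
    A = np.zeros((n, n))
    for a, b in zip(session, session[1:]):        # each click transition = edge
        A[idx[a], idx[b]] += 1.0
    # Row-normalize outgoing edges; column-normalize (then transpose)
    # for the incoming view. Zero-degree rows/columns stay zero.
    out_deg = A.sum(axis=1, keepdims=True)
    A_out = np.divide(A, out_deg, out=np.zeros_like(A), where=out_deg > 0)
    in_deg = A.sum(axis=0, keepdims=True)
    A_in = np.divide(A, in_deg, out=np.zeros_like(A), where=in_deg > 0).T
    return nodes, A_in, A_out

# A repeated item (5, 9) maps to a single node each.
nodes, A_in, A_out = session_to_graph([5, 9, 5, 7, 9])
print(nodes)  # [5, 9, 7]
```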
“…• GC-SAN [Xu et al, 2019]: This method utilizes a graph neural network and a self-attention mechanism to dynamically capture rich local dependencies.…”
Section: Experimental Settings
confidence: 99%
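The self-attention half of a GC-SAN-style model, as described above, can be sketched with plain scaled dot-product attention: after GNN propagation, each item in the session attends to every other item, mixing in long-range dependencies. All dimensions and the random, untrained weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, d = 4, 8
H = rng.normal(size=(seq_len, d))     # node features after GNN propagation

Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = H @ Wq, H @ Wk, H @ Wv      # query / key / value projections

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

attn = softmax(Q @ K.T / np.sqrt(d))  # each position weighs all others
out = attn @ V                        # long-range mixing of item features
print(out.shape)  # (4, 8)
```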