Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence 2022
DOI: 10.24963/ijcai.2022/333
Enhancing Sequential Recommendation with Graph Contrastive Learning

Abstract: Learning useful interactions between input features is crucial for tabular data modeling. Recent efforts have started to explicitly model feature interactions with graphs, where each feature is treated as an individual node. However, existing graph construction methods either heuristically formulate a fixed feature-interaction graph based on specific domain knowledge, or simply apply an attention function to compute the pairwise feature similarities for each sample. While the fixed graph may be sub-optimal to d…

Cited by 38 publications (7 citation statements) · References 1 publication
“…For instance, [39] introduces a graph-based contrastive learning framework that exploits review information to enhance the user-item interaction graph and improve recommendation performance. Moreover, [40] shows that GNNs can help improve the performance of sequential recommendation models. For example, [41] uses a GNN to model users' session-based behavior sequences.…”
Section: Graph-based Recommendation Methods (mentioning)
confidence: 96%
“…For example, SGL (Wu et al 2021) improved recommendation models by randomly dropping nodes and edges to create contrastive views. GCL4SR (Zhang et al 2022) first performed graph contrastive learning for sequential recommendation, using random neighbor sampling on item transition graphs to obtain stable item representations across different subgraphs. LightGCL (Cai et al 2023) applied singular value decomposition as a graph augmentation to enhance the global collaborative relation learning process.…”
Section: Contrastive Learning For Recommendation (mentioning)
confidence: 99%
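The edge-dropout augmentation attributed to SGL above can be sketched in a few lines. This is an illustrative sketch, not SGL's actual implementation; the function name `edge_dropout`, the `drop_ratio` default, and the seeding scheme are assumptions for the example.

```python
import random

def edge_dropout(edges, drop_ratio=0.1, seed=0):
    """Randomly drop a fraction of edges to create one contrastive view
    of an interaction graph. Two calls with different seeds yield two
    views whose node representations can be contrasted against each other."""
    rng = random.Random(seed)  # seeded so each view is reproducible
    return [e for e in edges if rng.random() >= drop_ratio]
```

Calling it twice with different seeds produces the two correlated views that a contrastive loss (e.g. InfoNCE) then pulls together for the same node and pushes apart for different nodes.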
“…where P_u ∈ R^{1×n} is the interest matrix of user u, and W_T ∈ R^{3n×n} is a trainable weight matrix. AttNet(·) denotes the attention network employed in (Zhang et al 2022). Then the following formulation calculates the probability that user u will interact with the expected item v at the (|S_u|+1)-th step according to S_u:…”
Section: Prediction Layer (mentioning)
confidence: 99%
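The excerpt cuts off before the prediction equation, but a common formulation for this kind of prediction layer scores each candidate item by the inner product of the user interest vector P_u with the item embedding, followed by a softmax. The sketch below assumes that formulation; it is not taken from the paper.

```python
import numpy as np

def predict_next_item(p_u, item_embs):
    """Score candidate items for the next step.

    p_u       : (n,) user interest vector (the row of P_u for user u)
    item_embs : (num_items, n) candidate item embeddings

    Returns a probability distribution over candidate items, assuming
    an inner-product score normalized by a softmax.
    """
    logits = item_embs @ p_u.reshape(-1)   # (num_items,) similarity scores
    exp = np.exp(logits - logits.max())    # numerically stable softmax
    return exp / exp.sum()
```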
“…In contrast to the commonly used user-item graph, to make use of global information in behavior sequences, we propose to mine hard negatives on a WITG G(I, E) [27], where E is the edge set. A WITG contains global item transition patterns extracted from all user behavior sequences in the training set D. G is constructed by traversing every sequence in D. For a sequence s ∈ D, if there is no edge between the items i_m and i_{m+k} in G, we connect them and set the edge weight w(i_m, i_{m+k}) = 1/k, where k represents the importance of a target item i_m to its k-hop neighbor i_{m+k} in s. Otherwise, if there is already an edge between them, we update the edge weight as w(i_m, i_{m+k}) ← w(i_m, i_{m+k}) + 1/k.…”
Section: Weighted Item Transition Graph (WITG) (mentioning)
confidence: 99%
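The WITG construction rule quoted above can be sketched directly: walk every sequence, and for each item pair k hops apart either create an edge with weight 1/k or add 1/k to the existing weight. The `max_hop` cap is an assumption for the example; the excerpt does not state how far k ranges.

```python
from collections import defaultdict

def build_witg(sequences, max_hop=3):
    """Build a weighted item transition graph (WITG) from behavior
    sequences: items i_m and i_{m+k} co-occurring in a sequence
    contribute 1/k to the weight of the edge between them.

    max_hop caps k (an assumption; the excerpt leaves the range open).
    """
    weights = defaultdict(float)  # undirected edge -> accumulated weight
    for seq in sequences:
        for m, item in enumerate(seq):
            for k in range(1, max_hop + 1):
                if m + k >= len(seq):
                    break
                neighbor = seq[m + k]
                edge = tuple(sorted((item, neighbor)))
                weights[edge] += 1.0 / k  # closer pairs weigh more
    return dict(weights)
```

For the single sequence [1, 2, 3] with max_hop=2, the adjacent pairs (1, 2) and (2, 3) each get weight 1, while the 2-hop pair (1, 3) gets weight 1/2.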