The World Wide Web Conference 2019
DOI: 10.1145/3308558.3313488

Graph Neural Networks for Social Recommendation

Abstract: In recent years, Graph Neural Networks (GNNs), which can naturally integrate node information and topological structure, have been demonstrated to be powerful in learning on graph data. These advantages of GNNs provide great potential to advance social recommendation, since data in social recommender systems can be represented as a user-user social graph and a user-item graph, and learning latent factors of users and items is the key. However, building social recommender systems based on GNNs faces challenges. For …
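As a concrete illustration of the two-graph representation the abstract refers to, here is a minimal Python sketch; the edge lists and ratings are invented placeholders, not data from the paper:

# Minimal sketch of the two graphs a social recommender operates on.
from collections import defaultdict

# User-user social graph: undirected friendship edges (invented).
social_graph = defaultdict(set)
for u, v in [(0, 1), (0, 2), (1, 3)]:
    social_graph[u].add(v)
    social_graph[v].add(u)

# User-item graph: each edge carries the user's interaction (e.g., a rating).
user_items = defaultdict(dict)
for u, item, rating in [(0, 10, 4.0), (0, 11, 2.5), (1, 10, 5.0)]:
    user_items[u][item] = rating

print(social_graph[0])  # friends of user 0 -> {1, 2}
print(user_items[0])    # items rated by user 0 -> {10: 4.0, 11: 2.5}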

Cited by 1,477 publications (780 citation statements)
References 32 publications
“…These methods aim to represent each object in the user-item interaction network with a low-dimensional continuous embedding vector z ∈ R^d, expecting that similar users or items have similar embeddings. Among them, graph neural networks (GNNs) [16,24], as a special instantiation of neural networks for structured data, have achieved state-of-the-art performance in information retrieval [10]. Despite their retrieval quality, the computational cost of filtering such a large number of candidates in continuous embedding space is high, due to the O(Nd) complexity of linear search in the worst case, where N is the total number of objects in the corpus.…”
Section: Introduction
confidence: 99%
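The O(Nd) figure in this statement is simply the cost of a brute-force dot-product scan over every candidate embedding. A minimal sketch of that linear search (corpus size N and dimension d are arbitrary choices here, not values from the cited work):

import numpy as np

N, d = 1_000_000, 64  # corpus size and embedding dimension (arbitrary)
item_embeddings = np.random.randn(N, d).astype(np.float32)
query = np.random.randn(d).astype(np.float32)  # e.g., a user embedding

# Linear search: one d-dimensional dot product per candidate -> O(N*d) work.
scores = item_embeddings @ query
top10 = np.argpartition(-scores, 10)[:10]  # indices of the 10 highest scores
print(top10)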
“…Table 2 and Fig. 3 present the recommendation performance of all the methods on the whole dataset and on the cold-start user set, respectively, and we can make the following observations (footnote: the implementation of RSGAN is at https://github.com/Coder-Yu/RecQ): 1) In all the cases in Table 2, RSGAN outperforms its counterparts on both the relevance and ranking metrics. Specifically, RSGAN achieves good results in terms of item ranking, and the largest relative improvement (calculated by comparing with the second-best performance) is more than 11%.…”
Section: B. Recommendation Performance
confidence: 88%
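The "relative improvement" quoted above is the standard gap between the best and second-best metric values; a quick check with made-up numbers (not RSGAN's actual results):

# Hypothetical metric values, for illustration only.
best, second_best = 0.250, 0.224
relative_improvement = (best - second_best) / second_best
print(f"{relative_improvement:.1%}")  # -> 11.6%, i.e., "more than 11%"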
“…The instance in this section just sets an example for other applications. Finally, there is an […]

for each user u do:
    input the seeded friends S_u into the generator;
    generate p_u over the whole user set;
    feed p_u into the first Gumbel-Softmax layer;
    get a one-hot vector representing the friend v;
    look-up operation for the items consumed by this friend;…”
Section: E. Adversarial Training
confidence: 99%
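The reconstructed fragment above samples a discrete friend v through a Gumbel-Softmax layer, which keeps the sampling step differentiable. A hedged PyTorch sketch of that single step; the generator's distribution and the interaction matrix are random stand-ins, and the shapes are assumptions:

import torch
import torch.nn.functional as F

num_users, num_items = 1000, 500
logits = torch.randn(num_users)  # stand-in for the generator's scores p_u over all users

# Straight-through Gumbel-Softmax: the forward pass yields a one-hot vector
# (the sampled friend v); the backward pass uses the soft relaxation.
one_hot = F.gumbel_softmax(logits, tau=0.5, hard=True)
friend_v = one_hot.argmax().item()

# The one-hot vector indexes the friend's consumed items differentiably,
# i.e., the "look-up operation" in the fragment above.
user_item = torch.rand(num_users, num_items)  # invented interaction matrix
friend_items = one_hot @ user_item            # selects friend v's row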
“…Finally, the learned user features are integrated into probabilistic matrix factorization for rating prediction. Based on this, they further proposed a novel attention neural network (GraphRec) (Fan et al. 2019) to jointly model the user-item interactions and user social relations. First, the user embedding is learned by aggregating both (1) the user's opinions towards interacted items with different attention scores and (2) the influence of her social friends with different attention scores.…”
Section: Deep Learning Models with Network Features (DLMs+NFs)
confidence: 99%
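A hedged sketch of the attention aggregation this passage describes: item-space and social-space neighbor messages are each weighted by learned attention scores and then combined into the user's latent factor. The layer shapes and the scoring MLP are assumptions for illustration, not GraphRec's exact architecture:

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionAggregator(nn.Module):
    """Weights each neighbor embedding by a learned attention score w.r.t. the target user."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, user_emb, neighbor_embs):
        # user_emb: (dim,); neighbor_embs: (n, dim)
        pairs = torch.cat([neighbor_embs, user_emb.expand_as(neighbor_embs)], dim=-1)
        alpha = F.softmax(self.score(pairs), dim=0)  # attention over the n neighbors
        return (alpha * neighbor_embs).sum(dim=0)    # attention-weighted aggregation

dim = 16
item_agg, social_agg = AttentionAggregator(dim), AttentionAggregator(dim)
user = torch.randn(dim)
item_opinions = torch.randn(5, dim)  # embeddings of the user's (item, rating) interactions
friends = torch.randn(3, dim)        # embeddings of the user's social friends

item_space = item_agg(user, item_opinions)           # (1) opinions on interacted items
social_space = social_agg(user, friends)             # (2) influence of social friends
user_latent = torch.cat([item_space, social_space])  # combined user representation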