The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 2018
DOI: 10.1145/3209978.3210023
Attentive Recurrent Social Recommendation

Cited by 115 publications (57 citation statements)
References 37 publications
“…MF models user preferences and item properties by factorizing the user-item interaction matrix into two low-dimensional latent matrices. Recently, numerous deep learning techniques (e.g., MLPs [21], CNNs [22], RNNs [23], GNNs [24], autoencoders [25], and the attention mechanism [26]) have been introduced into RS. Compared to traditional RS, deep-learning-based RS is able to model the nonlinearity of data correlations and learn the underlying complex feature representations [18].…”
Section: Related Work, 2.1 Recommender Systems (RS)
Confidence: 99%
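The matrix factorization (MF) idea described in the statement above can be sketched in a few lines. This is a generic illustration, not the paper's model: the function name, the rank `k`, and the SGD hyperparameters are all illustrative choices, and only entries with a positive rating are treated as observed.

```python
import numpy as np

def matrix_factorization(R, k=2, steps=2000, lr=0.01, reg=0.02):
    """Factorize interaction matrix R (users x items) into two low-rank
    latent matrices P (users x k) and Q (items x k) so that R ~ P @ Q.T,
    fit by SGD over the observed (positive) entries."""
    rng = np.random.default_rng(0)
    n_users, n_items = R.shape
    P = rng.normal(scale=0.1, size=(n_users, k))
    Q = rng.normal(scale=0.1, size=(n_items, k))
    observed = [(u, i) for u in range(n_users)
                for i in range(n_items) if R[u, i] > 0]
    for _ in range(steps):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]          # prediction error
            P[u] += lr * (err * Q[i] - reg * P[u])  # gradient step on user factor
            Q[i] += lr * (err * P[u] - reg * Q[i])  # gradient step on item factor
    return P, Q

# Toy interaction matrix; zeros are unobserved entries.
R = np.array([[5, 3, 0],
              [4, 0, 1],
              [1, 1, 5]], dtype=float)
P, Q = matrix_factorization(R)
predictions = P @ Q.T  # dense score matrix, including unobserved cells
```

The deep-learning variants cited above (MLP, CNN, RNN, GNN, autoencoder, attention) replace the inner product `P[u] @ Q[i]` with a learned nonlinear interaction function.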
“…The attention mechanism is widely used in many neural-network-based tasks, such as machine translation [4] and image captioning [53]. Recently, it has also been widely applied to recommender systems [19], [52], [43], [41]. Given the classical collaborative filtering scenario with user-item interaction behavior, NAIS extended classical item-based recommendation models by distinguishing the importance of different historical items in a user's profile [19].…”
Section: Related Work
Confidence: 99%
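The NAIS idea mentioned above, weighting a user's historical items by their relevance to the target item, can be sketched as follows. This is a simplified stand-in, not the actual NAIS model: plain dot-product similarity replaces NAIS's MLP attention network, and `beta` mimics its smoothing exponent on the softmax denominator (`beta=1` recovers a standard softmax).

```python
import numpy as np

def attentive_item_score(target, history, beta=1.0):
    """NAIS-style attentive scoring sketch.

    target:  (d,) embedding of the candidate item
    history: (n, d) embeddings of the user's historical items
    beta:    smoothing exponent on the softmax denominator
    """
    # Attention logits: dot-product similarity stands in here for the
    # MLP attention network used in the actual NAIS model.
    logits = history @ target
    e = np.exp(logits - logits.max())
    weights = e / (e.sum() ** beta)
    # User profile: attention-weighted sum of historical item embeddings,
    # so more relevant historical items contribute more to the score.
    profile = weights @ history
    return float(profile @ target), weights

target = np.array([1.0, 0.0])
history = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [0.5, 0.5]])
score, weights = attentive_item_score(target, history)
```

Replacing the uniform averaging of classical item-based CF with these learned weights is exactly the distinction the citation statement attributes to NAIS.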
“…For example, the dual-flow attention mechanism [21] is used to perform feature-refinement modeling on both dynamic and static aspects. The Transformer can even replace the RNN for feature capture [41]. High-order attention, proposed by Chen et al. [42], is a new attention mechanism that also offers a new way to express high-order, fine-grained image features.…”
Section: Related Work
Confidence: 99%