Proceedings of the Tenth ACM International Conference on Web Search and Data Mining 2017
DOI: 10.1145/3018661.3018689

Recurrent Recommender Networks

Abstract: Recommender systems traditionally assume that user profiles and movie attributes are static. Temporal dynamics are purely reactive, that is, they are inferred after they are observed, e.g. after a user's taste has changed or based on hand-engineered temporal bias corrections for movies. We propose Recurrent Recommender Networks (RRN) that are able to predict future behavioral trajectories. This is achieved by endowing both users and movies with a Long Short-Term Memory (LSTM) [14] autoregressive model that capt…
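The abstract's idea of pairing each user and each movie with an LSTM that evolves a dynamic state alongside a stationary profile can be sketched in miniature. Everything below is an illustrative assumption, not the paper's actual architecture: the scalar LSTM cell, the toy weights, the `predict_rating` helper and its bias term are all hypothetical.

```python
import math

# Hypothetical sketch of the RRN idea: a rating is predicted from both
# stationary (time-invariant) and dynamic (LSTM-evolved) user/movie
# states. Scalar states are used purely for readability.

def lstm_step(x, h, c, W):
    # One LSTM cell step (scalar toy version): input, forget and output
    # gates i, f, o plus candidate g, each from input x and prior state h.
    i = 1 / (1 + math.exp(-(W["i"][0] * x + W["i"][1] * h)))
    f = 1 / (1 + math.exp(-(W["f"][0] * x + W["f"][1] * h)))
    o = 1 / (1 + math.exp(-(W["o"][0] * x + W["o"][1] * h)))
    g = math.tanh(W["g"][0] * x + W["g"][1] * h)
    c_new = f * c + i * g          # updated cell state
    h_new = o * math.tanh(c_new)   # updated hidden (dynamic) state
    return h_new, c_new

def predict_rating(u_static, m_static, u_dyn, m_dyn, bias=3.5):
    # RRN-style decomposition: stationary term + dynamic term + bias.
    return u_static * m_static + u_dyn * m_dyn + bias

W = {"i": (0.5, 0.1), "f": (0.4, 0.2), "o": (0.6, 0.1), "g": (0.3, 0.2)}
h, c = 0.0, 0.0
for rating in [4.0, 5.0, 3.0]:         # user's past ratings drive the state
    h, c = lstm_step(rating, h, c, W)  # evolve the user's dynamic state
r = predict_rating(0.8, 0.9, h, 0.7)   # score a movie with toy embeddings
```

In the paper's framing, the key point this sketch mirrors is that the dynamic states are rolled forward autoregressively, so the model can extrapolate a user's trajectory rather than only react to observed drift.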


Cited by 544 publications (254 citation statements)
References 14 publications
“…[48] took one-hot encodings of items in users' behavior sequences as input, converted them into dense vectors via a lookup operation, and fed them sequentially into a GRU-based RNN to learn the user's historical embedding. Finally, the model projects user u's historical embedding h, the user embedding, and target item i's embedding to predict the probability that user u chooses item i. RRN [49] is the first recurrent recommender network that attempts to capture the dynamics of both user and item representations, where an individual recurrent network is adopted to address the temporal evolution of each user and each item respectively. To capture stationary attributes, it uses an additional set of auxiliary parameters for users and items respectively.…”
Section: RNN-based Models (mentioning)
Confidence: 99%
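The pipeline this citation statement describes — embedding lookup over a behavior sequence, sequential GRU updates into a historical embedding h, then a projection to a choice probability — can be sketched as follows. All names and weights here (`item_embedding`, `gru_step`, the logistic `score` function) are hypothetical toy values, not a trained model:

```python
import math

# Toy embedding tables standing in for learned dense vectors
# produced by the one-hot -> lookup step.
item_embedding = {"item_a": 0.4, "item_b": -0.2, "item_c": 0.7}
user_embedding = {"user_u": 0.3}

def gru_step(x, h, W):
    # One GRU cell step (scalar toy version): update gate z,
    # reset gate r, and candidate state h_cand.
    z = 1 / (1 + math.exp(-(W["z"][0] * x + W["z"][1] * h)))
    r = 1 / (1 + math.exp(-(W["r"][0] * x + W["r"][1] * h)))
    h_cand = math.tanh(W["h"][0] * x + W["h"][1] * r * h)
    return (1 - z) * h + z * h_cand

def score(h_hist, u_emb, i_emb):
    # Project the historical, user, and target-item embeddings
    # into a choice probability via a logistic function.
    return 1 / (1 + math.exp(-(h_hist + u_emb + i_emb)))

W = {"z": (0.5, 0.1), "r": (0.4, 0.2), "h": (0.6, 0.1)}
h = 0.0
for item in ["item_a", "item_b", "item_c"]:    # behavior sequence
    h = gru_step(item_embedding[item], h, W)   # sequential GRU update
p = score(h, user_embedding["user_u"], item_embedding["item_a"])
```

The design point the statement highlights survives even in this toy: the sequence order matters, because each GRU step conditions on the state left by the previous items.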
“…User recurrent models. These treat both user and item representations as recurrent components in DL-based models, which can better capture users' evolving preferences; they include memory-augmented neural networks [51], RNN-based models [52], [53], [74], and recurrent neural networks [49], [50]. For example, [52], [53], [74] use an RNN framework to learn users' long-term interest from their historical behavior sequences, and experiments verify that considering a user's long-term interest is critically valuable for personalized recommendation; e.g., HRNN [52] exceeds GRU4Rec by 3.5% with user representation.…”
Section: Adding Explicit User Representation (mentioning)
Confidence: 99%
“…files and reviews [3], [5]. In particular, deep learning models have been widely studied [13], [15]. AutoRec first proposed the use of autoencoders for recommender systems [17].…”
Section: Fig. (mentioning)
Confidence: 99%
“…A Recurrent Neural Network (RNN) is a class of neural networks that exploits the sequential nature of its input [Wu et al. 2017]. RNNs have been applied to many tasks, such as digit recognition [Graves and Schmidhuber 2009], text generation [Sutskever et al. 2011], language modeling [Mikolov 2012, Graves 2013, Pascanu et al. 2013], speech recognition [Graves et al. 2013], machine translation [Bahdanau et al. 2014, Sutskever et al. 2014] and textual entailment [Bowman et al. 2015, Rocktäschel et al. 2015].…”
Section: Recurrent Neural Network and the Long Short-Term Memory Variant (unclassified)