Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining 2018
DOI: 10.1145/3159652.3159668
Sequential Recommendation with User Memory Networks

Cited by 446 publications (253 citation statements)
References 25 publications
“…Therefore, another key challenge in SRSs is to learn sequential dependencies attentively and discriminatively over user-item interaction sequences with noise. Quite a few works have attempted to address this issue by employing attention models [19] or memory networks [1] to selectively retain and utilize information from the interactions that are truly relevant to predicting the next interaction. The technical progress achieved in these solutions is presented in Section 3.3.…”
Section: Handling User-Item Interaction Sequences with Noise
confidence: 99%
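The selective-retention idea in the excerpt above can be illustrated with a short attention sketch. This is a minimal, hypothetical PyTorch example, not the exact model of [19] or [1]: embeddings of past interactions are scored against a candidate item and softmax-weighted, so interactions irrelevant to the next prediction contribute little to the user representation. All names and shapes are illustrative assumptions.

```python
# Minimal sketch (assumed, not a cited model): attention over a user's
# interaction history that down-weights noisy/irrelevant interactions.
import torch
import torch.nn as nn

class HistoryAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.query_proj = nn.Linear(dim, dim)

    def forward(self, history: torch.Tensor, candidate: torch.Tensor) -> torch.Tensor:
        # history:   (batch, seq_len, dim) embeddings of past interactions
        # candidate: (batch, dim)          embedding of the next-item candidate
        query = self.query_proj(candidate).unsqueeze(2)       # (batch, dim, 1)
        scores = torch.bmm(history, query).squeeze(2)         # (batch, seq_len)
        weights = torch.softmax(scores, dim=1)                # attention over history
        user_vec = torch.bmm(weights.unsqueeze(1), history)   # (batch, 1, dim)
        return user_vec.squeeze(1)                            # (batch, dim)

# Usage: a relevance score per candidate is (user_vec * candidate).sum(-1).
```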
“…Besides the basic RNN structure, some variants have been proposed to capture more complex dependencies in a sequence, such as the hierarchical RNN [13]. However, RNNs are not flawless for SRSs, with shortcomings in two aspects: (1) they easily generate false dependencies due to the overly strong assumption that any adjacent interactions in a sequence must be dependent, which may not hold in the real world because sequences usually contain irrelevant or noisy interactions; and (2) they tend to capture only point-wise dependencies while ignoring collective dependencies (e.g., several interactions collaboratively affecting the next one).…”
Section: Basic Deep Neural Network
confidence: 99%
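To make the point-wise dependency concern concrete, here is a minimal GRU-based next-item sketch (an assumed PyTorch implementation, not any specific cited model): every adjacent interaction feeds the hidden state in order, so noisy interactions still shape the prediction.

```python
# Minimal sketch (assumed): GRU next-item recommender that treats every
# adjacent interaction as dependent, illustrating the point-wise limitation.
import torch
import torch.nn as nn

class GRURecommender(nn.Module):
    def __init__(self, num_items: int, dim: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim, padding_idx=0)
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, num_items)

    def forward(self, item_seq: torch.Tensor) -> torch.Tensor:
        # item_seq: (batch, seq_len) item ids; each step updates the hidden
        # state, so even irrelevant interactions influence the next prediction.
        emb = self.item_emb(item_seq)          # (batch, seq_len, dim)
        _, hidden = self.gru(emb)              # hidden: (1, batch, dim)
        return self.out(hidden.squeeze(0))     # (batch, num_items) next-item scores
```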
“…Chen et al. [3] introduce the memory mechanism to sequential recommender systems, designing a user memory-augmented neural network (MANN) to express feature-level interests. To capture more fine-grained user preferences, Huang et al. [8, 9] use knowledge-base information to enhance the semantic representation of a key-value memory network, called the knowledge-enhanced sequential recommender.…”
Section: Sequence-Aware Recommendation
confidence: 99%
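A simplified illustration of the memory-read operation such models rely on is sketched below. It assumes PyTorch and a per-user memory matrix read with attention; it is not the exact RUM or knowledge-enhanced architecture from the cited works, and all parameter names and shapes are illustrative.

```python
# Simplified sketch (assumed): attention-based read from a per-user memory,
# loosely in the spirit of memory-augmented sequential recommenders.
import torch
import torch.nn as nn

class UserMemoryRead(nn.Module):
    def __init__(self, num_users: int, num_slots: int, dim: int):
        super().__init__()
        # One memory matrix per user, each with `num_slots` slot vectors.
        self.memory = nn.Parameter(torch.randn(num_users, num_slots, dim) * 0.01)

    def forward(self, user_ids: torch.Tensor, item_emb: torch.Tensor) -> torch.Tensor:
        # user_ids: (batch,) long ids;  item_emb: (batch, dim) query embedding
        mem = self.memory[user_ids]                               # (batch, slots, dim)
        attn = torch.softmax(
            torch.bmm(mem, item_emb.unsqueeze(2)).squeeze(2), 1)  # (batch, slots)
        read = torch.bmm(attn.unsqueeze(1), mem).squeeze(1)       # (batch, dim)
        # The read vector is typically merged with a user embedding downstream.
        return read
```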
“…To reduce information overload and satisfy the diverse needs of users, personalized recommender systems have emerged and play increasingly important roles in modern society. These systems provide personalized experiences, serve huge service demands, and bring significant benefits to at least two parties: (1) they help users easily discover products they are interested in; and (2) they create opportunities for product providers to increase revenue.…”
Section: Introduction
confidence: 99%