Proceedings of the 2020 SIAM International Conference on Data Mining (SDM 2020)
DOI: 10.1137/1.9781611976236.11
Multiplex Memory Network for Collaborative Filtering

Cited by 13 publications (9 citation statements)
References 17 publications
“…In addition, most deep recommendation models place less focus on explicitly modeling user-user or item-item high-order semantic correlations, although such relations could provide valuable information for inferring user or item features. Although some existing works [5,15] utilize the co-occurrence relation (co-engagement between users, or co-engagement between items) to define the neighbors for users and items, we argue that such a co-occurrence relation is macro-level and coarse-grained. For instance, Figure 1(a) shows a simple user-item interaction in the movie domain.…”
Section: Introduction
Citation type: mentioning
Confidence: 90%
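For context, defining neighbors from co-occurrence typically means counting how often two items (or two users) are engaged by the same counterpart. The snippet below is a minimal sketch of that idea under my own assumptions; the function name `cooccurrence_neighbors` and the 0/1 interaction-matrix input are illustrative, not code from [5] or [15].

```python
# Illustrative sketch (my assumption, not code from [5] or [15]): defining
# item neighbors by how often two items are co-engaged by the same users.
import numpy as np

def cooccurrence_neighbors(interactions: np.ndarray, k: int = 5) -> np.ndarray:
    """interactions: (num_users, num_items) 0/1 integer matrix, 1 = user engaged item."""
    co_counts = interactions.T @ interactions      # (num_items, num_items) co-engagement counts
    np.fill_diagonal(co_counts, 0)                 # an item is not its own neighbor
    return np.argsort(-co_counts, axis=1)[:, :k]   # indices of the top-k co-engaged items
```

Because this reduces a pair's whole interaction history to a single count, the resulting neighborhood is exactly the macro-level, coarse-grained signal the quoted passage criticizes.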
“…Note that it only focuses on the user's neighbors without accounting for information about similar items. • MMCF is a memory-based model, which models user-user and item-item co-occurrence contexts with memory networks [15]. Different from our methods, it focuses only on co-occurrence relations and ignores high-order semantic transitive relations.…”
Section: Experiments, 4.1 Experiments Setup
Citation type: mentioning
Confidence: 99%
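To make the memory-network description above concrete, here is a hedged sketch of one standard way to read co-occurrence contexts from a memory: neighbor embeddings serve as memory keys and values, the target user-item pair forms the query, and softmax attention returns a weighted neighborhood vector. The class and argument names are assumptions for illustration, not MMCF's actual implementation.

```python
# Hedged sketch (assumed design, not MMCF's implementation): a memory-network
# read over a user's or item's co-occurrence neighbors.
import torch
import torch.nn as nn

class CooccurrenceMemoryRead(nn.Module):
    def __init__(self, num_entities: int, dim: int):
        super().__init__()
        self.keys = nn.Embedding(num_entities, dim)     # memory key slots (one per user/item)
        self.values = nn.Embedding(num_entities, dim)   # memory value slots

    def forward(self, query: torch.Tensor, neighbor_ids: torch.Tensor) -> torch.Tensor:
        # query: (batch, dim), e.g. a fused target user-item embedding
        # neighbor_ids: (batch, k) ids of co-occurring users (or items)
        k = self.keys(neighbor_ids)                                       # (batch, k, dim)
        v = self.values(neighbor_ids)                                     # (batch, k, dim)
        attn = torch.softmax(torch.einsum("bd,bkd->bk", query, k), dim=-1)
        return torch.einsum("bk,bkd->bd", attn, v)                        # weighted neighborhood vector
```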
“…Besides, to make full use of the heterogeneous side information, we use an attention network to learn the importance of different types of heterogeneous side information. Some studies have shown that different types of heterogeneous side information contribute differently to the learning of user and item representations [11], [24]. Treating the heterogeneous side information discriminatively helps ILIG understand users and items more accurately.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
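As a rough illustration of attention over heterogeneous side-information types, the sketch below scores each type's embedding with a small network and fuses them by a softmax-weighted sum. The module name `SideInfoAttention` and the tensor shapes are assumptions, not the cited paper's code.

```python
# Hedged sketch (assumption, not the cited paper's code): attention that learns
# the importance of each type of heterogeneous side information.
import torch
import torch.nn as nn

class SideInfoAttention(nn.Module):
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, type_embeddings: torch.Tensor) -> torch.Tensor:
        # type_embeddings: (batch, num_types, dim), one vector per side-information type
        weights = torch.softmax(self.score(type_embeddings).squeeze(-1), dim=-1)  # (batch, num_types)
        return torch.einsum("bt,btd->bd", weights, type_embeddings)               # fused representation
```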