Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/513

DMRAN: A Hierarchical Fine-Grained Attention-Based Network for Recommendation

Abstract: Conventional methods for next-item recommendation are generally based on RNNs or one-dimensional attention with time encoding. They either struggle to preserve long-term dependencies between different interactions or fail to capture fine-grained user preferences. In this paper, we propose a Double Most Relevant Attention Network (DMRAN) that contains two layers, i.e., Item-Level Attention and Feature-Level Self-Attention, which pick out the most relevant items from the sequence of the user's h…
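The two-level design described in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the function names, dimensions, and the use of scaled dot-product scoring are assumptions made only for illustration of the general idea of feature-level self-attention followed by item-level attention against a candidate item.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def feature_level_self_attention(seq):
    # seq: (n, d) embeddings of the user's interaction history.
    # Scaled dot-product self-attention refines each position
    # using the other positions in the sequence.
    d = seq.shape[1]
    attn = softmax(seq @ seq.T / np.sqrt(d), axis=-1)
    return attn @ seq  # (n, d)

def item_level_attention(seq, query):
    # Weight history items by relevance to the candidate item,
    # then return a single weighted summary vector.
    scores = seq @ query / np.sqrt(seq.shape[1])  # (n,)
    weights = softmax(scores)
    return weights @ seq  # (d,)

rng = np.random.default_rng(0)
history = rng.normal(size=(5, 8))   # 5 interactions, 8-dim embeddings
candidate = rng.normal(size=8)      # candidate next item

refined = feature_level_self_attention(history)
summary = item_level_attention(refined, candidate)
print(summary.shape)  # (8,)
```

A real model would learn projection matrices for queries, keys, and values and score the candidate against the summary; this sketch only shows how the two attention levels compose.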

Cited by 24 publications (8 citation statements)
References 7 publications
“…The latter infers all possible sequences of user choices over all items, which may suffer from intractable computation problems in real-world applications where the number of items is large. Recently, many deep-learning-based approaches have been proposed for the task, which make use of pairwise item-transition information to model the user preference of a given session [2,4,6,18,19,21]. These approaches have achieved encouraging results, but they still face the following issues.…”
Section: Introduction
confidence: 99%
“…Cong et al. [42] distinguish the importance of reviews at both the word level and the sentence level using a hierarchical attention-based network for e-commerce recommendation. Similarly, Wang et al. [43] propose a two-level attention mechanism with time encoding and achieve good performance in next-item recommendation.…”
Section: Related Work
confidence: 99%
“…With the rapid development of deep learning, many efforts have been devoted to modeling personalized user interests in recommendation [12,17,19,21,35,36,38,42,43], especially combinations of recurrent neural networks (RNNs) [8,22] and attention mechanisms [4,30], which have made great progress in capturing long- and short-term user preferences and learning diverse interests. For example, Chen et al. [2] proposed a hierarchical attention network at the category level and item level for long- and short-term interest modeling in micro-video click-through prediction.…”
Section: Introduction
confidence: 99%
“…• Multi-scale time effects. Previous methods usually assume that the effect of micro-videos on user-interest modeling decreases over time implicitly, either captured by an RNN [8,10,11] or learned from timestamp features [20,30]. However, they ignore that the rate at which the importance of micro-videos decays varies from user to user; that is to say, for different users, 1 https://www.jiguang.cn/reports/43.…”
Section: Introduction
confidence: 99%
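The per-user time-decay idea in the excerpt above can be illustrated with a toy weighting scheme. The function, the exponential form, and the `rate` parameter are hypothetical illustrations, not taken from the cited papers:

```python
import numpy as np

def time_decay_weights(timestamps, now, rate):
    # Exponential decay: older interactions get smaller weights.
    # `rate` is a per-user parameter, so the same age can matter
    # more for one user than for another.
    age = now - np.asarray(timestamps, dtype=float)
    w = np.exp(-rate * age)
    return w / w.sum()  # normalized attention-style weights

ts = [1.0, 5.0, 9.0]                                 # interaction times
fast = time_decay_weights(ts, now=10.0, rate=1.0)    # forgets quickly
slow = time_decay_weights(ts, now=10.0, rate=0.05)   # forgets slowly
print(fast.round(3), slow.round(3))
```

With a large rate, nearly all weight concentrates on the most recent interaction; with a small rate, older interactions retain influence. Learning one rate per user is one way to model the user-specific decay the excerpt argues for.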