2019
DOI: 10.5755/j01.itc.48.4.23149
AANMF: Attribute-Aware Attentional Neural Matrix Factorization

Abstract: Matrix Factorization (MF) is one of the most intuitive and effective methods in the Recommendation System domain. It projects sparse (user, item) interactions into dense feature products, which gives the MF model strong generality. To leverage this interaction, recent works use auxiliary information of users and items. Despite their effectiveness, irrationality still exists among these methods, since almost all of them simply add the feature of auxiliary information in dense latent space to the feature of the use…
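The factorization described in the abstract, projecting sparse (user, item) interactions into dense feature products, can be sketched minimally as below. All names and dimensions are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Plain matrix factorization (MF): sparse (user, item) interactions are
# modeled by dense latent factors whose inner product gives the score.
rng = np.random.default_rng(0)
n_users, n_items, k = 4, 5, 3  # assumed toy dimensions

U = rng.normal(size=(n_users, k))  # user latent factors
V = rng.normal(size=(n_items, k))  # item latent factors

def predict(u, i):
    """MF prediction: inner product of user and item latent vectors."""
    return float(U[u] @ V[i])

score = predict(0, 2)
```

In practice U and V would be learned by minimizing reconstruction error over the observed interactions; the sketch only shows the prediction step.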

Cited by 1 publication (1 citation statement)
References 27 publications
“…ACAM [27] learned associations between different attributes through a co-attention module as a way to improve recommendation performance. AANMF [28] employed the attention mechanism in learning the weights between different attributes, and obtained the user-item matching degree through the neural matrix factorization network. Considering that the attributes of users and items also have complex interactions with each other, FG-RS [29] adaptively learned the interactions between different attributes through self-attention networks.…”
Section: Related Work
Confidence: 99%
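The AANMF mechanism quoted above (attention weights over attributes, then a matching degree from a neural factorization network) can be sketched roughly as follows. All names, dimensions, and the single linear matching layer are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Assumed toy setup: one item with 3 attribute embeddings of size 4.
rng = np.random.default_rng(1)
d = 4
attr_embs = rng.normal(size=(3, d))  # e.g. genre, director, year
user_emb = rng.normal(size=(d,))

# Attention: score each attribute against the user, normalize, weight-sum.
scores = attr_embs @ user_emb         # relevance of each attribute
weights = softmax(scores)             # attention weights (sum to 1)
item_emb = weights @ attr_embs        # attribute-aware item embedding

# Matching degree: elementwise product fed to one linear layer, a crude
# stand-in for the neural matrix factorization network in the paper.
W = rng.normal(size=(d,))
match = float(W @ (user_emb * item_emb))
```

The elementwise product followed by a learned projection mirrors the generalized-MF idea underlying neural matrix factorization; a real implementation would learn all parameters end to end.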