2019
DOI: 10.1109/access.2019.2892565
HARSAM: A Hybrid Model for Recommendation Supported by Self-Attention Mechanism

Abstract: Collaborative filtering is one of the most commonly used methods in recommendation systems. However, the sparsity of the rating matrix, the cold-start problem, and the fact that most recommendation algorithms consider only the users while neglecting the relationships between products all limit the effectiveness of recommendation algorithms. In this paper, a deep learning model based on the self-attention mechanism, named HARSAM, is proposed for modeling user interaction data and learning the user's latent preference…
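
The full text is not shown here; as a rough orientation only (hypothetical variable names, not the authors' code), the Python sketch below shows how a self-attention layer can aggregate the embeddings of a user's interacted items into a single latent preference vector, the kind of representation the abstract describes:

import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over item embeddings.
    X: (n_items, d) matrix of embeddings for one user's interactions."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                    # pairwise item-item relevance
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                # row-wise softmax
    return w @ X                                     # contextualized item vectors

def user_preference(X):
    # Pool the attended item vectors into one latent preference vector.
    return self_attention(X).mean(axis=0)

rng = np.random.default_rng(0)
items = rng.normal(size=(5, 8))        # five interacted items, 8-dim embeddings
print(user_preference(items).shape)    # (8,)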

Cited by 17 publications (4 citation statements) · References 18 publications
“…Yu et al. [33] introduced an attention mechanism to filter user behavior sequences when modeling users' short-term preferences and adaptively fused users' long-term and short-term preferences within the attention framework. Peng et al. [34] proposed a hybrid model, HARSAM, which combines the attention mechanism with a deep neural network and uses self-attention to extract correlations between items in different time intervals. To extract more hidden information from users' sparse check-in data, Pang et al. [35] used a hierarchical attention mechanism with a "local-global" structure to measure the contribution of different features and relied on the features with greater contributions when making recommendations.…”
Section: Recommendations Integrating Attention Mechanisms (mentioning)
confidence: 99%
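
As a rough illustration of the adaptive long/short-term fusion attributed to Yu et al. [33] above (the gating form and all names below are assumptions, not the paper's formulation), a learned gate can weight the two preference vectors per user:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fuse_preferences(p_long, p_short, w, b=0.0):
    """Blend long- and short-term user preference vectors.
    w, b parameterize a learned scalar gate (hypothetical form)."""
    alpha = sigmoid(w @ np.concatenate([p_long, p_short]) + b)
    return alpha * p_long + (1.0 - alpha) * p_short

rng = np.random.default_rng(1)
p_long, p_short = rng.normal(size=8), rng.normal(size=8)
w = rng.normal(size=16)
print(fuse_preferences(p_long, p_short, w).shape)  # (8,)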
“…After applying expressions (12) and (13) in (8) and (9), we get the F-SGD weight update rules for the user-feature and item-feature vectors as:…”
Section: A. Fractional SGD (mentioning)
confidence: 99%
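
The cited expressions (8), (9), (12), and (13) are not reproduced in this snippet. For orientation only, the sketch below implements the standard integer-order SGD update for matrix-factorization feature vectors; F-SGD generalizes this step with a fractional-order derivative term whose exact form is given in the cited paper:

import numpy as np

def sgd_mf_step(p_u, q_i, r_ui, lr=0.01, reg=0.02):
    """One plain SGD update for user/item feature vectors.
    F-SGD modifies this step with a fractional-order derivative term
    (expressions (12)-(13) of the cited paper; not reproduced here)."""
    e_ui = r_ui - p_u @ q_i                        # prediction error
    p_new = p_u + lr * (e_ui * q_i - reg * p_u)    # user-feature update
    q_new = q_i + lr * (e_ui * p_u - reg * q_i)    # item-feature update
    return p_new, q_new

rng = np.random.default_rng(2)
p, q = rng.normal(size=4), rng.normal(size=4)
p, q = sgd_mf_step(p, q, r_ui=4.0)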
“…There are different types of recommender systems based on different methods [8]-[13], such as collaborative filtering (CF), content-based filtering (CB), demographic, knowledge-based, community-based, and hybrid recommender systems. The most widely applied among these are CF [14]-[18] and CB [19]-[21].…”
Section: Introduction (mentioning)
confidence: 99%
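
To make the CF approach mentioned above concrete, here is a minimal user-based collaborative-filtering sketch on toy data (cosine similarity over rating rows; illustrative only, not taken from any cited survey):

import numpy as np

def predict_cf(R, user, item):
    """Predict R[user, item] from ratings of similar users (0 = unrated)."""
    sims, vals = [], []
    for v in range(R.shape[0]):
        if v == user or R[v, item] == 0:
            continue
        den = np.linalg.norm(R[user]) * np.linalg.norm(R[v])
        if den > 0:
            sims.append(R[user] @ R[v] / den)   # cosine similarity of rating rows
            vals.append(R[v, item])
    if not sims:
        return 0.0
    sims = np.array(sims)
    return float(sims @ np.array(vals) / sims.sum())  # similarity-weighted mean

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
print(predict_cf(R, user=1, item=1))   # estimate user 1's rating of item 1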
“…Following the remarkable results the Transformer has achieved, a growing body of research builds on the self-attention mechanism. It has been used successfully in a variety of tasks to capture the global dependencies within each input sequence, such as speaker identification, relation extraction, popularity prediction, deep face recognition, and recommendation systems [22]-[28]. We are the first to introduce the self-attention mechanism to the simile recognition task.…”
Section: B. Self-Attention Mechanism (mentioning)
confidence: 99%
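
For readers new to the mechanism, a minimal scaled dot-product self-attention sketch in its generic form (after Vaswani et al., 2017; not the implementation of any specific system cited above):

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention.
    Every output position attends to all input positions, which is
    what lets the mechanism capture global dependencies."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)            # row-wise softmax
    return w @ V

rng = np.random.default_rng(3)
X = rng.normal(size=(6, 16))                     # sequence of 6 tokens
Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)       # (6, 16)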