2022
DOI: 10.1109/access.2022.3202637

Retracted: A Self-Attention Mask Learning-Based Recommendation System

Abstract: The primary purpose of sequence modeling is to record long-term interdependence across interaction sequences, and since the number of items purchased by users gradually increases over time, this brings challenges to sequence modeling to a certain extent. Relationships between terms are often overlooked, and it is crucial to build sequential models that effectively capture long-term dependencies. Existing methods focus on extracting global sequential information, while ignoring deep representations from subsequ…

Cited by 7 publications

References 22 publications