2020
DOI: 10.1109/tcss.2020.2986778
AMNN: Attention-Based Multimodal Neural Network Model for Hashtag Recommendation

Cited by 26 publications (17 citation statements)
References 43 publications
“…These solutions use pattern mining for hashtag recommendation [3,13,22]. Other solutions explore deep learning architectures to learn the different patterns and behaviors from the collection of tweets, and then use inference to recommend the hashtags of the new tweet [7,24,35,44,46]. From our analysis of the state-of-the-art hashtag recommendation solutions, we can say that the pattern mining solutions require high computational time compared to the deep learning solutions.…”
Section: Motivations
confidence: 99%
“…A recent line of work on tag recommendation takes a generative approach (Wang et al, 2019;Yang et al, 2020b). However, their application of GRU shows that tags are treated as an ordered sequence, neglecting the orderless yet interrelated characteristics of tags.…”
Section: Related Work
confidence: 99%
“…AR (1-to-1 vs. 1-to-M): A generalized version of the autoregressive (AR) tag generation models (Wang et al., 2019; Yang et al., 2020b). We employ the Transformer (Vaswani et al., 2017) encoder-decoder architecture, but replace the encoder with BERT for fair comparison with our SOG.…”
Section: Baselines
confidence: 99%
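The AR baseline described in the excerpt above (a BERT encoder feeding a Transformer decoder that generates hashtags autoregressively) can be illustrated with a minimal sketch. This is not the cited authors' implementation: the class name ARTagGenerator, the hyperparameters, and the use of Hugging Face's bert-base-uncased checkpoint are assumptions made here purely for illustration.

```python
import torch
import torch.nn as nn
from transformers import BertModel  # Hugging Face Transformers


class ARTagGenerator(nn.Module):
    """Hypothetical sketch of an autoregressive (AR) tag generator:
    a BERT encoder replaces the Transformer encoder, and a standard
    Transformer decoder emits hashtags one position at a time."""

    def __init__(self, tag_vocab_size: int, d_model: int = 768,
                 nhead: int = 8, num_layers: int = 4):
        super().__init__()
        # BERT encodes the post text; its hidden size (768 for bert-base) sets d_model.
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        self.tag_embedding = nn.Embedding(tag_vocab_size, d_model)
        decoder_layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead,
                                                   batch_first=True)
        self.decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_layers)
        self.out_proj = nn.Linear(d_model, tag_vocab_size)

    def forward(self, input_ids, attention_mask, tag_ids):
        # The BERT text representation serves as the decoder's cross-attention memory.
        memory = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state

        # Causal mask: each tag position may only attend to earlier tags, i.e. the
        # ordered, autoregressive view the excerpt contrasts with treating hashtags
        # as an orderless set.
        seq_len = tag_ids.size(1)
        causal_mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=tag_ids.device),
            diagonal=1,
        )

        tgt = self.tag_embedding(tag_ids)
        hidden = self.decoder(tgt=tgt, memory=memory, tgt_mask=causal_mask)
        return self.out_proj(hidden)  # per-step logits over the hashtag vocabulary
```

At inference time such a model would decode greedily or with beam search, feeding each predicted hashtag back as the input for the next step, which is what makes the generated tag sequence ordered.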
“…In recent years, various methods recommend relevant hashtags to a given microblog by using statistical approaches [1], [2] and neural network approaches [3]-[9] based on textual content. Nevertheless, learning only content in microblogs lacks personalization, since the same content from different users might have different meanings depending on …”
[Figure residue from the citing paper: (a) Example of multiple relations (user-hashtag interaction, user-user social, and hashtag-hashtag co-occurrence) and characteristics that reflect user preferences and hashtag attributes.]
Section: Introduction
confidence: 99%