2019
DOI: 10.32604/cmc.2019.06104
Hashtag Recommendation Using LSTM Networks with Self-Attention

Abstract: On Twitter, people often use hashtags to mark the subject of a tweet, giving tweets specific themes that are easy for people to manage. As the number of tweets grows, how to automatically recommend hashtags for tweets has received wide attention. Previous hashtag recommendation methods converted the task into a multi-class classification problem. However, these methods can only recommend hashtags that appeared in historical data, and cannot recommend new ones. In this …
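The self-attention pooling that the title refers to can be sketched in a few lines. The following is a minimal illustrative sketch in NumPy, not the authors' implementation: the hidden states `H` stand in for the outputs of an LSTM encoder over a tweet's tokens, and the parameter names (`w_att`, `W_out`) and shapes are assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w_att):
    """Pool a sequence of hidden states into one tweet vector.

    H: (T, d) hidden states, one per token (here random stand-ins
       for LSTM outputs).
    w_att: (d,) attention parameter (assumed parameterization).
    Returns the attention-weighted context vector and the weights.
    """
    scores = H @ w_att            # (T,) one relevance score per token
    alpha = softmax(scores)       # attention weights over tokens, sum to 1
    return alpha @ H, alpha       # (d,) weighted sum of hidden states

rng = np.random.default_rng(0)
T, d, n_hashtags = 6, 16, 5       # toy sizes
H = rng.normal(size=(T, d))       # stand-in for LSTM outputs
w_att = rng.normal(size=d)
W_out = rng.normal(size=(d, n_hashtags))

context, alpha = attention_pool(H, w_att)
probs = softmax(context @ W_out)  # distribution over candidate hashtags
```

In a classification-style recommender, the highest-probability entries of `probs` would be the recommended hashtags; the attention weights `alpha` indicate which tokens drove the recommendation.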

Cited by 17 publications (9 citation statements) · References 14 publications
“…GA-ConvGRU consists of two opposite learning systems, namely a generator and a discriminator. Lin uses the ConvGRU model as a generator and attaches a five-layer convolutional neural network as a discriminator [21]. These two systems learn by playing the minimax game.…”
Section: GA-ConvGRU
confidence: 99%
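The minimax game mentioned in this excerpt is the standard adversarial objective; in the usual GAN notation (not spelled out in the excerpt), the generator G and discriminator D jointly optimize:

$$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]$$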
“…The result obtained by using the regression model and the prediction method is very small, and there is a decline in some indicators. After studying the literature related to this article, add relevant theoretical and experimental analysis [20][21][22][23][24]. Comprehensive consideration, the use of MBPR performs better.…”
Section: Network Element Optimization Experiments Analysis
confidence: 99%
“…Moreover taking into account the demesne facts, deep learning models learn highlights and evolution straightforwardly from the data. 15–17 They accelerate the mode of data preparation and can study more intricate data patterns in a replete manner. 18 These methods have been appeared to deliver more exact outcomes than conventional regression‐based modeling.…”
Section: Introduction
confidence: 99%
“…[15][16][17] They accelerate the mode of data preparation and can study more intricate data patterns in a replete manner. 18 These methods have been appeared to deliver more exact outcomes than conventional regression-based modeling. It is accounted for that counterfeit recurrent neural networks (RNNs) with memory, for example, long short-term memory (LSTM) and gated recurrent unit (GRU), are better analyzed than autoregressive integrated moving average (ARIMA) with a huge edge.…”
confidence: 99%