2021
DOI: 10.1016/j.knosys.2021.107094

History-based attention in Seq2Seq model for multi-label text classification

Cited by 37 publications (11 citation statements)
References 57 publications
“…Deep learning methods have shown great potential for acquiring language and image analysis from text libraries and text vectors within neural networks, and are widely used to analyze audio and video, according to [12]. In [13, 14], CNN is prominent among various types of neural networks and is a vital technology enabling many deep learning applications. As shown in [15, 38], CNN has effectively categorized news data, assessed attitudes in tweets and movie reviews, and performed other tasks. The author suggests a multilabel technique built on layered aggregation using several methods in [16].…”
Section: Literature Review
Mentioning; confidence: 99%
“…When it comes to determining which label collection belongs to a certain research instance, neural models are essential. Moreover, [ 38 ] introduces a Neural Network (NN) model as an additional machine learning technique. To improve the NN’s applicability for machine classification problems, modifications to a fluid, adaptive resonance map are proposed.…”
Section: Literature Review
Mentioning; confidence: 99%
“…In recent years, transformer-style models achieved many good results in various tasks. Subsequently, attention mechanisms have become more common and are widely used in classification tasks, such as sentiment classification [29], musical instrument recognition [30], visual recommender systems [31], multilabel text classification [32], and multiple protein subcellular location prediction [33].…”
Section: Attention Mechanism
Mentioning; confidence: 99%
“…Multilabel classification problems are common in various fields, such as disease diagnosis, where a patient can have multiple conditions like heart disease and diabetes. Multilabel classification has been widely used in text recognition (Xiao et al., 2021), video classification (Karagoz et al., 2021), clinical detection and other fields (Blanco et al., 2020).…”
Section: Related Background Theory
Mentioning; confidence: 99%
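
The statement above describes the core idea behind multilabel classification: a single instance (e.g., one patient) may carry several labels at once. As a minimal, self-contained sketch of that idea, and not the method of the cited paper, the Python snippet below scores each label independently with a sigmoid and keeps every label above a threshold; the label names and toy weights are hypothetical, invented only for illustration.

# Minimal multilabel prediction sketch (hypothetical labels, untrained toy weights).
# Each label gets its own sigmoid score, so any subset of labels can be assigned,
# unlike single-label softmax classification where exactly one label wins.
import numpy as np

LABELS = ["heart_disease", "diabetes", "hypertension"]  # hypothetical label set

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_labels(features, weights, bias, threshold=0.5):
    """Score every label independently and return all labels above threshold."""
    scores = sigmoid(features @ weights + bias)  # shape: (num_labels,)
    return [(lbl, float(s)) for lbl, s in zip(LABELS, scores) if s >= threshold]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(4, len(LABELS)))  # toy parameters, not trained
    bias = np.zeros(len(LABELS))
    patient = rng.normal(size=4)                 # toy feature vector for one instance
    print(predict_labels(patient, weights, bias))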