2020
DOI: 10.1109/access.2020.3041763

Discriminative Adaptive Sets for Multi-Label Classification

Abstract: Multi-label classification aims to associate multiple labels with a given data/object instance to describe it more fully. Multi-label data sets are common in many emerging application areas, such as text/multimedia classification, bioinformatics, medical image annotation, and computer vision, to name a few. There is growing interest in efficient and accurate multi-label classification. There are two major approaches to performing multi-label classification: (i) problem transformation methods and (ii) algorithm adap…
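
As a point of reference for the two families named in the abstract, below is a minimal sketch of the problem-transformation route (binary relevance: one independent binary classifier per label) using scikit-learn on synthetic data. This is an illustrative baseline only, not the discriminative adaptive-set method proposed in the paper; the dataset generator, base classifier, and evaluation metric are assumptions.

```python
# Binary relevance sketch: transform the multi-label problem into one
# independent binary problem per label (illustrative only).
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier

# Synthetic multi-label data: each row of Y is a binary label-indicator vector.
X, Y = make_multilabel_classification(n_samples=500, n_features=20,
                                      n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# One LogisticRegression per label column.
clf = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X_tr, Y_tr)
print("Hamming loss:", hamming_loss(Y_te, clf.predict(X_te)))
```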

Cited by 7 publications (3 citation statements)
References 29 publications

“…Huang et al [17] explored multilabel learning, Fan et al [3] introduced MGAN for sentiment analysis, while Yang et al [8] and Yan et al [11] presented SLCABG and CNN-BiGRUAT models, respectively. Ishaq et al [15] leveraged CNN, semantic feature mining, and Word2Vec, and Ghani et al [16] addressed memory-intensive lazy learning methods, collectively enriching sentiment analysis and its applications.…”
Section: 2. Literature Review (mentioning)
Confidence: 99%
“…The proposed approach reduced the computational complexity of the multi-label dataset and improved classification performance by eliminating the useless features using a heuristic forward approach. An improved k-nearest neighbor (kNN) method for multi-label classification with three modified strategies is presented in [9]. The three strategies are (i) Gaussian mixture models for splitting the input space into multiple sub-spaces, (ii) desirable labels, and (iii) using unseen mutual/non-mutual examples in finding the local instances.…”
Section: Literature Review (mentioning)
Confidence: 99%
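
The statement above only summarizes [9]; as a rough sketch of the first strategy alone (a Gaussian mixture model splits the input space into sub-spaces, and a local multi-label kNN is fitted per sub-space), one might write something like the following with scikit-learn. The component count, neighbor count, and routing rule are assumptions, and the "desirable labels" and mutual/non-mutual example strategies are not modeled.

```python
# Sketch of GMM-based sub-space splitting followed by local multi-label kNN.
# Illustrative approximation only; not the actual method of [9].
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import KNeighborsClassifier

X, Y = make_multilabel_classification(n_samples=600, n_features=20,
                                      n_classes=5, random_state=0)

# Strategy (i): split the input space into sub-spaces with a Gaussian mixture.
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
assign = gmm.predict(X)

# One local kNN per sub-space (KNeighborsClassifier accepts multi-label targets).
local_knn = {
    c: KNeighborsClassifier(n_neighbors=min(5, int(np.sum(assign == c))))
         .fit(X[assign == c], Y[assign == c])
    for c in np.unique(assign)
}

def predict(x):
    c = gmm.predict(x.reshape(1, -1))[0]           # route to a sub-space
    return local_knn[c].predict(x.reshape(1, -1))  # local multi-label kNN vote

print(predict(X[0]))
```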
“…However, only a few studies have employed MLC. In contrast to Single-Label Classification (SLC), or simply data classification, which associates an example with a single label, MLC allows an instance to be associated with multiple labels, thereby increasing the complexity of classification tasks [21,22]. Further details on this topic will be highlighted in the Background section.…”
Section: Introduction (mentioning)
Confidence: 99%
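
To make the SLC/MLC contrast from the quoted passage concrete, here is a toy illustration of the two target formats, assuming the common binary indicator-matrix representation for multi-label targets:

```python
import numpy as np

# Single-label classification (SLC): exactly one class per example.
y_slc = np.array([2, 0, 1])

# Multi-label classification (MLC): a binary indicator row per example,
# so one example can carry several labels at once.
Y_mlc = np.array([[1, 0, 1, 0],   # example 0 has labels {0, 2}
                  [0, 1, 1, 1],   # example 1 has labels {1, 2, 3}
                  [0, 0, 0, 1]])  # example 2 has label  {3}

print(y_slc.shape, Y_mlc.shape)   # (3,) vs. (3, 4)
```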