DOI: 10.23860/thesis-zhang-yazhou-2018

Deep Generative Model for Multi-Class Imbalanced Learning

Abstract: Label efficiency has become an increasingly important objective in deep learning applications. Active learning aims to reduce the number of labeled examples needed to train deep networks, but the empirical performance of active learning algorithms can vary dramatically across datasets and applications. It is difficult to know in advance which active learning strategy will perform best in a given application. To address this, we propose the first adaptive algorithm selection strategy for deep active lea…
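As background for the abstract above: active learning strategies differ mainly in how they score unlabeled examples for labeling. A minimal sketch of one generic baseline, entropy-based uncertainty sampling, is shown below; the function name is hypothetical, and this is not the adaptive selection strategy the abstract proposes.

```python
import numpy as np

def entropy_acquisition(probs, k):
    """Pick the k unlabeled examples whose predicted class
    distribution has the highest entropy (most uncertain).

    probs: (n_examples, n_classes) array of model probabilities.
    Returns the indices of the k most uncertain examples.
    """
    eps = 1e-12  # avoid log(0) for confident predictions
    ent = -(probs * np.log(probs + eps)).sum(axis=1)
    return np.argsort(ent)[::-1][:k]
```

Strategies like this one can win on one dataset and lose on another, which is exactly the variability the abstract points to.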

Cited by 5 publications (10 citation statements); references 41 publications.
“…By leveraging the synthetic sample generation capabilities of GANs, this work lays out the theoretical foundation and implementation strategies of a GAN-based balancing approach, highlighting its potential to improve classification performance on all classes [1,30].…”
Section: GANs-Based Balancing
confidence: 99%
“…Triple-GAN (TGAN) is a three-player architecture implemented to solve the imbalanced-dataset problem through oversampling. It sets up an adversarial relationship between a classifier (c), a generator (g), and a discriminator (d) [30]. Although the generation process benefits from this setup, the classifier is unable to learn from the generated samples because of the adversarial relationship.…”
Section: GANs-Based Balancing
confidence: 99%
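The GAN-based balancing idea quoted above is too heavy to sketch in full here, but a simplified stand-in conveys it: fit a generative model per class and sample synthetic minority examples until classes are balanced. The sketch below uses a per-class diagonal Gaussian in place of a learned GAN generator; the function name and the Gaussian assumption are mine, not the cited work's.

```python
import numpy as np

def gaussian_oversample(X, y):
    """Balance a dataset by sampling synthetic minority examples
    from a per-class Gaussian fitted to the real data.

    A stand-in for GAN-based balancing: a real Triple-GAN learns
    the generator adversarially instead of assuming normality.
    """
    rng = np.random.default_rng(0)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()  # grow every class to the majority size
    X_out, y_out = [X], [y]
    for c, n in zip(classes, counts):
        need = target - n
        if need == 0:
            continue
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        # diagonal covariance keeps the sketch simple and stable
        sigma = Xc.std(axis=0) + 1e-6
        synth = rng.normal(mu, sigma, size=(need, X.shape[1]))
        X_out.append(synth)
        y_out.append(np.full(need, c))
    return np.vstack(X_out), np.concatenate(y_out)
```

A learned generator can capture multimodal class distributions that this Gaussian stand-in cannot, which is the motivation for the adversarial setup.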
“…Other methods. Other paradigms include task-specific architecture design (Wang et al, 2021c; Zhou et al, 2020; Wang et al, 2021a), transfer learning (Liu et al, 2019; Yin et al, 2019), domain adaptation (Jamal et al, 2020), semi-supervised learning, and self-supervised learning (Yang and Xu, 2020), all of which demand nontrivial architecture design or external data.…”
Section: Imbalanced Learning
confidence: 99%
“…The zero-shot classification performance of VLMs on imbalanced datasets is limited due to several factors, including the inherent bias in the pre-training data, the lack of exposure to the tail classes during pre-training, and the lack of techniques to mitigate the effects of class imbalance (Schuhmann et al, 2022). As a result, VLMs often perform poorly on tail classes, which can be critical for many safety- or health-related applications, such as autonomous driving and medical diagnosis (Yang and Xu, 2020). Hence, it is intuitive to ask: does this long-tailed pre-training actually transfer to or influence downstream long-tailed classes?…”
Section: Introduction
confidence: 99%
“…However, standard datasets available for such tasks are typically high-dimensional and unbalanced, with class labels that are not equally represented. This imbalance can potentially impact the performance of the classifier [8,9].…”
Section: Introduction
confidence: 99%
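Besides generative oversampling, the imbalance described in the statement above is commonly countered by reweighting the training loss inversely to class frequency. A minimal sketch, using the widely used "balanced" weighting formula (the helper name is hypothetical):

```python
import numpy as np

def inverse_frequency_weights(y):
    """Per-class loss weights proportional to 1/frequency.

    Uses w_c = n_samples / (n_classes * count_c), so the
    sample-weighted average weight over the dataset equals 1.
    Returns a {class_label: weight} dict.
    """
    classes, counts = np.unique(y, return_counts=True)
    w = counts.sum() / (len(classes) * counts)
    return dict(zip(classes.tolist(), w.tolist()))
```

Rare classes receive proportionally larger weights, so the classifier's loss is no longer dominated by the majority class.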