2019
DOI: 10.1016/j.image.2019.03.010
Feature augmentation for imbalanced classification with conditional mixture WGANs

Cited by 23 publications (16 citation statements); references 16 publications.
“…Besides geometric transformations, more complex guided-augmentation methods, such as GANs [23], may be used for DA. Reference [24] proposes a general framework for GAN-based DA in feature space for imbalanced classification. Experiments on three databases (SVHN, FER2013, and Amazon Review of Instant Video) show significant improvement from feature augmentation with GANs.…”
Section: Data Augmentation for FER (citation type: mentioning)
confidence: 99%
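The feature-space augmentation idea quoted above can be sketched in a few lines. This is a minimal illustration, not the method of [24]: `generate_features` is a hypothetical stand-in (a per-class Gaussian sampler) for a trained conditional GAN generator, and the data, dimensions, and class counts are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced feature set: class 0 is the majority, class 1 the minority.
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 16)),
               rng.normal(3.0, 1.0, size=(20, 16))])
y = np.array([0] * 200 + [1] * 20)

def generate_features(X_c, n):
    """Hypothetical stand-in for a trained conditional generator:
    sample n synthetic features from a Gaussian fitted to one class."""
    mu, sigma = X_c.mean(axis=0), X_c.std(axis=0) + 1e-8
    return rng.normal(mu, sigma, size=(n, X_c.shape[1]))

def augment_to_balance(X, y):
    """Top up every minority class with synthetic features until all
    classes match the majority-class count."""
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_parts, y_parts = [X], [y]
    for c, n_c in zip(classes, counts):
        if n_c < target:
            X_parts.append(generate_features(X[y == c], target - n_c))
            y_parts.append(np.full(target - n_c, c))
    return np.vstack(X_parts), np.concatenate(y_parts)

X_aug, y_aug = augment_to_balance(X, y)
```

The point of augmenting in feature space rather than pixel space is that the classifier head can be retrained on a balanced feature set without touching the raw data.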
“…
• DELTA [13]
• DIFA+cMWGAN: the adversarial feature augmentation method based on [15], with the GAN structure replaced by cMWGAN [14].
…”
Section: Results and Comparisons (citation type: mentioning)
confidence: 99%
“…A feature extractor is trained with the source dataset under supervised learning, and then a feature generator for the unlabeled target dataset is trained in the convolutional GAN (CGAN) framework [19] against the feature extractor. Zhang et al [14] developed a more general feature generation framework for imbalanced classification, inspired by the adversarial feature augmentation approach. These methods can generate domain-invariant features without considering the modality of the data.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
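The adversarial training structure described in the quote, a feature generator trained against a critic in feature space, can be illustrated with a deliberately tiny WGAN-style loop. Everything below is an illustrative assumption, not the architecture of [14] or [15]: the "real" features are a Gaussian cluster, the generator is just a learned shift `g(z) = z + b`, the critic is linear `f(x) = x @ w`, and the hyperparameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 2                                # feature dimension (toy)
mu_real = np.array([3.0, -2.0])      # mean of the "real" feature cluster

# Deliberately simplified models: generator g(z) = z + b, critic f(x) = x @ w.
b = np.zeros(d)                      # the only trained generator parameter here
w = np.zeros(d)                      # critic weights
clip, lr_c, lr_g, batch = 0.1, 0.05, 0.5, 256

for _ in range(500):
    for _ in range(5):               # several critic updates per generator update
        real = rng.normal(mu_real, 1.0, size=(batch, d))
        fake = rng.normal(0.0, 1.0, size=(batch, d)) + b
        # Critic ascends E[f(real)] - E[f(fake)]; for linear f the gradient
        # in w is just the difference of the batch means.
        w += lr_c * (real.mean(axis=0) - fake.mean(axis=0))
        w = np.clip(w, -clip, clip)  # weight clipping (Lipschitz constraint)
    # Generator ascends E[f(fake)]; for linear f the gradient in b is w.
    b += lr_g * w

fake = rng.normal(0.0, 1.0, size=(1000, d)) + b  # generated features
```

After training, the mean of the generated features sits near the real cluster mean; that convergence of generated feature statistics toward the real ones is the qualitative behavior feature-level GAN augmentation relies on.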
“…A few pioneering works propose generative-based feature augmentation approaches for domain adaptation [66], imbalanced classification [79], and few-shot learning [8]. Feature normalization plays an important role in neural network training [27,34,49,36].…”
Section: Related Work (citation type: mentioning)
confidence: 99%