2020 25th International Conference on Pattern Recognition (ICPR) 2021
DOI: 10.1109/icpr48806.2021.9413249
Hierarchical Mixtures of Generators for Adversarial Learning

Cited by 3 publications (2 citation statements)
References 12 publications
“…Here, we use adversarial learning to enable z c to include common features of EEG signals, but not class-specific features. Unlike other studies [54], [55], no generator is required because input data distributions are not utilized in this study. The training objective is to train f c for extraction of common features.…”
Section: B. Adversarial Learning
confidence: 99%
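The excerpt above describes adversarial learning without a generator: a feature extractor f_c is trained against a discriminator so that the shared code z_c keeps common features of the EEG signals but sheds class-specific ones. A minimal numpy sketch of that objective, assuming a linear extractor and a logistic discriminator (all weights, shapes, and names here are illustrative, not from the cited papers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: f_c is a linear feature extractor; d tries to predict the
# class label from z_c = f_c(x). Training f_c to *maximize* d's loss
# pushes z_c toward class-agnostic ("common") features -- note that no
# generator appears anywhere in this objective.
W_f = rng.normal(size=(8, 4))   # feature-extractor weights (illustrative)
W_d = rng.normal(size=(4, 1))   # discriminator weights (illustrative)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def losses(x, y):
    z_c = x @ W_f                     # common-feature code z_c
    p = sigmoid(z_c @ W_d).ravel()    # discriminator's class prediction
    d_loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    f_loss = -d_loss                  # adversarial objective for f_c
    return d_loss, f_loss

x = rng.normal(size=(16, 8))                    # toy inputs
y = rng.integers(0, 2, size=16).astype(float)   # toy class labels
d_loss, f_loss = losses(x, y)
print(round(d_loss + f_loss, 12))  # exact negatives by construction: 0.0
```

In an actual training loop the two losses would be minimized by the discriminator and the extractor respectively (e.g. via a gradient-reversal layer); the sketch only shows how the opposing objectives are formed.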
“…When working with high-dimensional data, including images and text, this technique becomes more difficult. In order to achieve this, we employ generative adversarial networks [5], which offer a mapping from the latent space to the high-dimensional data. GANs' capacity to extract information from all types of image data has led to more promising findings in the segmentation of MRI data, such as when using MRI data to segment CT images [6].…”
Section: Introduction
confidence: 99%
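The key point in the excerpt above is that a GAN generator provides a mapping from a low-dimensional latent space to the high-dimensional data space. A minimal sketch of that mapping, assuming a toy linear generator with tanh output (the dimensions and weights are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A GAN generator G maps latent codes z to data-space samples.
# Here G is a single linear layer with tanh, purely for illustration.
latent_dim, data_dim = 2, 64          # e.g. 64 pixels of a tiny flattened image
W_g = rng.normal(size=(latent_dim, data_dim))

def generate(n):
    z = rng.normal(size=(n, latent_dim))  # sample from the latent prior
    return np.tanh(z @ W_g)               # map into high-dimensional data space

x_fake = generate(5)
print(x_fake.shape)  # (5, 64): low-dimensional codes become high-dim samples
```

A real GAN would stack nonlinear layers and train G against a discriminator; the sketch only shows the latent-to-data mapping the excerpt refers to.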