2021
DOI: 10.48550/arxiv.2110.15273
Preprint

OMASGAN: Out-of-Distribution Minimum Anomaly Score GAN for Sample Generation on the Boundary

Abstract: Generative models trained in an unsupervised manner may assign high likelihood and low reconstruction loss to Out-of-Distribution (OoD) samples. This increases Type II errors and leads to missed anomalies, overall decreasing Anomaly Detection (AD) performance. In addition, AD models underperform due to the rarity of anomalies. To address these limitations, we propose the OoD Minimum Anomaly Score GAN (OMASGAN). OMASGAN generates, in a negative data augmentation manner, anomalous samples on the estimated distribution…
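
For intuition, the following is a minimal PyTorch sketch of the negative data augmentation idea the abstract refers to: the discriminator is trained to label real data as real, while both generator samples and negative (OoD/boundary) samples are labelled fake. This is a hypothetical illustration under assumed interfaces (a discriminator disc returning one logit per sample; a placeholder source of negative samples), not the OMASGAN implementation.

import torch
import torch.nn.functional as F

def nda_discriminator_loss(disc, real, fake, neg):
    # real: in-distribution images; fake: generator output;
    # neg: negative-augmentation samples, e.g. from a boundary generator
    # or simple corrupting transformations (hypothetical placeholder).
    logits_real = disc(real)
    logits_fake = disc(fake.detach())
    logits_neg = disc(neg)
    # Real samples are pushed towards the "real" label, while both the
    # generated and the negative OoD samples are treated as "fake".
    return (
        F.binary_cross_entropy_with_logits(logits_real, torch.ones_like(logits_real))
        + F.binary_cross_entropy_with_logits(logits_fake, torch.zeros_like(logits_fake))
        + F.binary_cross_entropy_with_logits(logits_neg, torch.zeros_like(logits_neg))
    )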

Cited by 2 publications (4 citation statements)
References 14 publications
“…According to OMASGAN [9], NDA is not useful for all of the GAN backbones. Thus, we first apply the NDA on StyleGAN2 and find that NDA results in worse performance on StyleGAN2, as shown in Table 8.…”
Section: Results Using StyleGAN2 Backbone
confidence: 99%
“…NDA-GANs guide the discriminator to regard the out-of-distribution samples as "fake" instances to improve GANs training. A recent study, OMASGAN [7], shows that the performance of NDA varies on different datasets and backbones of GANs.…”
Section: Negative Data Augmentation (NDA)
confidence: 99%
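
As a follow-up to the statement above, here is a hypothetical sketch of how a discriminator trained to regard OoD samples as fake can be reused for anomaly detection: the negated realness logit acts as an anomaly score. The function names, the disc interface, and the threshold are illustrative assumptions, not part of OMASGAN or the citing papers.

import torch

@torch.no_grad()
def anomaly_score(disc, x):
    # Lower "realness" logit -> higher anomaly score.
    return -disc(x).reshape(-1)

def flag_anomalies(disc, x, threshold=0.0):
    # Flag samples whose anomaly score exceeds an illustrative threshold.
    return anomaly_score(disc, x) > threshold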
“…By adding NDA on different DE-GANs, the FID only achieves limited improvement or even deteriorates. This is because the performance of NDA varies on different datasets and backbones of GANs [7]. In this case, directly applying NDA in DE-GANs could produce part of the in-distribution samples on the 100-shot-Obama dataset with the StyleGAN2 backbone.…”
Section: Adaptive Negative Data Augmentation (ANDA)
confidence: 99%