2018 | Preprint
DOI: 10.48550/arxiv.1812.02463

Anomaly detection with Wasserstein GAN

Abstract: Generative adversarial networks (GANs) are a class of generative algorithms that have been widely used to produce state-of-the-art samples. In this paper, we investigate GANs for anomaly detection on time series data. To achieve this goal, a literature review is carried out covering the theoretical properties of GANs and their use for anomaly detection. A Wasserstein GAN is chosen to learn the representation of the normal data distribution, and an encoder stacked with the generator performs the anomaly detection.
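
The pipeline described in the abstract (a WGAN generator trained on normal data, with an encoder stacked on top for scoring) can be illustrated with a minimal sketch in PyTorch. The module names, layer sizes, and the reconstruction-based anomaly score below are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: a window of 64 time steps mapped to a 16-d latent code.
WINDOW, LATENT = 64, 16

class Generator(nn.Module):
    """Maps a latent code back to a time-series window (models the normal data)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT, 128), nn.ReLU(),
            nn.Linear(128, WINDOW),
        )
    def forward(self, z):
        return self.net(z)

class Encoder(nn.Module):
    """Stacked with the generator: maps a window into the latent space."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(WINDOW, 128), nn.ReLU(),
            nn.Linear(128, LATENT),
        )
    def forward(self, x):
        return self.net(x)

def anomaly_score(x, encoder, generator):
    """Reconstruction error of x through encoder + generator (assumed scoring rule).
    Windows far from the learned normal distribution reconstruct poorly,
    so a large score flags a potential anomaly."""
    with torch.no_grad():
        x_hat = generator(encoder(x))
    return torch.mean((x - x_hat) ** 2, dim=-1)
```

In practice a threshold on the score, chosen from held-out normal windows, decides what is flagged as anomalous.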

Cited by 5 publications (4 citation statements)
References 5 publications
“…GAN According to Dumoulin and Belghazi [21] and Uehara and Sato [22], existing GAN algorithms suffer from a vanishing gradient problem, which leads to instability and mode collapse due to the use of a predefined adversarial loss function. Haloui and Gupta [23] used a derived approximation to the Wasserstein distance to improve the original GAN gradient-based loss function. The improved GAN algorithm is called WGAN.…”
Section: Related Work
confidence: 99%
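
The Wasserstein loss referred to here replaces the original cross-entropy GAN objective with an estimate of the Wasserstein-1 distance between real and generated distributions; the critic must stay approximately Lipschitz, which the original WGAN enforces by weight clipping. The sketch below shows one training step; the model and optimizer objects are assumed to exist, and only the loss structure follows WGAN.

```python
import torch

def wgan_step(critic, generator, opt_c, opt_g, real, latent_dim, clip=0.01):
    # Critic: maximize E[critic(real)] - E[critic(fake)] (minimize the negative).
    z = torch.randn(real.size(0), latent_dim)
    fake = generator(z).detach()
    loss_c = -(critic(real).mean() - critic(fake).mean())
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()

    # Weight clipping keeps the critic approximately 1-Lipschitz, as in the original WGAN.
    for p in critic.parameters():
        p.data.clamp_(-clip, clip)

    # Generator: minimize -E[critic(generator(z))].
    z = torch.randn(real.size(0), latent_dim)
    loss_g = -critic(generator(z)).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_c.item(), loss_g.item()
```

In the original WGAN recipe the critic is updated several times per generator update and trained with RMSprop at a small learning rate; those details are omitted from this sketch.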
“…The generator generates images that closely resemble the original samples, while the discriminator distinguishes between real samples and generated ones. Building upon GAN, several variant networks have been introduced, such as AnoGAN [17], GANomaly [18], skip-GAN [19], WGAN [20], etc., all demonstrating excellent performance in anomaly detection. Nevertheless, the training process of GAN models often encounters convergence difficulties.…”
Section: Related Work
confidence: 99%
“…Unfortunately, GANs tend to learn the mass of the underlying multimodal distribution well while focusing less on the low-probability regions, i.e. the tails, and have discernible problems with mode collapse [20,19].…”
Section: Related Work: Boundary Generation
confidence: 99%
“…At present, generative models based on invertible residual networks, such as [14,15], are lacking for unsupervised anomaly detection [16,17]. Anomaly detection techniques show discernible limitations for detecting anomalies near the support of multimodal distributions [18,19].…”
Section: Introduction
confidence: 99%