2022
DOI: 10.1007/978-3-031-09342-5_13

TTS-GAN: A Transformer-Based Time-Series Generative Adversarial Network

Cited by 77 publications (41 citation statements) | References 14 publications
“…The data augmentation method artificially increases the amount of data by generating new data points from existing data, without requiring substantial additional training data; techniques include the synthetic minority oversampling technique (SMOTE) [41], transformers [42], autoencoders [43], and generative adversarial networks (GANs) [44]. We started experimenting with GANs for time-series data in [45]. Recently, we have also used a GAN product from Gretel ( , accessed on 25 December 2022) to generate synthetic data.…”
Section: Discussion
confidence: 99%
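For context, here is a minimal sketch of one of the augmentation methods listed above, SMOTE, using the imbalanced-learn library; the dataset is synthetic and purely illustrative, not from the cited work.

# A minimal sketch of classical oversampling with SMOTE; the data below is
# a hypothetical imbalanced dataset created only for illustration.
import numpy as np
from imblearn.over_sampling import SMOTE

rng = np.random.default_rng(0)
# Hypothetical imbalanced dataset: 100 majority vs. 10 minority samples.
X = np.vstack([rng.normal(0, 1, (100, 8)), rng.normal(3, 1, (10, 8))])
y = np.array([0] * 100 + [1] * 10)

# SMOTE interpolates between minority-class neighbours to create new points.
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
print(X_res.shape, np.bincount(y_res))  # classes are balanced after resampling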
“…Further, Transformer-based GANs [73], [74] were proposed and showed robustness in training. On the other hand, a few works have considered generating sequences with GANs, including RNNs with reinforcement training [75], [76] and Transformer-based generators for text generation [77], [78], even for long-sequence generation [79], [80].…”
Section: B. Generative Adversarial Network
confidence: 99%
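To make the Transformer-based generator idea concrete, below is a minimal PyTorch sketch of a generator that maps a noise vector to a multichannel sequence via self-attention, in the spirit of TTS-GAN; the layer sizes and the latent-to-sequence projection are illustrative assumptions, not the paper's exact architecture.

# A minimal sketch of a Transformer-based time-series generator (assumed
# sizes, not the published TTS-GAN configuration).
import torch
import torch.nn as nn

class TransformerGenerator(nn.Module):
    def __init__(self, latent_dim=100, seq_len=150, channels=3, d_model=64):
        super().__init__()
        self.seq_len, self.d_model = seq_len, d_model
        # Project the noise vector to one embedding per time step.
        self.input_proj = nn.Linear(latent_dim, seq_len * d_model)
        self.pos_emb = nn.Parameter(torch.randn(1, seq_len, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                           dim_feedforward=256, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=3)
        self.output_proj = nn.Linear(d_model, channels)

    def forward(self, z):                    # z: (batch, latent_dim)
        x = self.input_proj(z).view(-1, self.seq_len, self.d_model)
        x = self.encoder(x + self.pos_emb)   # self-attention over time steps
        return self.output_proj(x)           # (batch, seq_len, channels)

z = torch.randn(4, 100)
print(TransformerGenerator()(z).shape)       # torch.Size([4, 150, 3])

Because self-attention connects every pair of time steps directly, gradients do not have to flow through a long recurrent chain, which is the robustness argument made for Transformer-based GANs above.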
“…TSG models have been developed to overcome insufficient training data due, for example, to acquisition difficulties or strict privacy constraints (Li et al., 2015; Esteban et al., 2017; Yoon et al., 2019; Ni et al., 2020a; Smith and Smith, 2020; Li et al., 2022; Zha, 2022). Current mainstream TSG methods consist of combining RNNs with the GAN architecture, as done in RCGAN (Esteban et al., 2017), TimeGAN (Yoon et al., 2019), and SigCWGAN (Ni et al., 2020a).…”
Section: Introduction
confidence: 99%
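For comparison, here is a minimal sketch of the mainstream RNN-plus-GAN recipe mentioned above (the RCGAN/TimeGAN family): a GRU generator that turns per-step noise into a synthetic sequence. Dimensions are illustrative assumptions, not any cited model's settings.

# A minimal sketch of an RNN-based sequence generator in the RCGAN/TimeGAN
# style; sizes are placeholders chosen for illustration.
import torch
import torch.nn as nn

class RNNGenerator(nn.Module):
    def __init__(self, noise_dim=32, hidden_dim=64, channels=3):
        super().__init__()
        self.gru = nn.GRU(noise_dim, hidden_dim, num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden_dim, channels)

    def forward(self, z):            # z: (batch, seq_len, noise_dim)
        h, _ = self.gru(z)           # recurrent state carries temporal context
        return torch.tanh(self.out(h))

z = torch.randn(4, 150, 32)
print(RNNGenerator()(z).shape)       # torch.Size([4, 150, 3])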
“…However, such methods are unable to effectively produce long sequences, given the limitation of the RNN model in keeping track of temporal dependencies between inputs that are distant in time (Vaswani et al., 2017). This problem is tackled by Li et al. (2022) with TTS-CGAN (Transformer Time-Series Conditional GAN), where the RNN model is replaced by a transformer model (Vaswani et al., 2017). The challenges of training GANs, such as non-convergence, mode collapse, generator-discriminator unbalance, and sensitive hyperparameter selection, however, still remain.…”
Section: Introduction
confidence: 99%
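To illustrate where those training challenges arise, below is a minimal sketch of one alternating GAN update with placeholder toy models; it is a generic adversarial step, not the cited papers' training code.

# A minimal sketch of one alternating GAN update, the step where
# non-convergence, mode collapse, and generator-discriminator unbalance
# show up in practice. G, D, and the data are placeholders.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 8))  # toy generator
D = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 1))   # toy discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(32, 8)            # stand-in for a batch of real samples
z = torch.randn(32, 16)

# Discriminator step: push D(real) toward 1 and D(fake) toward 0.
fake = G(z).detach()                 # detach so this step updates only D
loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator step: try to fool D. If D wins too easily, the gradient
# reaching G vanishes, one source of the unbalance noted above.
loss_g = bce(D(G(z)), torch.ones(32, 1))
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
print(float(loss_d), float(loss_g))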