2022
DOI: 10.48550/arxiv.2205.11164
Preprint
Time-series Transformer Generative Adversarial Networks

Abstract: Many real-world tasks are plagued by limitations on data: in some instances very little data is available, and in others, data is protected by privacy-enforcing regulations (e.g. GDPR). We consider limitations posed specifically on time-series data and present a model that can generate synthetic time-series which can be used in place of real data. A model that generates synthetic time-series data has two objectives: 1) to capture the stepwise conditional distribution of real sequences, and 2) to faithfully mode…

Cited by 4 publications (3 citation statements)
References 30 publications
“…Xu Z. et al. [9] and Yu Y. et al. [10] proposed combining GAN and LSTM to achieve time-series prediction; Yoon J. et al. [11] proposed exploiting the strength of GANs at modelling generative distributions for data generation, constructing the TimeGAN network for time-series data prediction. On this basis, Srinivasan P. et al. [12] added a Transformer to TimeGAN to improve the generation accuracy of GAN networks. In view of the above problems and previous research, and with reference to Xu Z.'s model-combination method and Yoon J.'s TimeGAN network structure, this paper uses the GAN network module as the generation framework for mutant data and integrates wavelet-transform data processing to achieve time-series decomposition, simplifying complex data and building a combined network that processes different types of data, so as to address the problem that the deflection data composition is complex and difficult to predict.…”
Section: Related Work
confidence: 99%
“…PSA-GAN (Jeha et al 2021) uses self-attention for long time-series generation and presents Context-FID, inspired by the FID score (Heusel et al 2017). TsT-GAN (Srinivasan and Knottenbelt 2022) integrates transformers with GANs. VAE-based methods include Variational Recurrent AutoEncoder (Fabius and Van Amersfoort 2014) for classical songs, Stochastic WaveNet (Lai et al 2018b) for adaptive prior distribution learning, and Stochastic TCN (Aksan and Hilliges 2019) merging ELBO with TCN (Bai, Kolter, and Koltun 2018).…”
Section: Time-series Generation and Issues
confidence: 99%
“…Li X. et al. (2022) successfully designed the generator and discriminator with transformers to synthesize long-sequence time-series signals. Srinivasan and Knottenbelt (2022) proposed TsT-GAN to solve the problem of errors accumulating over time when synthesizing temporal features. This model can accurately simulate the joint distribution of the entire time-series, and the generated time-series can be used in place of real data. A 2023 study also combined convolutional networks and transformers in adversarial training to preserve both global and local temporal features in time-series generation.…”
Section: Related Work
confidence: 99%