PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series
Abstract
Realistic synthetic time series data of sufficient length enables practical applications in time series modeling tasks, such as forecasting, but remains a challenge. In this paper we present PSA-GAN, a generative adversarial network (GAN) that generates long, high-quality time series samples using progressive growing of GANs and self-attention. We show that PSA-GAN can reduce the error in several downstream forecasting tasks over baselines that use only real data. We also introduce Context-FID, a Fréchet Inception Distance-like score for assessing the quality of synthetic time series samples, and find that it is indicative of downstream performance. Context-FID could therefore be a useful tool for developing time series GAN models.
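The Context-FID score follows the Fréchet Inception Distance recipe: embed real and synthetic series, fit a Gaussian to each set of embeddings, and compute the Fréchet distance between the two Gaussians. Below is a minimal sketch of that final distance computation; the embedding model is assumed to exist separately (the paper uses a time series representation-learning encoder, which is not reproduced here), so the function simply takes two embedding arrays of shape `(n_samples, dim)`.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(emb_real, emb_synth):
    """FID-style score between two embedding sets, each modeled as a Gaussian.

    d^2 = ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 * (S1 @ S2)^(1/2))
    """
    mu1, mu2 = emb_real.mean(axis=0), emb_synth.mean(axis=0)
    sigma1 = np.cov(emb_real, rowvar=False)
    sigma2 = np.cov(emb_synth, rowvar=False)
    covmean = sqrtm(sigma1 @ sigma2)
    # sqrtm can return tiny imaginary parts from numerical error; drop them.
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

A lower score means the synthetic embeddings match the real ones more closely; comparing a set of embeddings against itself yields a score near zero.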
Cite
Text
Jeha et al. "PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series." International Conference on Learning Representations, 2022.

Markdown

[Jeha et al. "PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/jeha2022iclr-psagan/)

BibTeX
@inproceedings{jeha2022iclr-psagan,
title = {{PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series}},
author = {Jeha, Paul and Bohlke-Schneider, Michael and Mercado, Pedro and Kapoor, Shubham and Nirwan, Rajbir Singh and Flunkert, Valentin and Gasthaus, Jan and Januschowski, Tim},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://mlanthology.org/iclr/2022/jeha2022iclr-psagan/}
}