C-GATS: Conditional Generation of Anomalous Time Series

Abstract

Sparsity of the data needed to learn about anomalies is a key challenge when training deep supervised models for Anomaly Detection (AD). Generating synthetic data by applying pre-determined transformations that conform to a set of known invariances has been shown to improve the performance of such deep models. In this work we present C-GATS to show that one can learn a much larger invariance space from the available sparse data by training a conditional generative model to perform Data Augmentation (DA) for anomalous Time Series (TS) in a model-agnostic way. In particular, we factorize an anomalous TS sequence into three attributes (the normal sub-sequence, the anomalous sub-sequence, and the position of the anomaly) and model each of them separately. This factorization exploits samples from the dominant class, i.e., normal TS, to train a generative model for the sparse class, i.e., anomalous TS. We provide an exhaustive study showing that C-GATS not only learns to generate different types of anomalies (e.g., point anomalies and level shifts) but also that these generated anomalies improve the performance of multiple SOTA TS AD models on a set of popular public TS AD benchmark datasets.
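
The abstract's factorization can be illustrated with a minimal sketch: given a normal sub-sequence, an anomalous sub-sequence, and an anomaly position, the three attributes are composed into a synthetic anomalous series. The function and data below are hypothetical illustrations of this composition step, not the authors' implementation; in C-GATS the anomalous sub-sequence and position would come from learned conditional generative models rather than the hand-crafted level shift used here.

```python
import numpy as np

def compose_anomalous_series(normal_seq, anomaly_seq, position):
    """Overlay an anomalous sub-sequence onto a normal series at a given position.

    Hypothetical illustration of the (normal sub-sequence, anomalous
    sub-sequence, anomaly position) factorization described in the abstract.
    """
    out = normal_seq.copy()
    end = position + len(anomaly_seq)
    out[position:end] = anomaly_seq
    return out

# Example: inject a level-shift anomaly into a noisy sine wave.
rng = np.random.default_rng(0)
normal = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.05 * rng.standard_normal(200)
level_shift = normal[80:110] + 2.0          # anomalous sub-sequence: shifted segment
synthetic = compose_anomalous_series(normal, level_shift, position=80)
```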

Cite

Text

Singh et al. "C-GATS: Conditional Generation of Anomalous Time Series." NeurIPS 2022 Workshops: SyntheticData4ML, 2022.

Markdown

[Singh et al. "C-GATS: Conditional Generation of Anomalous Time Series." NeurIPS 2022 Workshops: SyntheticData4ML, 2022.](https://mlanthology.org/neuripsw/2022/singh2022neuripsw-cgats/)

BibTeX

@inproceedings{singh2022neuripsw-cgats,
  title     = {{C-GATS: Conditional Generation of Anomalous Time Series}},
  author    = {Singh, Vikramank and Sankararaman, Abishek and Balakrishnan, Murali and Song, Zhao},
  booktitle = {NeurIPS 2022 Workshops: SyntheticData4ML},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/singh2022neuripsw-cgats/}
}