Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation

Abstract

We propose a self-supervised method for pre-training universal time series representations that learns contrastive representations via similarity distillation along the temporal and instance dimensions. We analyze the effectiveness of each dimension, and evaluate our pre-trained representations on three downstream tasks: time series classification, anomaly detection, and forecasting.

Cite

Text

Hajimoradlou et al. "Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation." ICML 2022 Workshops: Pre-Training, 2022.

Markdown

[Hajimoradlou et al. "Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation." ICML 2022 Workshops: Pre-Training, 2022.](https://mlanthology.org/icmlw/2022/hajimoradlou2022icmlw-selfsupervised/)

BibTeX

@inproceedings{hajimoradlou2022icmlw-selfsupervised,
  title     = {{Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation}},
  author    = {Hajimoradlou, Ainaz and Pishdad, Leila and Tung, Frederick and Karpusha, Maryna},
  booktitle = {ICML 2022 Workshops: Pre-Training},
  year      = {2022},
  url       = {https://mlanthology.org/icmlw/2022/hajimoradlou2022icmlw-selfsupervised/}
}