Soft Contrastive Learning for Time Series

Abstract

Contrastive learning has been shown to be effective for learning representations from time series in a self-supervised way. However, contrasting similar time series instances, or values from adjacent timestamps within a time series, ignores their inherent correlations, which degrades the quality of the learned representations. To address this issue, we propose *SoftCLT*, a simple yet effective soft contrastive learning strategy for time series. This is achieved by introducing instance-wise and temporal contrastive losses with soft assignments ranging from zero to one. Specifically, we define soft assignments for 1) the instance-wise contrastive loss by the distance between time series in the data space, and 2) the temporal contrastive loss by the difference of timestamps. SoftCLT is a plug-and-play method for time series contrastive learning that improves the quality of learned representations without bells and whistles. In experiments, we demonstrate that SoftCLT consistently improves performance on various downstream tasks including classification, semi-supervised learning, transfer learning, and anomaly detection, achieving state-of-the-art performance. Code is available at this repository: https://github.com/seunghan96/softclt.
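
To make the idea of soft assignments concrete, below is a minimal NumPy sketch, not the authors' implementation (see the linked repository for that). It maps pairwise distances through a sigmoid so that weights fall in [0, 1]: for the instance-wise case, a data-space distance between series; for the temporal case, the gap between timestamps. The parameter names `tau_inst`, `tau_temp`, and `alpha`, and the use of Euclidean distance, are illustrative assumptions; the paper also considers alignment-based distances such as DTW.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def instance_soft_assignments(X, tau_inst=0.5, alpha=0.5):
    """Soft assignments between time series instances.

    Maps pairwise data-space distances to weights in [0, 1]:
    similar series receive weights near 2 * alpha, distant series
    near 0. Euclidean distance is used here for brevity; self-pairs
    are typically assigned a hard weight of 1 in practice.
    """
    n = X.shape[0]
    flat = X.reshape(n, -1)
    # Pairwise Euclidean distances between flattened series.
    D = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)
    return 2.0 * alpha * sigmoid(-tau_inst * D)

def temporal_soft_assignments(T, tau_temp=1.0):
    """Soft assignments between timestamps within one series.

    Identical timestamps get weight 1; the weight decays smoothly
    toward 0 as the timestamp gap grows.
    """
    t = np.arange(T)
    gap = np.abs(t[:, None] - t[None, :])
    return 2.0 * sigmoid(-tau_temp * gap)

# Toy usage: 4 univariate series of length 50.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 50))
w_inst = instance_soft_assignments(X)   # shape (4, 4)
w_temp = temporal_soft_assignments(50)  # shape (50, 50)
```

These weights would then replace the hard 0/1 positive/negative labels in a standard contrastive loss, so that "near" pairs contribute partial positive signal instead of being treated as pure negatives.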

Cite

Text

Lee et al. "Soft Contrastive Learning for Time Series." International Conference on Learning Representations, 2024.

Markdown

[Lee et al. "Soft Contrastive Learning for Time Series." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/lee2024iclr-soft/)

BibTeX

@inproceedings{lee2024iclr-soft,
  title     = {{Soft Contrastive Learning for Time Series}},
  author    = {Lee, Seunghan and Park, Taeyoung and Lee, Kibok},
  booktitle = {International Conference on Learning Representations},
  year      = {2024},
  url       = {https://mlanthology.org/iclr/2024/lee2024iclr-soft/}
}