Self-Supervised Learning of Disentangled Representations for Multivariate Time-Series

Abstract

Multivariate time-series data in fields like healthcare and industry are informative but challenging to model due to high dimensionality and a lack of labels. Recent self-supervised learning methods excel at learning rich representations without labels but struggle to produce disentangled embeddings and are prone to inductive biases such as transformation-invariance. To address these challenges, we introduce TimeDRL, a framework for multivariate time-series representation learning with dual-level disentangled embeddings. TimeDRL features: (i) disentangled timestamp-level and instance-level embeddings obtained via a [CLS] token strategy; (ii) timestamp-predictive and instance-contrastive pretext tasks for representation learning; and (iii) avoidance of augmentation methods to eliminate inductive biases. Experiments on forecasting and classification datasets show that TimeDRL outperforms existing methods, with further validation in semi-supervised settings with limited labeled data.
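The [CLS] token strategy in point (i) can be sketched in a few lines: a learnable [CLS] vector is prepended to the per-timestamp embeddings so that a single encoder pass yields both an instance-level embedding (the [CLS] position) and timestamp-level embeddings (the remaining positions). The sketch below is illustrative only, assuming a toy linear tokenizer and a stand-in linear encoder; the names and shapes are hypothetical, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

T, C, D = 24, 7, 16  # timestamps, channels, embedding dimension (illustrative)

x = rng.normal(size=(T, C))          # one multivariate time-series sample
W_in = rng.normal(size=(C, D))       # toy linear "tokenizer" per timestamp
cls_token = rng.normal(size=(1, D))  # learnable [CLS] token (random stand-in)

# Prepend [CLS] to the timestamp tokens: sequence length becomes T + 1.
tokens = np.concatenate([cls_token, x @ W_in], axis=0)  # shape (T + 1, D)

# Stand-in for a sequence encoder; any sequence-to-sequence map suffices
# to illustrate the dual-level disentanglement.
W_enc = rng.normal(size=(D, D))
encoded = tokens @ W_enc

instance_emb = encoded[0]    # instance-level embedding, shape (D,)
timestamp_emb = encoded[1:]  # timestamp-level embeddings, shape (T, D)

print(instance_emb.shape, timestamp_emb.shape)
```

In this scheme the instance-contrastive task would operate on `instance_emb`, while the timestamp-predictive task would operate on `timestamp_emb`, keeping the two granularities disentangled.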

Cite

Text

Chang et al. "Self-Supervised Learning of Disentangled Representations for Multivariate Time-Series." NeurIPS 2024 Workshops: SSL, 2024.

Markdown

[Chang et al. "Self-Supervised Learning of Disentangled Representations for Multivariate Time-Series." NeurIPS 2024 Workshops: SSL, 2024.](https://mlanthology.org/neuripsw/2024/chang2024neuripsw-selfsupervised/)

BibTeX

@inproceedings{chang2024neuripsw-selfsupervised,
  title     = {{Self-Supervised Learning of Disentangled Representations for Multivariate Time-Series}},
  author    = {Chang, Ching and Chan, Chiao-Tung and Wang, Wei-Yao and Peng, Wen-Chih and Chen, Tien-Fu},
  booktitle = {NeurIPS 2024 Workshops: SSL},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/chang2024neuripsw-selfsupervised/}
}