SimPer: Simple Self-Supervised Learning of Periodic Targets
Abstract
From human physiology to environmental evolution, important processes in nature often exhibit meaningful and strong periodic or quasi-periodic changes. Due to their inherent label scarcity, learning useful representations for periodic tasks with limited or no supervision is of great benefit. Yet, existing self-supervised learning (SSL) methods overlook the intrinsic periodicity in data, and fail to learn representations that capture periodic or frequency attributes. In this paper, we present SimPer, a simple contrastive SSL regime for learning periodic information in data. To exploit the periodic inductive bias, SimPer introduces customized augmentations, feature similarity measures, and a generalized contrastive loss for learning efficient and robust periodic representations. Extensive experiments on common real-world tasks in human behavior analysis, environmental sensing, and healthcare domains verify the superior performance of SimPer compared to state-of-the-art SSL methods, highlighting its intriguing properties including better data efficiency, robustness to spurious correlations, and generalization to distribution shifts.
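The three ingredients named in the abstract (speed-based augmentations, a periodicity-aware feature similarity, and a generalized contrastive loss) can be pictured with a short sketch. The snippet below is a minimal illustration under our own assumptions, not the authors' released implementation: the names `speed_augment`, `periodic_similarity`, and `simper_loss`, the use of maximum circular cross-correlation as the similarity, and the speed-distance soft targets are illustrative choices for 1-D periodic signals.

```python
# Minimal sketch of the ideas described above (not the authors' released code).
# Assumptions: inputs are 1-D periodic signals, the "encoder" keeps a temporal
# feature axis, and all function names here are introduced for illustration.

import math
import torch
import torch.nn.functional as F


def speed_augment(x: torch.Tensor, speed: float) -> torch.Tensor:
    """Resample a 1-D signal of length T so its apparent frequency scales by `speed`."""
    t = x.shape[-1]
    y = F.interpolate(x.view(1, 1, t), scale_factor=1.0 / speed,
                      mode="linear", align_corners=False)
    y = y[..., :t]                       # crop if slowed down
    if y.shape[-1] < t:                  # zero-pad if sped up (simplification)
        y = F.pad(y, (0, t - y.shape[-1]))
    return y.reshape(t)


def periodic_similarity(z1: torch.Tensor, z2: torch.Tensor) -> torch.Tensor:
    """Shift-invariant similarity: maximum normalized circular cross-correlation."""
    z1 = (z1 - z1.mean()) / (z1.std() + 1e-8)
    z2 = (z2 - z2.mean()) / (z2.std() + 1e-8)
    xcorr = torch.fft.irfft(torch.fft.rfft(z1) * torch.conj(torch.fft.rfft(z2)),
                            n=z1.shape[-1])
    return xcorr.max() / z1.shape[-1]


def simper_loss(feats: torch.Tensor, speeds: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """Generalized contrastive loss over M speed-augmented views of one instance.

    feats:  (M, T) temporal features, one row per augmented view.
    speeds: (M,)   speed factors used to create the views; views with closer
                   speeds receive larger (softer) target probabilities.
    """
    m = feats.shape[0]
    sims = torch.stack([torch.stack([periodic_similarity(feats[i], feats[j])
                                     for j in range(m)]) for i in range(m)])
    eye = torch.eye(m, dtype=torch.bool)
    # Exclude self-comparisons with large finite constants (avoids inf/NaN issues).
    logits = (sims / tau).masked_fill(eye, -1e4)
    dist = torch.cdist(speeds[:, None], speeds[:, None]).masked_fill(eye, 1e4)
    targets = F.softmax(-dist / tau, dim=-1)   # continuity-aware soft labels
    return F.cross_entropy(logits, targets)


# Toy usage: an identity "encoder" on speed-augmented sine waves.
if __name__ == "__main__":
    t = torch.arange(300, dtype=torch.float32)
    x = torch.sin(2 * math.pi * 0.02 * t)
    speeds = torch.tensor([0.5, 1.0, 1.5, 2.0])
    views = torch.stack([speed_augment(x, s.item()) for s in speeds])
    print(simper_loss(views, speeds))
```

The design intuition this sketch tries to capture: resampling a signal at speed s rescales its underlying frequency by s, so the relative ordering of speeds provides label-like structure that the soft targets of the generalized contrastive loss can exploit.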
Cite
Text
Yang et al. "SimPer: Simple Self-Supervised Learning of Periodic Targets." International Conference on Learning Representations, 2023.Markdown
[Yang et al. "SimPer: Simple Self-Supervised Learning of Periodic Targets." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/yang2023iclr-simper/)BibTeX
@inproceedings{yang2023iclr-simper,
  title = {{SimPer: Simple Self-Supervised Learning of Periodic Targets}},
  author = {Yang, Yuzhe and Liu, Xin and Wu, Jiang and Borac, Silviu and Katabi, Dina and Poh, Ming-Zher and McDuff, Daniel},
  booktitle = {International Conference on Learning Representations},
  year = {2023},
  url = {https://mlanthology.org/iclr/2023/yang2023iclr-simper/}
}