CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients
Abstract
The healthcare industry generates troves of unlabelled physiological data. This data can be exploited via contrastive learning, a self-supervised pre-training method that encourages representations of instances to be similar to one another. We propose a family of contrastive learning methods, CLOCS, that encourages representations across space, time, and patients to be similar to one another. We show that CLOCS consistently outperforms the state-of-the-art methods, BYOL and SimCLR, when performing a linear evaluation of, and fine-tuning on, downstream tasks. We also show that CLOCS achieves strong generalization performance with only 25% of labelled training data. Furthermore, our training procedure naturally generates patient-specific representations that can be used to quantify patient-similarity.
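To make the core idea concrete, the sketch below implements a generic contrastive objective in the spirit the abstract describes: two views of the same patient's signal (e.g., segments from different times or leads) are treated as a positive pair and pulled together, while views from different patients are pushed apart. This is an illustrative NT-Xent-style loss, not the authors' implementation; the function name and shapes are assumptions for the example.

```python
import numpy as np

def contrastive_patient_loss(emb_a, emb_b, temperature=0.1):
    """Illustrative CLOCS-style contrastive loss (not the authors' code).

    emb_a[i] and emb_b[i] are embeddings of two views of patient i
    (e.g., temporally adjacent segments or different ECG leads).
    Matching rows are positives; all other cross-patient pairs are negatives.
    """
    # L2-normalise so dot products are cosine similarities
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    sim = a @ b.T / temperature  # (N, N) similarity matrix

    # Softmax cross-entropy where the diagonal (same patient) is the target
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

The loss is low when each patient's two views are more similar to each other than to other patients' views, which is what yields the patient-specific representations the abstract mentions.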
Cite
Text
Kiyasseh et al. "CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients." International Conference on Machine Learning, 2021.
Markdown
[Kiyasseh et al. "CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/kiyasseh2021icml-clocs/)
BibTeX
@inproceedings{kiyasseh2021icml-clocs,
title = {{CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients}},
author = {Kiyasseh, Dani and Zhu, Tingting and Clifton, David A},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {5606--5615},
volume = {139},
url = {https://mlanthology.org/icml/2021/kiyasseh2021icml-clocs/}
}