CURL: Contrastive Unsupervised Representations for Reinforcement Learning

Abstract

We present CURL: Contrastive Unsupervised Representations for Reinforcement Learning. CURL extracts high-level features from raw pixels using contrastive learning and performs off-policy control on top of the extracted features. CURL outperforms prior pixel-based methods, both model-based and model-free, on complex tasks in the DeepMind Control Suite and Atari Games, showing 1.9x and 1.2x performance gains at the 100K environment and interaction steps benchmarks, respectively. On the DeepMind Control Suite, CURL is the first image-based algorithm to nearly match the sample-efficiency of methods that use state-based features. Our code is open-sourced and available at https://www.github.com/MishaLaskin/curl.
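The contrastive objective at the heart of CURL is a bilinear InfoNCE loss: each anchor embedding (from one augmented crop of an observation) is scored against every positive embedding in the batch (from a second crop, produced by a momentum-averaged key encoder), and a softmax cross-entropy pulls the matching pair together. The sketch below, in plain NumPy rather than the paper's PyTorch, shows only the loss computation; the function name and toy data are illustrative, and the encoders and momentum update are omitted.

```python
import numpy as np

def curl_infonce_loss(z_a, z_pos, W):
    """Bilinear InfoNCE loss (sketch). z_a: (B, D) anchor embeddings,
    z_pos: (B, D) positive embeddings, W: (D, D) learned bilinear matrix.
    For each anchor, the matching positive is the same batch index."""
    logits = z_a @ W @ z_pos.T                    # (B, B) similarity scores
    logits -= logits.max(axis=1, keepdims=True)   # subtract row max for stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(z_a.shape[0])                 # positives lie on the diagonal
    return -log_probs[idx, idx].mean()            # cross-entropy over the batch

# Toy usage: positives are noisy copies of anchors, as two augmented views
# of the same observation would be after encoding.
rng = np.random.default_rng(0)
B, D = 8, 16
z_a = rng.normal(size=(B, D))
z_pos = z_a + 0.05 * rng.normal(size=(B, D))
W = np.eye(D)                                     # identity stands in for the learned W
loss = curl_infonce_loss(z_a, z_pos, W)
```

In the paper, gradients from this loss flow only into the anchor (query) encoder, while the key encoder is an exponential moving average of its weights, in the style of momentum contrast.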

Cite

Text

Laskin et al. "CURL: Contrastive Unsupervised Representations for Reinforcement Learning." International Conference on Machine Learning, 2020.

Markdown

[Laskin et al. "CURL: Contrastive Unsupervised Representations for Reinforcement Learning." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/laskin2020icml-curl/)

BibTeX

@inproceedings{laskin2020icml-curl,
  title     = {{CURL: Contrastive Unsupervised Representations for Reinforcement Learning}},
  author    = {Laskin, Michael and Srinivas, Aravind and Abbeel, Pieter},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {5639--5650},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/laskin2020icml-curl/}
}