Out-of-Distribution Generalization via Risk Extrapolation (REx)

Abstract

Distributional shift is one of the major obstacles when transferring machine learning prediction systems from the lab to the real world. To tackle this problem, we assume that variation across training domains is representative of the variation we might encounter at test time, but also that shifts at test time may be more extreme in magnitude. In particular, we show that reducing differences in risk across training domains can reduce a model’s sensitivity to a wide range of extreme distributional shifts, including the challenging setting where the input contains both causal and anti-causal elements. We motivate this approach, Risk Extrapolation (REx), as a form of robust optimization over a perturbation set of extrapolated domains (MM-REx), and propose a penalty on the variance of training risks (V-REx) as a simpler variant. We prove that variants of REx can recover the causal mechanisms of the targets, while also providing robustness to changes in the input distribution (“covariate shift”). By appropriately trading off robustness to causally induced distributional shifts against robustness to covariate shift, REx is able to outperform alternative methods such as Invariant Risk Minimization in situations where these types of shift co-occur.
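The V-REx variant mentioned in the abstract penalizes the variance of per-domain training risks on top of the average risk. The sketch below is a minimal, illustrative PyTorch-style rendering of that idea under the assumption that per-domain risks are already computed; the function name `vrex_loss` and the default penalty weight `beta` are hypothetical choices, not taken from the authors' code.

```python
import torch

def vrex_loss(per_domain_risks: torch.Tensor, beta: float = 10.0) -> torch.Tensor:
    """Sketch of a V-REx-style objective: mean training risk plus a penalty
    on the variance of risks across training domains."""
    # per_domain_risks: 1-D tensor with the average loss of the same model
    # evaluated separately on each training domain.
    mean_risk = per_domain_risks.mean()
    # Population variance of the per-domain risks.
    risk_variance = ((per_domain_risks - mean_risk) ** 2).mean()
    return mean_risk + beta * risk_variance
```

In training, `per_domain_risks` would be obtained by evaluating the model's loss on a batch from each domain; a larger `beta` more strongly pushes the model toward equal risks across domains, at the cost of average-risk minimization.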

Cite

Text

Krueger et al. "Out-of-Distribution Generalization via Risk Extrapolation (REx)." International Conference on Machine Learning, 2021.

Markdown

[Krueger et al. "Out-of-Distribution Generalization via Risk Extrapolation (REx)." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/krueger2021icml-outofdistribution/)

BibTeX

@inproceedings{krueger2021icml-outofdistribution,
  title     = {{Out-of-Distribution Generalization via Risk Extrapolation (REx)}},
  author    = {Krueger, David and Caballero, Ethan and Jacobsen, Joern-Henrik and Zhang, Amy and Binas, Jonathan and Zhang, Dinghuai and Le Priol, Remi and Courville, Aaron},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {5815--5826},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/krueger2021icml-outofdistribution/}
}