Know Your Limits: Uncertainty Estimation with ReLU Classifiers Fails at Reliable OOD Detection
Abstract
A crucial requirement for reliable deployment of deep learning models in safety-critical applications is the ability to identify out-of-distribution (OOD) data points, i.e., samples that differ from the training data and on which a model might underperform. Previous work has attempted to tackle this problem using uncertainty estimation techniques. However, there is empirical evidence that a large family of these techniques does not reliably detect OOD samples in classification tasks. This paper gives a theoretical explanation for these experimental findings and illustrates it on synthetic data. We prove that such techniques are not able to reliably identify OOD samples in a classification setting, since their level of confidence is generalized to unseen areas of the feature space. This result stems from the interplay between the representation of ReLU networks as piece-wise affine transformations, the saturating nature of activation functions like softmax, and the most widely used uncertainty metrics.
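The central claim, that ReLU classifiers can become arbitrarily confident far away from the training data, can be illustrated numerically. Below is a minimal NumPy sketch, not taken from the paper: the two-layer network, its random weights, and the chosen ray direction are illustrative assumptions. It shows how the maximum softmax probability of a small ReLU network typically saturates toward 1 along a ray leaving the data region, which is why softmax confidence alone cannot flag such points as OOD.

```python
# Illustrative sketch (not the paper's experiment): a ReLU network is
# piece-wise affine, so its logits grow linearly along rays leaving the
# training data, and softmax saturates, yielding high "confidence" far away.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical 2-layer ReLU classifier with random weights (2 inputs, 2 classes).
W1 = rng.normal(size=(2, 16))
b1 = rng.normal(size=16)
W2 = rng.normal(size=(16, 2))
b2 = rng.normal(size=2)

def predict(x):
    return softmax(relu(x @ W1 + b1) @ W2 + b2)

# Walk along a ray away from the origin, i.e. far from any "training" region.
direction = np.array([1.0, 1.0]) / np.sqrt(2.0)
for scale in [1, 10, 100, 1000]:
    p = predict(scale * direction[None, :])
    print(f"||x|| = {scale:5d}  max softmax probability = {p.max():.4f}")

# Typically the maximum class probability approaches 1.0 as ||x|| grows,
# so softmax confidence alone cannot identify these points as OOD.
```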
Cite
Text
Ulmer and Cinà. "Know Your Limits: Uncertainty Estimation with ReLU Classifiers Fails at Reliable OOD Detection." Uncertainty in Artificial Intelligence, 2021.

Markdown
[Ulmer and Cinà. "Know Your Limits: Uncertainty Estimation with ReLU Classifiers Fails at Reliable OOD Detection." Uncertainty in Artificial Intelligence, 2021.](https://mlanthology.org/uai/2021/ulmer2021uai-know/)

BibTeX
@inproceedings{ulmer2021uai-know,
  title     = {{Know Your Limits: Uncertainty Estimation with ReLU Classifiers Fails at Reliable OOD Detection}},
  author    = {Ulmer, Dennis and Cinà, Giovanni},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2021},
  pages     = {1766--1776},
  volume    = {161},
  url       = {https://mlanthology.org/uai/2021/ulmer2021uai-know/}
}