Enhancing the Reliability of Out-of-Distribution Image Detection in Neural Networks

Abstract

We consider the problem of detecting out-of-distribution images in neural networks. We propose ODIN, a simple and effective method that does not require any change to a pre-trained neural network. Our method is based on the observation that using temperature scaling and adding small perturbations to the input can separate the softmax score distributions of in- and out-of-distribution images, allowing for more effective detection. We show in a series of experiments that ODIN is compatible with diverse network architectures and datasets. It consistently outperforms the baseline approach by a large margin, establishing a new state-of-the-art performance on this task. For example, ODIN reduces the false positive rate from the baseline 34.7% to 4.3% on the DenseNet (applied to CIFAR-10 and Tiny-ImageNet) when the true positive rate is 95%.
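The two ingredients named in the abstract, temperature scaling of the logits and a small input perturbation that raises the predicted class's softmax score, can be sketched in a few lines of PyTorch. The snippet below is an illustrative sketch rather than the authors' released implementation: `model` is assumed to be any pre-trained classifier that returns logits, and the `temperature` and `epsilon` defaults are placeholder values (the paper tunes these per architecture and dataset).

import torch
import torch.nn.functional as F


def odin_score(model, x, temperature=1000.0, epsilon=0.0014):
    # Compute an ODIN-style confidence score for a batch of images `x`.
    model.eval()
    x = x.clone().detach().requires_grad_(True)

    # 1. Temperature scaling: divide the logits by T before the softmax.
    logits = model(x) / temperature
    log_probs = F.log_softmax(logits, dim=1)
    predicted = log_probs.argmax(dim=1)

    # 2. Input preprocessing: take a small step that increases the
    #    temperature-scaled softmax score of the predicted class.
    neg_log_score = -log_probs.gather(1, predicted.unsqueeze(1)).sum()
    neg_log_score.backward()
    x_perturbed = (x - epsilon * x.grad.sign()).detach()

    # 3. The detection score is the maximum softmax probability of the
    #    perturbed input at the same temperature; in-distribution images
    #    tend to receive larger scores than out-of-distribution ones.
    with torch.no_grad():
        scaled = model(x_perturbed) / temperature
        scores = F.softmax(scaled, dim=1).max(dim=1).values
    return scores

In use, an input would be flagged as out-of-distribution when its score falls below a threshold chosen on in-distribution validation data (for example, the threshold at which 95% of in-distribution images are accepted).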

Cite

Text

Liang et al. "Enhancing the Reliability of Out-of-Distribution Image Detection in Neural Networks." International Conference on Learning Representations, 2018.

Markdown

[Liang et al. "Enhancing the Reliability of Out-of-Distribution Image Detection in Neural Networks." International Conference on Learning Representations, 2018.](https://mlanthology.org/iclr/2018/liang2018iclr-enhancing/)

BibTeX

@inproceedings{liang2018iclr-enhancing,
  title     = {{Enhancing the Reliability of Out-of-Distribution Image Detection in Neural Networks}},
  author    = {Liang, Shiyu and Li, Yixuan and Srikant, R.},
  booktitle = {International Conference on Learning Representations},
  year      = {2018},
  url       = {https://mlanthology.org/iclr/2018/liang2018iclr-enhancing/}
}