Learning Deep Classifiers Consistent with Fine-Grained Novelty Detection

Abstract

The problem of novelty detection in fine-grained visual classification (FGVC) is considered. An integrated understanding of the probabilistic and distance-based approaches to novelty detection is developed within the framework of convolutional neural networks (CNNs). It is shown that softmax CNN classifiers are inconsistent with novelty detection, because their learned class-conditional distributions and associated distance metrics are unidentifiable. A new regularization constraint, the class-conditional Gaussianity loss, is then proposed to eliminate this unidentifiability, and enforce Gaussian class-conditional distributions. This enables training Novelty Detection Consistent Classifiers (NDCCs) that are jointly optimal for classification and novelty detection. Empirical evaluations show that NDCCs achieve significant improvements over the state-of-the-art on both small- and large-scale FGVC datasets.
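As a reading aid, below is a minimal PyTorch sketch of the kind of objective the abstract describes: cross-entropy combined with a class-conditional Gaussian negative log-likelihood over deep features, with per-class means and a shared diagonal covariance, and novelty scored by the Mahalanobis distance to the nearest class mean. The names (GaussianityRegularizedHead, ndcc_style_loss, lambda_nll) and the diagonal-covariance parameterization are illustrative assumptions based only on the abstract, not the paper's exact formulation.

# Sketch only: a feature extractor trained with cross-entropy plus a
# class-conditional Gaussianity regularizer, so Mahalanobis distance to
# class means can serve as a novelty score at test time. Details such as
# the covariance parameterization and lambda_nll are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GaussianityRegularizedHead(nn.Module):
    """Classifier head with per-class means and a shared diagonal covariance."""

    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.means = nn.Parameter(torch.zeros(num_classes, feat_dim))
        # Log-variance keeps the shared diagonal covariance positive.
        self.log_var = nn.Parameter(torch.zeros(feat_dim))

    def mahalanobis_sq(self, feats):
        # Squared Mahalanobis distance of each feature to every class mean.
        diff = feats.unsqueeze(1) - self.means.unsqueeze(0)  # (B, C, D)
        return (diff.pow(2) / self.log_var.exp()).sum(-1)    # (B, C)


def ndcc_style_loss(head, feats, labels, lambda_nll=0.1):
    """Cross-entropy plus a class-conditional Gaussian NLL term (sketch)."""
    d2 = head.mahalanobis_sq(feats)
    # Logits derived from the Gaussian model: closer to a class mean
    # (in Mahalanobis distance) means a higher logit for that class.
    logits = -0.5 * d2
    ce = F.cross_entropy(logits, labels)
    d2_true = d2.gather(1, labels.unsqueeze(1)).squeeze(1)
    # Gaussian negative log-likelihood of each feature under its own class,
    # up to additive constants.
    nll = 0.5 * (d2_true + head.log_var.sum()).mean()
    return ce + lambda_nll * nll


def novelty_score(head, feats):
    """Novelty score: Mahalanobis distance to the closest class mean."""
    return head.mahalanobis_sq(feats).min(dim=1).values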

Cite

Text

Cheng and Vasconcelos. "Learning Deep Classifiers Consistent with Fine-Grained Novelty Detection." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.00171

Markdown

[Cheng and Vasconcelos. "Learning Deep Classifiers Consistent with Fine-Grained Novelty Detection." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/cheng2021cvpr-learning-a/) doi:10.1109/CVPR46437.2021.00171

BibTeX

@inproceedings{cheng2021cvpr-learning-a,
  title     = {{Learning Deep Classifiers Consistent with Fine-Grained Novelty Detection}},
  author    = {Cheng, Jiacheng and Vasconcelos, Nuno},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2021},
  pages     = {1664--1673},
  doi       = {10.1109/CVPR46437.2021.00171},
  url       = {https://mlanthology.org/cvpr/2021/cheng2021cvpr-learning-a/}
}