The NBNN Kernel

Abstract

Naive Bayes Nearest Neighbor (NBNN) has recently been proposed as a powerful, non-parametric approach to object classification that achieves remarkably good results by avoiding a vector quantization step and using image-to-class comparisons, yielding good generalization. In this paper, we introduce a kernelized version of NBNN. This way, we can learn the classifier in a discriminative setting. Moreover, it then becomes straightforward to combine it with other kernels. In particular, we show that our NBNN kernel is complementary to standard bag-of-features based kernels, focusing on local generalization as opposed to global image composition. By combining them, we achieve state-of-the-art results on the Caltech101 and 15 Scenes datasets. As a side contribution, we also investigate how to speed up the NBNN computations.
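The image-to-class comparison underlying NBNN can be sketched as follows: each local descriptor of the query image is matched to its nearest neighbor in the pooled descriptor set of each class, and the class with the smallest summed distance wins. This is a minimal NumPy sketch of that decision rule (the function name, brute-force nearest-neighbor search, and squared-Euclidean distance are illustrative assumptions, not the paper's optimized implementation):

```python
import numpy as np

def nbnn_classify(query_descriptors, class_descriptors):
    """Classify an image by summed image-to-class nearest-neighbor distances.

    query_descriptors: (n, d) array of local descriptors from the query image.
    class_descriptors: dict mapping class label -> (m_c, d) array that pools
        the descriptors of all training images of that class.
    """
    best_label, best_cost = None, np.inf
    for label, pool in class_descriptors.items():
        # Squared Euclidean distance from every query descriptor to every
        # descriptor in this class's pool (brute force, for clarity only).
        d2 = ((query_descriptors[:, None, :] - pool[None, :, :]) ** 2).sum(-1)
        # Image-to-class distance: sum of per-descriptor nearest-neighbor distances.
        cost = d2.min(axis=1).sum()
        if cost < best_cost:
            best_label, best_cost = label, cost
    return best_label
```

In practice the nearest-neighbor search would use an approximate index rather than this O(n·m) scan; the paper's kernelized variant builds on these per-class distances rather than using the argmin rule directly.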

Cite

Text

Tuytelaars et al. "The NBNN Kernel." IEEE/CVF International Conference on Computer Vision, 2011. doi:10.1109/ICCV.2011.6126449

Markdown

[Tuytelaars et al. "The NBNN Kernel." IEEE/CVF International Conference on Computer Vision, 2011.](https://mlanthology.org/iccv/2011/tuytelaars2011iccv-nbnn/) doi:10.1109/ICCV.2011.6126449

BibTeX

@inproceedings{tuytelaars2011iccv-nbnn,
  title     = {{The NBNN Kernel}},
  author    = {Tuytelaars, Tinne and Fritz, Mario and Saenko, Kate and Darrell, Trevor},
  booktitle = {IEEE/CVF International Conference on Computer Vision},
  year      = {2011},
  pages     = {1824--1831},
  doi       = {10.1109/ICCV.2011.6126449},
  url       = {https://mlanthology.org/iccv/2011/tuytelaars2011iccv-nbnn/}
}