Improving Generalization via Scalable Neighborhood Component Analysis
Abstract
Current visual recognition is dominated by the end-to-end formulation of classification implemented with parametric softmax classifiers. This formulation makes a closed-world assumption with a fixed set of categories, which becomes problematic in open-set scenarios where new categories are encountered with too few examples to learn a generalizable parametric classifier. This paper adopts a non-parametric approach to visual recognition by optimizing feature embeddings instead of parametric classifiers. We use a deep neural network to learn embeddings that preserve neighborhood structures via neighborhood component analysis (NCA). To overcome its computational bottlenecks, we devise a mechanism that uses an augmented memory to scale NCA to large datasets and very deep neural networks. Our experiments show state-of-the-art results on ImageNet classification with nearest neighbor classifiers. More importantly, our feature embedding generalizes better to new categories, as demonstrated on sub-category discovery and few-shot recognition.
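The abstract describes an NCA-style objective computed against stored embeddings of the entire training set rather than a parametric classifier head. As a rough, hypothetical sketch (not the paper's implementation), the snippet below shows a leave-one-out NCA loss over a memory bank in PyTorch; all names (`nca_loss`, `memory`, `tau`) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def nca_loss(features, indices, targets, memory, memory_labels, tau=0.05):
    """Sketch of a leave-one-out NCA loss against a memory bank.

    features:      (B, D) L2-normalized embeddings of the current batch
    indices:       (B,)   positions of these samples in the memory bank
    targets:       (B,)   class labels of the batch
    memory:        (N, D) L2-normalized embeddings of all training samples
    memory_labels: (N,)   class labels of all training samples
    """
    # Pairwise similarities between batch embeddings and all stored embeddings.
    sims = torch.mm(features, memory.t()) / tau                 # (B, N)
    # Exclude each sample's own memory slot (leave-one-out).
    sims.scatter_(1, indices.unsqueeze(1), float('-inf'))
    # Neighbor assignment probabilities p_ij over the whole training set.
    log_p = F.log_softmax(sims, dim=1)
    # p_i: probability of landing on a neighbor of the same class.
    same_class = (memory_labels.unsqueeze(0) == targets.unsqueeze(1)).float()
    p_correct = (log_p.exp() * same_class).sum(dim=1).clamp(min=1e-12)
    return -p_correct.log().mean()
```

A companion step, omitted here for brevity, is writing the freshly computed batch embeddings back into their memory slots after each update so that the stored features track the evolving network.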
Cite
Text
Wu et al. "Improving Generalization via Scalable Neighborhood Component Analysis." Proceedings of the European Conference on Computer Vision (ECCV), 2018. doi:10.1007/978-3-030-01234-2_42
Markdown
[Wu et al. "Improving Generalization via Scalable Neighborhood Component Analysis." Proceedings of the European Conference on Computer Vision (ECCV), 2018.](https://mlanthology.org/eccv/2018/wu2018eccv-improving/) doi:10.1007/978-3-030-01234-2_42
BibTeX
@inproceedings{wu2018eccv-improving,
title = {{Improving Generalization via Scalable Neighborhood Component Analysis}},
author = {Wu, Zhirong and Efros, Alexei A. and Yu, Stella X.},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2018},
doi = {10.1007/978-3-030-01234-2_42},
url = {https://mlanthology.org/eccv/2018/wu2018eccv-improving/}
}