Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure

Abstract

We show how to pretrain and fine-tune a multilayer neural network to learn a nonlinear transformation from the input space to a low-dimensional feature space in which K-nearest neighbour classification performs well. We also show how the nonlinear transformation can be improved using unlabeled data. Our method achieves a much lower error rate than Support Vector Machines or standard backpropagation on a widely used version of the MNIST handwritten digit recognition task. If some of the dimensions of the low-dimensional feature space are not used for nearest neighbour classification, our method uses these dimensions to explicitly represent transformations of the digits that do not affect their identity.
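The evaluation step described in the abstract, K-nearest neighbour classification in the learned low-dimensional code space, can be sketched as follows. This is a minimal illustration only: `embed` is a hypothetical stand-in for the pretrained and fine-tuned network (not reproduced here), and the toy data is invented for the example.

```python
import numpy as np

def knn_predict(codes, labels, query_codes, k=3):
    """Classify each query by majority vote among its k nearest
    neighbours in the low-dimensional code space (Euclidean distance)."""
    preds = []
    for q in query_codes:
        dists = np.linalg.norm(codes - q, axis=1)
        nearest = np.argsort(dists)[:k]
        preds.append(np.bincount(labels[nearest]).argmax())
    return np.array(preds)

# Hypothetical stand-in for the learned network: any nonlinear map
# from the input space to a low-dimensional feature space.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 2))          # 10-D inputs -> 2-D codes

def embed(x):
    return np.tanh(x @ W)

# Toy data: two classes with shifted means in the input space.
x0 = rng.normal(loc=-1.0, size=(20, 10))
x1 = rng.normal(loc=+1.0, size=(20, 10))
X = np.vstack([x0, x1])
y = np.array([0] * 20 + [1] * 20)

codes = embed(X)
pred = knn_predict(codes, y, codes, k=3)
```

In the paper the quality of the embedding, rather than the classifier, does the heavy lifting: once the network maps same-class inputs to nearby codes, even this simple distance-based vote performs well.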

Cite

Text

Salakhutdinov and Hinton. "Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure." Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, 2007.

Markdown

[Salakhutdinov and Hinton. "Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure." Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, 2007.](https://mlanthology.org/aistats/2007/salakhutdinov2007aistats-learning/)

BibTeX

@inproceedings{salakhutdinov2007aistats-learning,
  title     = {{Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure}},
  author    = {Salakhutdinov, Ruslan and Hinton, Geoff},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  year      = {2007},
  pages     = {412--419},
  volume    = {2},
  url       = {https://mlanthology.org/aistats/2007/salakhutdinov2007aistats-learning/}
}