Slow, Decorrelated Features for Pretraining Complex Cell-like Networks

Abstract

We introduce a new type of neural network activation function based on recent physiological rate models for complex cells in visual area V1. A single-hidden-layer network of these units achieves 1.5% error on MNIST. We also apply an existing criterion for learning slow, decorrelated features as a pretraining strategy for image models. This pretraining strategy yields orientation-selective features, similar to the receptive fields of complex cells. With this pretraining, the same single-hidden-layer model achieves lower generalization error, even though the pretraining sample distribution differs substantially from the fine-tuning distribution. To implement this pretraining strategy, we derive a fast algorithm for online learning of decorrelated features in which each iteration runs in time linear in the number of features.
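The slowness-plus-decorrelation idea described in the abstract can be illustrated with a minimal sketch. The following is an assumption-laden toy loss, not the paper's algorithm: it combines a standard slowness term (mean squared temporal difference of features across consecutive frames) with a decorrelation term (squared off-diagonal feature covariance). Note the naive covariance here is quadratic in the number of features, whereas the paper derives an online algorithm that is linear per iteration.

```python
import numpy as np

def slow_decorrelated_loss(H, lam=1.0):
    """Toy pretraining objective (illustrative sketch, not the paper's method).

    H : (T, d) array of d features computed on T consecutive video frames.
    lam : weight on the decorrelation penalty.
    """
    T, d = H.shape
    # Slowness: features on adjacent frames should change little.
    slowness = np.mean((H[1:] - H[:-1]) ** 2)
    # Decorrelation: penalize off-diagonal entries of the feature covariance,
    # discouraging features from collapsing onto the same (slow) solution.
    Hc = H - H.mean(axis=0)
    C = (Hc.T @ Hc) / T
    off_diag = C - np.diag(np.diag(C))
    decorr = np.sum(off_diag ** 2) / (d * (d - 1)) if d > 1 else 0.0
    return slowness + lam * decorr
```

Constant features drive the slowness term to zero, while duplicated features are caught by the decorrelation term; minimizing both jointly is what pushes the hidden units toward distinct, temporally stable (in the paper's experiments, orientation-selective) filters.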

Cite

Text

Bengio and Bergstra. "Slow, Decorrelated Features for Pretraining Complex Cell-like Networks." Neural Information Processing Systems, 2009.

Markdown

[Bengio and Bergstra. "Slow, Decorrelated Features for Pretraining Complex Cell-like Networks." Neural Information Processing Systems, 2009.](https://mlanthology.org/neurips/2009/bengio2009neurips-slow/)

BibTeX

@inproceedings{bengio2009neurips-slow,
  title     = {{Slow, Decorrelated Features for Pretraining Complex Cell-like Networks}},
  author    = {Bengio, Yoshua and Bergstra, James S.},
  booktitle = {Neural Information Processing Systems},
  year      = {2009},
  pages     = {99-107},
  url       = {https://mlanthology.org/neurips/2009/bengio2009neurips-slow/}
}