Learning Temporally Persistent Hierarchical Representations
Abstract
A biologically motivated model of cortical self-organization is proposed. Context is combined with bottom-up information via a maximum likelihood cost function. Clusters of one or more units are modulated by a common contextual gating signal; they thereby organize themselves into mutually supportive predictors of abstract contextual features. The model was tested on its ability to discover viewpoint-invariant classes in a set of real image sequences of centered, gradually rotating faces. It performed considerably better than supervised back-propagation at generalizing to novel views from a small number of training examples.
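The abstract does not give the cost function explicitly, but the idea of clusters whose softmax activations are modulated by a shared contextual gate, trained to be temporally persistent under a maximum-likelihood-style objective, can be sketched as follows. All names (`cluster_activations`, `temporal_ml_cost`) and the specific temporal-coherence objective are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster_activations(x, W, gate):
    """Softmax activations of one cluster of units, modulated by a
    common contextual gating signal (gate) and renormalized."""
    logits = W @ x
    p = np.exp(logits - logits.max())
    p /= p.sum()
    g = p * gate          # contextual modulation of the whole cluster
    return g / g.sum()    # renormalize to a distribution over units

def temporal_ml_cost(acts):
    """Negative log-likelihood that consecutive frames activate the same
    unit -- a temporal-coherence stand-in (an assumption) for the paper's
    maximum likelihood cost. Lower cost = more persistent representation."""
    cost = 0.0
    for p_t, p_next in zip(acts[:-1], acts[1:]):
        cost -= np.log(p_t @ p_next + 1e-12)
    return cost

# Toy demo: three frames of random input through one 4-unit cluster.
W = rng.normal(size=(4, 8))       # bottom-up weights (hypothetical shapes)
gate = np.ones(4)                 # neutral contextual gate
frames = [rng.normal(size=8) for _ in range(3)]
acts = [cluster_activations(x, W, gate) for x in frames]
print(temporal_ml_cost(acts))
```

On slowly rotating face sequences, minimizing such a cost would push units in a cluster to respond consistently across adjacent views, which is one way to read the "temporally persistent" objective of the title.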
Cite

Becker, Suzanna. "Learning Temporally Persistent Hierarchical Representations." Neural Information Processing Systems, 1996.

BibTeX
@inproceedings{becker1996neurips-learning,
title = {{Learning Temporally Persistent Hierarchical Representations}},
author = {Becker, Suzanna},
booktitle = {Neural Information Processing Systems},
year = {1996},
pages = {824-830},
url = {https://mlanthology.org/neurips/1996/becker1996neurips-learning/}
}