Hierarchies of Adaptive Experts
Abstract
In this paper we present a neural network architecture that discovers a recursive decomposition of its input space. Based on a generalization of the modular architecture of Jacobs, Jordan, Nowlan, and Hinton (1991), the architecture uses competition among networks to recursively split the input space into nested regions and to learn separate associative mappings within each region. The learning algorithm is shown to perform gradient ascent in a log likelihood function that captures the architecture's hierarchical structure.
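As a concrete illustration of the architecture described in the abstract, the sketch below shows a two-level hierarchy with softmax gating networks and linear experts, trained by online gradient ascent on the hierarchical log likelihood. This is a minimal sketch in NumPy, not the authors' code: the class name, weight shapes, learning rate, and the unit-variance Gaussian error model are assumptions made for this example.

```python
# A minimal sketch (not the authors' code) of a two-level hierarchical mixture
# of experts: softmax gating networks, linear experts, and online gradient
# ascent on the hierarchical log likelihood. Shapes, learning rate, and the
# unit-variance Gaussian error model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class HierarchicalMixtureOfExperts:
    def __init__(self, n_in, n_out, n_top, n_sub):
        # One top-level gate, one nested gate per top branch, one linear expert per leaf.
        self.Wg = rng.normal(0.0, 0.1, (n_top, n_in))               # top gating weights
        self.Wsub = rng.normal(0.0, 0.1, (n_top, n_sub, n_in))      # nested gating weights
        self.We = rng.normal(0.0, 0.1, (n_top, n_sub, n_out, n_in)) # expert weights

    def forward(self, x):
        g = softmax(self.Wg @ x)                              # top gate outputs g_i
        gsub = np.stack([softmax(W @ x) for W in self.Wsub])  # nested gate outputs g_{j|i}
        mu = np.einsum('ijok,k->ijo', self.We, x)             # expert outputs y_ij
        return g, gsub, mu

    def step(self, x, y, lr=0.1):
        # One gradient-ascent step on  ln sum_i g_i sum_j g_{j|i} exp(-0.5 ||y - y_ij||^2).
        g, gsub, mu = self.forward(x)
        lik = np.exp(-0.5 * ((y - mu) ** 2).sum(axis=-1))   # per-leaf Gaussian likelihoods
        joint = g[:, None] * gsub * lik                     # g_i * g_{j|i} * P_ij
        total = joint.sum()
        h = joint / total                                   # leaf posteriors h_{ij}
        h_top = h.sum(axis=1)                               # branch posteriors h_i
        # Gating gradients: posterior minus prior, times the input.
        self.Wg += lr * np.outer(h_top - g, x)
        for i in range(len(g)):
            self.Wsub[i] += lr * np.outer(h[i] - h_top[i] * gsub[i], x)
            for j in range(gsub.shape[1]):
                # Expert gradients: posterior-weighted prediction error, times the input.
                self.We[i, j] += lr * h[i, j] * np.outer(y - mu[i, j], x)
        return np.log(total)   # per-pattern log likelihood

if __name__ == "__main__":
    # Toy piecewise-linear regression; the summed log likelihood should increase.
    hme = HierarchicalMixtureOfExperts(n_in=2, n_out=1, n_top=2, n_sub=2)
    xs = rng.uniform(-1.0, 1.0, (200, 1))
    ys = np.abs(xs)
    for epoch in range(20):
        ll = sum(hme.step(np.append(x, 1.0), y) for x, y in zip(xs, ys))
    print(f"final log likelihood: {ll:.3f}")
```

In this sketch the posterior weights h_i and h_{ij} implement the competition among networks: each gating and expert update is scaled by that branch's posterior responsibility for the training pattern, so different leaves come to specialize in different nested regions of the input space.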
Cite
Text
Jordan and Jacobs. "Hierarchies of Adaptive Experts." Neural Information Processing Systems, 1991.
Markdown
[Jordan and Jacobs. "Hierarchies of Adaptive Experts." Neural Information Processing Systems, 1991.](https://mlanthology.org/neurips/1991/jordan1991neurips-hierarchies/)
BibTeX
@inproceedings{jordan1991neurips-hierarchies,
title = {{Hierarchies of Adaptive Experts}},
author = {Jordan, Michael I. and Jacobs, Robert A.},
booktitle = {Neural Information Processing Systems},
year = {1991},
pages = {985-992},
url = {https://mlanthology.org/neurips/1991/jordan1991neurips-hierarchies/}
}