Constructive Algorithms for Hierarchical Mixtures of Experts

Abstract

We present two additions to the hierarchical mixture of experts (HME) architecture. First, by applying a likelihood splitting criterion to each expert in the HME, we "grow" the tree adaptively during training. Second, by considering only the most probable path through the tree, we may "prune" branches away, either temporarily, or permanently if they become redundant. We demonstrate results for the growing and path pruning algorithms which show significant speed-ups and more efficient use of parameters over the standard fixed structure in discriminating between two interlocking spirals and classifying 8-bit parity patterns.
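The path pruning idea described above can be illustrated with a minimal sketch. The code below is a hypothetical two-level HME (not the authors' implementation): each internal node holds a softmax gating network and each leaf is a linear expert. `predict_full` evaluates the standard HME mixture over all experts, while `predict_path` follows only the most probable branch at each gate, so a single root-to-leaf path is evaluated. All class and method names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

class Expert:
    """Leaf node: a simple linear expert."""
    def __init__(self, dim):
        self.w = rng.normal(size=dim)
    def predict(self, x):
        return float(self.w @ x)

class Gate:
    """Internal node: softmax gate over its children."""
    def __init__(self, dim, children):
        self.v = rng.normal(size=(len(children), dim))
        self.children = children
    def gate_probs(self, x):
        return softmax(self.v @ x)
    def predict_full(self, x):
        # Standard HME output: gate-weighted mixture over all subtrees.
        g = self.gate_probs(x)
        outs = [c.predict_full(x) if isinstance(c, Gate) else c.predict(x)
                for c in self.children]
        return float(np.dot(g, outs))
    def predict_path(self, x):
        # Path pruning: descend only into the most probable branch.
        c = self.children[int(np.argmax(self.gate_probs(x)))]
        return c.predict_path(x) if isinstance(c, Gate) else c.predict(x)

dim = 3
tree = Gate(dim, [Gate(dim, [Expert(dim), Expert(dim)]),
                  Gate(dim, [Expert(dim), Expert(dim)])])
x = rng.normal(size=dim)
full = tree.predict_full(x)   # mixes all four experts
fast = tree.predict_path(x)   # evaluates one root-to-leaf path
```

Because `predict_path` touches only one expert per input, its cost grows with tree depth rather than tree size, which is the source of the speed-ups the abstract reports.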

Cite

Text

Waterhouse and Robinson. "Constructive Algorithms for Hierarchical Mixtures of Experts." Neural Information Processing Systems, 1995.

Markdown

[Waterhouse and Robinson. "Constructive Algorithms for Hierarchical Mixtures of Experts." Neural Information Processing Systems, 1995.](https://mlanthology.org/neurips/1995/waterhouse1995neurips-constructive/)

BibTeX

@inproceedings{waterhouse1995neurips-constructive,
  title     = {{Constructive Algorithms for Hierarchical Mixtures of Experts}},
  author    = {Waterhouse, Steve R. and Robinson, Anthony J.},
  booktitle = {Neural Information Processing Systems},
  year      = {1995},
  pages     = {584-590},
  url       = {https://mlanthology.org/neurips/1995/waterhouse1995neurips-constructive/}
}