Adaptively Growing Hierarchical Mixtures of Experts

Abstract

We propose a novel approach to automatically growing and pruning Hierarchical Mixtures of Experts. The constructive algorithm proposed here enables large hierarchies consisting of several hundred experts to be trained effectively. We show that HMEs trained by our automatic growing procedure yield better generalization performance than traditional static and balanced hierarchies. Evaluation of the algorithm is performed (1) on vowel classification and (2) within a hybrid version of the JANUS [9] speech recognition system using a subset of the Switchboard large-vocabulary speaker-independent continuous speech recognition database.
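To illustrate the general idea of constructively growing a hierarchy of experts, the sketch below builds a binary tree of linear experts and repeatedly splits the weakest leaf into a gated pair of new experts. This is only a rough structural sketch under simplified assumptions, not the authors' procedure: the paper scores experts by their contribution to the likelihood and trains the hierarchy with EM, whereas here the split criterion is plain squared error, training is omitted, and all class and function names (Expert, Gate, grow_weakest, etc.) are illustrative.

```python
import numpy as np

class Expert:
    """Leaf node: a simple linear model y ~= X @ W (training omitted for brevity)."""
    def __init__(self, dim_in, dim_out, rng):
        self.W = 0.01 * rng.standard_normal((dim_in, dim_out))

    def predict(self, X):
        return X @ self.W

class Gate:
    """Internal node: a sigmoid gate that softly mixes two children."""
    def __init__(self, dim_in, left, right, rng):
        self.v = 0.01 * rng.standard_normal(dim_in)
        self.children = [left, right]

    def predict(self, X):
        g = 1.0 / (1.0 + np.exp(-(X @ self.v)))   # P(left branch | x)
        y_left = self.children[0].predict(X)
        y_right = self.children[1].predict(X)
        return g[:, None] * y_left + (1.0 - g[:, None]) * y_right

def collect_leaves(node):
    """Return all expert leaves in the tree."""
    if isinstance(node, Expert):
        return [node]
    return [leaf for child in node.children for leaf in collect_leaves(child)]

def replace_node(node, old, new):
    """Return the tree with `old` replaced by `new`."""
    if node is old:
        return new
    if isinstance(node, Gate):
        node.children = [replace_node(c, old, new) for c in node.children]
    return node

def grow_weakest(root, X, Y, rng):
    """Split the leaf with the largest squared error into a gated pair of experts.
    (Illustrative criterion only; the paper uses likelihood-based scores.)"""
    leaves = collect_leaves(root)
    errors = [np.mean((leaf.predict(X) - Y) ** 2) for leaf in leaves]
    weakest = leaves[int(np.argmax(errors))]
    new_subtree = Gate(X.shape[1],
                       Expert(X.shape[1], Y.shape[1], rng),
                       Expert(X.shape[1], Y.shape[1], rng),
                       rng)
    return replace_node(root, weakest, new_subtree)

# Usage: start from a single expert and grow the hierarchy a few times.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
Y = np.sin(X[:, :1])
tree = Expert(4, 1, rng)
for _ in range(3):
    tree = grow_weakest(tree, X, Y, rng)
```

In a full implementation each growth step would be followed by (re)training the gates and experts, e.g. with EM, before the next split is chosen.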

Cite

Text

Fritsch et al. "Adaptively Growing Hierarchical Mixtures of Experts." Neural Information Processing Systems, 1996.

Markdown

[Fritsch et al. "Adaptively Growing Hierarchical Mixtures of Experts." Neural Information Processing Systems, 1996.](https://mlanthology.org/neurips/1996/fritsch1996neurips-adaptively/)

BibTeX

@inproceedings{fritsch1996neurips-adaptively,
  title     = {{Adaptively Growing Hierarchical Mixtures of Experts}},
  author    = {Fritsch, Jürgen and Finke, Michael and Waibel, Alex},
  booktitle = {Neural Information Processing Systems},
  year      = {1996},
  pages     = {459--465},
  url       = {https://mlanthology.org/neurips/1996/fritsch1996neurips-adaptively/}
}