Boosted Mixture of Experts: An Ensemble Learning Scheme

Abstract

We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to classification, or a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set to different experts in a boostlike manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and a digit recognition task from the NIST database and compared with classical ensemble approaches.
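The core idea of the abstract — expert outputs weighted by a dynamic, input-dependent combination model rather than a fixed vote — can be sketched as follows. This is an illustrative mixture-of-experts combination with stand-in linear experts and gate (all weights random here), not the authors' exact model or training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

n_experts, n_classes, dim = 3, 4, 5
x = rng.normal(size=dim)                       # a single input vector

# Hypothetical linear experts and gating network (random, illustrative only)
expert_W = rng.normal(size=(n_experts, n_classes, dim))
gate_W = rng.normal(size=(n_experts, dim))

# Each expert outputs a class distribution for this input
expert_probs = softmax(expert_W @ x, axis=-1)  # shape (n_experts, n_classes)

# The gate assigns input-dependent weights to the experts -- this is what
# makes the combination "dynamic" rather than a fixed average or vote
gate = softmax(gate_W @ x)                     # shape (n_experts,)

# Ensemble output: gate-weighted mixture of the expert distributions
combined = gate @ expert_probs                 # shape (n_classes,)
prediction = combined.argmax()
```

Because the gate weights sum to one and each expert emits a valid class distribution, `combined` is itself a valid distribution over classes; the boost-like element of the paper lies in how the experts' training distributions are initialized, which this sketch does not model.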

Cite

Text

Avnimelech and Intrator. "Boosted Mixture of Experts: An Ensemble Learning Scheme." Neural Computation, 1999. doi:10.1162/089976699300016737

Markdown

[Avnimelech and Intrator. "Boosted Mixture of Experts: An Ensemble Learning Scheme." Neural Computation, 1999.](https://mlanthology.org/neco/1999/avnimelech1999neco-boosted/) doi:10.1162/089976699300016737

BibTeX

@article{avnimelech1999neco-boosted,
  title     = {{Boosted Mixture of Experts: An Ensemble Learning Scheme}},
  author    = {Avnimelech, Ran and Intrator, Nathan},
  journal   = {Neural Computation},
  year      = {1999},
  pages     = {483--497},
  doi       = {10.1162/089976699300016737},
  volume    = {11},
  url       = {https://mlanthology.org/neco/1999/avnimelech1999neco-boosted/}
}