Mixture of Experts Classification Using a Hierarchical Mixture Model
Abstract
A three-level hierarchical mixture model for classification is presented that models the following data generation process: (1) the data are generated by a finite number of sources (clusters), and (2) each source in turn generates its data through individual class-labeled internal sources (subclusters of the external cluster). The model estimates the posterior probability of class membership in the manner of a mixture of experts classifier. To learn the parameters of the model, we have developed a general maximum likelihood training approach that yields two efficient training algorithms. Compared with other classification mixture models, the proposed hierarchical model exhibits several advantages and provides improved classification performance, as indicated by the experimental results.
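As a rough illustration of the generative structure the abstract describes, the following Python sketch computes the class posterior of a three-level hierarchy: cluster j, then a class-labeled subcluster (j, y) inside it, then the observation x. This is not the authors' code; the Gaussian subcluster densities, the single-subcluster-per-class simplification, and all parameter names are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical parameterization of the three-level hierarchy: J clusters,
# each holding one class-labeled Gaussian subcluster per class.
rng = np.random.default_rng(0)
n_clusters, n_classes, dim = 3, 2, 2

cluster_priors = rng.dirichlet(np.ones(n_clusters))           # P(j)
class_priors = rng.dirichlet(np.ones(n_classes), n_clusters)  # P(y | j)
means = rng.normal(size=(n_clusters, n_classes, dim))         # subcluster means
covs = np.tile(np.eye(dim), (n_clusters, n_classes, 1, 1))    # subcluster covariances

def class_posterior(x):
    """P(y | x) under the hierarchy: cluster j -> subcluster (j, y) -> x."""
    # joint[j, y] = P(j) * P(y | j) * p(x | j, y)
    joint = np.array([
        [cluster_priors[j] * class_priors[j, y]
         * multivariate_normal.pdf(x, means[j, y], covs[j, y])
         for y in range(n_classes)]
        for j in range(n_clusters)
    ])
    # Marginalize out the cluster index, then normalize over classes;
    # this is the mixture-of-experts form of the classifier output.
    return joint.sum(axis=0) / joint.sum()

print(class_posterior(np.zeros(dim)))  # posterior over the two classes
```

Training the actual model would fit these parameters by maximum likelihood (e.g., with EM-style updates corresponding to the two algorithms mentioned in the abstract), which this sketch does not attempt.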
Cite
Text
Titsias and Likas. "Mixture of Experts Classification Using a Hierarchical Mixture Model." Neural Computation, 2002. doi:10.1162/089976602320264060

Markdown
[Titsias and Likas. "Mixture of Experts Classification Using a Hierarchical Mixture Model." Neural Computation, 2002.](https://mlanthology.org/neco/2002/titsias2002neco-mixture/) doi:10.1162/089976602320264060

BibTeX
@article{titsias2002neco-mixture,
title = {{Mixture of Experts Classification Using a Hierarchical Mixture Model}},
author = {Titsias, Michalis K. and Likas, Aristidis},
journal = {Neural Computation},
year = {2002},
  pages = {2221--2244},
doi = {10.1162/089976602320264060},
volume = {14},
url = {https://mlanthology.org/neco/2002/titsias2002neco-mixture/}
}