Quadratically Gated Mixture of Experts for Incomplete Data Classification

Abstract

We introduce the quadratically gated mixture of experts (QGME), a statistical model for multi-class nonlinear classification. The QGME is formulated in the setting of incomplete data, where the data values are only partially observed. We show that the missing values entail joint estimation of the data manifold and the classifier, which allows adaptive imputation during classifier learning. The expectation-maximization (EM) algorithm is derived for joint likelihood maximization, with adaptive imputation performed analytically in the E-step. The performance of QGME is evaluated on three benchmark data sets, and the results show that QGME yields significant improvements over competing methods.
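The core idea of analytic imputation in the E-step can be illustrated with a toy sketch: for a Gaussian mixture fit to data with missing entries, the E-step computes responsibilities from the observed dimensions only and replaces each missing value by its conditional Gaussian mean, averaged over components. This is a simplified, hypothetical illustration of the imputation mechanism only — the full QGME additionally learns per-expert classifiers and the quadratic gating, which this sketch omits; function and variable names are our own.

```python
import numpy as np

def em_gmm_missing(X, K, n_iter=50, seed=0):
    """Toy EM for a Gaussian mixture on data with missing values (NaN).

    Missing entries are imputed analytically in the E-step via the
    conditional Gaussian mean, in the spirit of QGME's adaptive
    imputation. Illustrative sketch only, not the paper's full model.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    obs = ~np.isnan(X)

    # Initialize by filling missing entries with column means.
    Xf = X.copy()
    col_means = np.nanmean(X, axis=0)
    for j in range(d):
        Xf[~obs[:, j], j] = col_means[j]

    pi = np.full(K, 1.0 / K)
    mu = Xf[rng.choice(n, K, replace=False)].copy()
    Sigma = np.array([np.cov(Xf.T) + 1e-3 * np.eye(d) for _ in range(K)])

    for _ in range(n_iter):
        # E-step part 1: responsibilities from observed dimensions only.
        logr = np.zeros((n, K))
        for k in range(K):
            for i in range(n):
                o = obs[i]
                diff = X[i, o] - mu[k][o]
                S = Sigma[k][np.ix_(o, o)]
                logr[i, k] = (np.log(pi[k])
                              - 0.5 * np.linalg.slogdet(2 * np.pi * S)[1]
                              - 0.5 * diff @ np.linalg.solve(S, diff))
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)

        # E-step part 2: analytic imputation. Each missing block is set
        # to its conditional mean given the observed block, averaged
        # over components with responsibility weights.
        Xf = X.copy()
        for i in range(n):
            m = ~obs[i]
            if not m.any():
                continue
            o = obs[i]
            xm = np.zeros(m.sum())
            for k in range(K):
                S_oo = Sigma[k][np.ix_(o, o)]
                S_mo = Sigma[k][np.ix_(m, o)]
                cond = mu[k][m] + S_mo @ np.linalg.solve(
                    S_oo, X[i, o] - mu[k][o])
                xm += r[i, k] * cond
            Xf[i, m] = xm

        # M-step on imputed data (simplified: ignores the conditional
        # covariance correction a full derivation would include).
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r.T @ Xf) / Nk[:, None]
        for k in range(K):
            diff = Xf - mu[k]
            Sigma[k] = ((r[:, k, None] * diff).T @ diff / Nk[k]
                        + 1e-6 * np.eye(d))
    return pi, mu, Sigma, Xf
```

Because imputation and parameter estimation alternate within the same EM loop, the filled-in values adapt as the mixture components sharpen, which is the "joint estimation of the data manifold and the classifier" the abstract describes.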

Cite

Text

Liao et al. "Quadratically Gated Mixture of Experts for Incomplete Data Classification." International Conference on Machine Learning, 2007. doi:10.1145/1273496.1273566

Markdown

[Liao et al. "Quadratically Gated Mixture of Experts for Incomplete Data Classification." International Conference on Machine Learning, 2007.](https://mlanthology.org/icml/2007/liao2007icml-quadratically/) doi:10.1145/1273496.1273566

BibTeX

@inproceedings{liao2007icml-quadratically,
  title     = {{Quadratically Gated Mixture of Experts for Incomplete Data Classification}},
  author    = {Liao, Xuejun and Li, Hui and Carin, Lawrence},
  booktitle = {International Conference on Machine Learning},
  year      = {2007},
  pages     = {553--560},
  doi       = {10.1145/1273496.1273566},
  url       = {https://mlanthology.org/icml/2007/liao2007icml-quadratically/}
}