Classifying with Gaussian Mixtures and Clusters

Abstract

In this paper, we derive classifiers that are winner-take-all (WTA) approximations to a Bayes classifier with Gaussian mixtures for the class-conditional densities. The derived classifiers include clustering-based algorithms such as LVQ and K-means. We propose a constrained-rank Gaussian mixture model and derive a WTA algorithm for it. Our experiments with two speech classification tasks indicate that the constrained-rank model and the WTA approximations improve performance over the unconstrained models.
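The relationship the abstract describes can be illustrated with a minimal sketch: a Bayes classifier sums the weighted component densities of each class's Gaussian mixture, while the WTA approximation keeps only the best-matching component. The model parameters below are hypothetical toy values, not from the paper.

```python
import math

def gauss(x, mu, var):
    # Spherical Gaussian density N(x; mu, var*I).
    d2 = sum((a - b) ** 2 for a, b in zip(x, mu))
    return math.exp(-0.5 * d2 / var) / ((2 * math.pi * var) ** (len(x) / 2))

def bayes_score(x, mix):
    # Full Bayes: prior times the SUM of weighted component densities.
    return mix["prior"] * sum(w * gauss(x, mu, v) for w, mu, v in mix["components"])

def wta_score(x, mix):
    # Winner-take-all: the sum is replaced by the MAX, so only the
    # best-matching ("winning") component scores the input.
    return mix["prior"] * max(w * gauss(x, mu, v) for w, mu, v in mix["components"])

def classify(x, mixtures, score=bayes_score):
    # Pick the class whose mixture assigns the highest score to x.
    return max(mixtures, key=lambda c: score(x, mixtures[c]))

# Toy two-class model: each class-conditional density is a
# two-component spherical mixture (illustrative parameters only).
mixtures = {
    "A": {"prior": 0.5, "components": [(0.5, (0.0, 0.0), 0.5),
                                       (0.5, (2.0, 0.0), 0.5)]},
    "B": {"prior": 0.5, "components": [(0.5, (0.0, 3.0), 0.5),
                                       (0.5, (2.0, 3.0), 0.5)]},
}
```

On points that lie clearly within one class's components, the WTA decision coincides with the full Bayes decision, which is the sense in which WTA is an approximation.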

Cite

Text

Kambhatla and Leen. "Classifying with Gaussian Mixtures and Clusters." Neural Information Processing Systems, 1994.

Markdown

[Kambhatla and Leen. "Classifying with Gaussian Mixtures and Clusters." Neural Information Processing Systems, 1994.](https://mlanthology.org/neurips/1994/kambhatla1994neurips-classifying/)

BibTeX

@inproceedings{kambhatla1994neurips-classifying,
  title     = {{Classifying with Gaussian Mixtures and Clusters}},
  author    = {Kambhatla, Nanda and Leen, Todd K.},
  booktitle = {Neural Information Processing Systems},
  year      = {1994},
  pages     = {681--688},
  url       = {https://mlanthology.org/neurips/1994/kambhatla1994neurips-classifying/}
}