Learning Mixtures of Linear Classifiers

Abstract

We consider a discriminative learning (regression) problem, whereby the regression function is a convex combination of k linear classifiers. Existing approaches are based on the EM algorithm, or similar techniques, without provable guarantees. We develop a simple method based on spectral techniques and a ‘mirroring’ trick that discovers the subspace spanned by the classifiers’ parameter vectors. Under a probabilistic assumption on the feature vector distribution, we prove that this approach has nearly optimal statistical efficiency.
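The sketch below is a minimal illustration of the general spectral idea described in the abstract, not the paper's algorithm: it recovers the span of the classifiers' parameter vectors from a simple second-moment statistic, under extra simplifying assumptions (Gaussian features and biased threshold classifiers y = 1{⟨w_a, x⟩ > τ} with τ > 0, a common threshold chosen here for illustration). The paper's 'mirroring' trick is precisely what handles the symmetric case where this naive moment is degenerate; it is not reproduced here.

```python
# Illustrative sketch only; names, the threshold-classifier link, and the
# moment estimator below are assumptions, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
d, k, n, tau = 20, 3, 200_000, 0.5

# Ground-truth mixture: k unit-norm classifiers, uniform mixture weights.
W = rng.standard_normal((k, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)

# Generate data: draw a component per sample, label with its biased threshold rule.
X = rng.standard_normal((n, d))
comp = rng.integers(k, size=n)
y = (np.einsum('ij,ij->i', X, W[comp]) > tau).astype(float)

# Spectral estimate: for Gaussian x, M = E[y x x^T] - E[y] I is a positive
# combination of w_a w_a^T when tau > 0, so its top-k eigenvectors span an
# estimate of span{w_a}.
M = (X * y[:, None]).T @ X / n - y.mean() * np.eye(d)
eigvals, eigvecs = np.linalg.eigh(M)
U_est = eigvecs[:, -k:]  # top-k eigenvectors (eigh returns ascending order)

# Compare with the true span via principal angles (cosines near 1 = aligned).
U_true, _ = np.linalg.qr(W.T)
cosines = np.linalg.svd(U_true.T @ U_est, compute_uv=False)
print("principal-angle cosines:", np.round(cosines, 3))
```

With n large relative to d, the printed cosines should all be close to 1, indicating that the estimated subspace aligns with span{w_1, ..., w_k}.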

Cite

Text

Sun et al. "Learning Mixtures of Linear Classifiers." International Conference on Machine Learning, 2014.

Markdown

[Sun et al. "Learning Mixtures of Linear Classifiers." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/sun2014icml-learning/)

BibTeX

@inproceedings{sun2014icml-learning,
  title     = {{Learning Mixtures of Linear Classifiers}},
  author    = {Sun, Yuekai and Ioannidis, Stratis and Montanari, Andrea},
  booktitle = {International Conference on Machine Learning},
  year      = {2014},
  pages     = {721--729},
  volume    = {32},
  url       = {https://mlanthology.org/icml/2014/sun2014icml-learning/}
}