A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates

Abstract

We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, and provide upper bounds on the recovery errors for both arbitrary noise and stochastic noise settings. We also give matching minimax lower bounds (up to log factors), showing that under certain assumptions, our algorithm is information-theoretically optimal. Ours is the first (and currently the only known) tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity.
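For context, the two-component mixed linear regression model the abstract refers to can be sketched as follows (the notation here is ours and may differ from the paper's exact formulation): each sample $(x_i, y_i)$ obeys

$$ y_i = z_i \langle x_i, \beta^*_1 \rangle + (1 - z_i) \langle x_i, \beta^*_2 \rangle + e_i, \qquad z_i \in \{0, 1\} \text{ unobserved}, $$

where $e_i$ is either adversarial or stochastic noise and the goal is to estimate the regressors $\beta^*_1$ and $\beta^*_2$ without knowing which component generated each sample.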

Cite

Text

Chen et al. "A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates." Annual Conference on Computational Learning Theory, 2014.

Markdown

[Chen et al. "A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates." Annual Conference on Computational Learning Theory, 2014.](https://mlanthology.org/colt/2014/chen2014colt-convex/)

BibTeX

@inproceedings{chen2014colt-convex,
  title     = {{A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates}},
  author    = {Chen, Yudong and Yi, Xinyang and Caramanis, Constantine},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2014},
  pages     = {560--604},
  url       = {https://mlanthology.org/colt/2014/chen2014colt-convex/}
}