Robust Regression and Lasso
Abstract
We consider robust least-squares regression with feature-wise disturbance. We show that this formulation leads to tractable convex optimization problems, and we exhibit a particular uncertainty set for which the robust problem is equivalent to $\ell_1$ regularized regression (Lasso). This provides an interpretation of Lasso from a robust optimization perspective. We generalize this robust formulation to more general uncertainty sets, all of which lead to tractable convex optimization problems. We thus provide a new methodology for designing regression algorithms, which generalizes known formulations. The advantage is that robustness to disturbance is a physical property that can be exploited: in addition to obtaining new formulations, we use it directly to show sparsity properties of Lasso, as well as to prove a general consistency result for robust regression problems, including Lasso, from a unified robustness perspective.
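As a rough sketch of the equivalence stated above (the notation $y$, $A$, $\beta$, and the column-wise bounds $c_i$ is assumed here for illustration; see the paper for the precise uncertainty set), the feature-wise robust least-squares problem collapses to an $\ell_1$-penalized least-squares problem:

% Illustrative only: robust regression over column-wise bounded disturbances
% equals least squares with a weighted l1 penalty (unsquared residual norm).
\[
  \min_{\beta}\;
  \max_{\substack{\Delta A = (\delta_1,\dots,\delta_m) \\ \|\delta_i\|_2 \le c_i,\; i=1,\dots,m}}
  \bigl\| y - (A + \Delta A)\beta \bigr\|_2
  \;=\;
  \min_{\beta}\;
  \Bigl( \| y - A\beta \|_2 + \sum_{i=1}^{m} c_i |\beta_i| \Bigr).
\]

With uniform bounds $c_i = \lambda$, the right-hand side is the Lasso objective (with the residual norm unsquared).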
Cite
Text
Xu et al. "Robust Regression and Lasso." Neural Information Processing Systems, 2008.
Markdown
[Xu et al. "Robust Regression and Lasso." Neural Information Processing Systems, 2008.](https://mlanthology.org/neurips/2008/xu2008neurips-robust/)
BibTeX
@inproceedings{xu2008neurips-robust,
title = {{Robust Regression and Lasso}},
author = {Xu, Huan and Caramanis, Constantine and Mannor, Shie},
booktitle = {Neural Information Processing Systems},
year = {2008},
pages = {1801-1808},
url = {https://mlanthology.org/neurips/2008/xu2008neurips-robust/}
}