A Practical Scheme and Fast Algorithm to Tune the Lasso with Optimality Guarantees

Abstract

We introduce a novel scheme for choosing the regularization parameter in high-dimensional linear regression with the Lasso. This scheme, inspired by Lepski's method for bandwidth selection in non-parametric regression, is equipped with both optimal finite-sample guarantees and a fast algorithm. In particular, for any design matrix such that the Lasso has low sup-norm error under an "oracle choice" of the regularization parameter, we show that our method matches the oracle performance up to a small constant factor and can be implemented by performing simple tests along a single Lasso path. By applying the Lasso to simulated and real data, we find that our scheme can be faster and more accurate than standard schemes such as cross-validation.
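As a rough illustration of the idea, the Python sketch below implements a Lepski-type pairwise test along a single Lasso path: walking from the largest regularization parameter downward, it keeps the smallest lambda whose estimate stays within sup-norm distance C * (lambda + lambda') of every estimate computed at a larger lambda'. The constant C, the grid size, and the use of scikit-learn's lasso_path are illustrative assumptions, not the paper's calibrated procedure.

import numpy as np
from sklearn.linear_model import lasso_path

def av_tune(X, y, C=0.75, n_lambdas=100):
    # Compute the whole Lasso path once; lambdas come back in
    # decreasing order and coefs holds one coefficient vector per
    # lambda (shape: n_features x n_lambdas).
    lambdas, coefs, _ = lasso_path(X, y, n_alphas=n_lambdas)
    for j in range(1, len(lambdas)):
        lam, beta = lambdas[j], coefs[:, j]
        # Lepski-type test: compare the current estimate, in sup-norm,
        # against every estimate at a larger lambda on the path.
        if any(np.max(np.abs(beta - coefs[:, k])) > C * (lam + lambdas[k])
               for k in range(j)):
            # First failure while walking down the path: fall back to
            # the previous (smallest passing) lambda.
            return lambdas[j - 1]
    return lambdas[-1]  # every test passed: take the smallest lambda

# Example usage on synthetic sparse data (illustrative only):
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 200))
beta_true = np.zeros(200)
beta_true[:5] = 1.0
y = X @ beta_true + 0.5 * rng.standard_normal(100)
print("selected lambda:", av_tune(X, y))

Because every test reuses coefficients from one path computation, the selection adds only cheap sup-norm comparisons on top of a single Lasso path fit, which is where the speed advantage over cross-validation comes from.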

Cite

Text

Chichignoud et al. "A Practical Scheme and Fast Algorithm to Tune the Lasso with Optimality Guarantees." Journal of Machine Learning Research, 2016.

Markdown

[Chichignoud et al. "A Practical Scheme and Fast Algorithm to Tune the Lasso with Optimality Guarantees." Journal of Machine Learning Research, 2016.](https://mlanthology.org/jmlr/2016/chichignoud2016jmlr-practical/)

BibTeX

@article{chichignoud2016jmlr-practical,
  title     = {{A Practical Scheme and Fast Algorithm to Tune the Lasso with Optimality Guarantees}},
  author    = {Chichignoud, Michael and Lederer, Johannes and Wainwright, Martin J.},
  journal   = {Journal of Machine Learning Research},
  year      = {2016},
  pages     = {1--20},
  volume    = {17},
  url       = {https://mlanthology.org/jmlr/2016/chichignoud2016jmlr-practical/}
}