An Interior-Point Method for Large-Scale L1-Regularized Logistic Regression
Abstract
Logistic regression with l1 regularization has been proposed as a promising method for feature selection in classification problems. In this paper we describe an efficient interior-point method for solving large-scale l1-regularized logistic regression problems. Small problems, with up to a thousand or so features and examples, can be solved in seconds on a PC; medium-sized problems, with tens of thousands of features and examples, can be solved in tens of seconds (assuming some sparsity in the data). A variation on the basic method that uses a preconditioned conjugate gradient method to compute the search step can solve very large problems, with a million features and examples (e.g., the 20 Newsgroups data set), in a few minutes on a PC. Using warm-start techniques, a good approximation of the entire regularization path can be computed much more efficiently than by solving a family of problems independently.
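To make the problem concrete, the following is a minimal sketch of l1-regularized logistic regression solved by proximal gradient descent (ISTA), including a warm-started sweep over decreasing regularization strengths in the spirit of the abstract's regularization-path remark. This is not the paper's interior-point method; the function name, data, and parameter choices are illustrative assumptions.

```python
# Sketch: l1-regularized logistic regression via proximal gradient (ISTA).
# NOT the paper's interior-point method -- just an illustration of the problem.
import numpy as np

def l1_logreg_ista(X, y, lam, w0=None, iters=500):
    """Minimize sum_i log(1 + exp(-y_i * x_i^T w)) + lam * ||w||_1, y_i in {-1,+1}."""
    n, d = X.shape
    w = np.zeros(d) if w0 is None else w0.copy()
    # Step size 1/L, where L = ||X||_2^2 / 4 bounds the logistic-loss curvature.
    step = 4.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(iters):
        z = y * (X @ w)
        grad = -(X.T @ (y / (1.0 + np.exp(z))))  # gradient of the logistic loss
        w = w - step * grad
        # Soft thresholding: the proximal operator of the l1 penalty.
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))        # 200 examples, 50 features
w_true = np.zeros(50)
w_true[:5] = 2.0                          # only 5 features are relevant
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))

# Warm-started regularization path: reuse the previous solution as the
# starting point while decreasing lam, rather than solving each from scratch.
w = None
for lam in [40.0, 20.0, 10.0]:
    w = l1_logreg_ista(X, y, lam, w0=w)
print(np.count_nonzero(w))                # the l1 penalty zeros out many weights
```

The warm-start loop mirrors the abstract's point: each solve starts near the previous solution, so later, less-regularized problems converge with far less work than independent solves.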
Cite

Koh, Kwangmoo, Seung-Jean Kim, and Stephen Boyd. "An Interior-Point Method for Large-Scale L1-Regularized Logistic Regression." Journal of Machine Learning Research, 2007.

BibTeX
@article{koh2007jmlr-interiorpoint,
title = {{An Interior-Point Method for Large-Scale L1-Regularized Logistic Regression}},
author = {Koh, Kwangmoo and Kim, Seung-Jean and Boyd, Stephen},
journal = {Journal of Machine Learning Research},
year = {2007},
pages = {1519--1555},
volume = {8},
url = {https://mlanthology.org/jmlr/2007/koh2007jmlr-interiorpoint/}
}