Online Linearized LASSO
Abstract
Sparse regression has been a popular approach for performing variable selection and enhancing the prediction accuracy and interpretability of the resulting statistical model. Existing approaches focus on offline regularized regression, while the online scenario has rarely been studied. In this paper, we propose a novel online sparse linear regression framework for analyzing streaming data in which data points arrive sequentially. Our proposed method is memory efficient and requires a less stringent restricted strong convexity assumption. Theoretically, we show that with a properly chosen regularization parameter, the $\ell_2$-error of our estimator decays to zero at the optimal order of $\tilde{\mathcal{O}}(\frac{s}{\sqrt{t}})$, where $s$ is the sparsity level, $t$ is the streaming sample size, and $\tilde{\mathcal{O}}(\cdot)$ hides logarithmic factors. Numerical experiments demonstrate the practical efficiency of our algorithm.
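For intuition only, below is a minimal sketch of a generic online soft-thresholded (stochastic proximal gradient) update for sparse regression on streaming data. This is not the paper's Online Linearized LASSO algorithm; the function names and the step-size and regularization schedules (`eta0`, `lam0`, the $\sqrt{\log p / t}$ decay) are illustrative assumptions, included only to show how a single-pass, $O(p)$-memory sparse update on a stream can look.

```python
import numpy as np

def soft_threshold(z, tau):
    # Elementwise soft-thresholding: sign(z) * max(|z| - tau, 0).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def online_sparse_update(stream, p, eta0=1.0, lam0=1.0):
    # One pass over the stream using O(p) memory: a stochastic gradient step
    # on the squared loss, followed by soft-thresholding (an l1 proximal step).
    # The schedules below are assumptions for illustration, not the paper's.
    beta = np.zeros(p)
    for t, (x, y) in enumerate(stream, start=1):
        eta = eta0 / np.sqrt(t)               # decaying step size (assumed)
        lam = lam0 * np.sqrt(np.log(p) / t)   # lambda_t ~ sqrt(log p / t) (assumed)
        grad = (x @ beta - y) * x             # stochastic gradient of 0.5*(y - x^T beta)^2
        beta = soft_threshold(beta - eta * grad, eta * lam)
    return beta

# Toy usage: s-sparse ground truth, Gaussian design, noisy responses.
rng = np.random.default_rng(0)
p, s, n = 100, 5, 20000
beta_star = np.zeros(p)
beta_star[:s] = 1.0
stream = ((x, x @ beta_star + 0.1 * rng.standard_normal())
          for x in (rng.standard_normal(p) for _ in range(n)))
beta_hat = online_sparse_update(stream, p)
print("l2 error:", np.linalg.norm(beta_hat - beta_star))
```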
Cite
Text
Yang et al. "Online Linearized LASSO." Artificial Intelligence and Statistics, 2023.Markdown
[Yang et al. "Online Linearized LASSO." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/yang2023aistats-online/)BibTeX
@inproceedings{yang2023aistats-online,
  title     = {{Online Linearized LASSO}},
  author    = {Yang, Shuoguang and Yan, Yuhao and Zhu, Xiuneng and Sun, Qiang},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2023},
  pages     = {7594--7610},
  volume    = {206},
  url       = {https://mlanthology.org/aistats/2023/yang2023aistats-online/}
}