Minimax Rates for Memory-Bounded Sparse Linear Regression
Abstract
We establish a minimax lower bound of Ω(kd/(Bε)) on the sample size needed to estimate the parameters in a k-sparse linear regression of dimension d under memory restrictions to B bits, where ε is the ℓ₂ parameter error. When the covariance of the regressors is the identity matrix, we also provide an algorithm that uses Õ(B + k) bits and requires Õ(kd/(Bε²)) observations to achieve error ε. Our lower bound also holds in the more general communication-bounded setting, where instead of a memory bound, at most B bits of information are allowed to be (adaptively) communicated about each sample.
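To set the two rates side by side, the abstract's bounds can be restated as follows (this is only a clean rewriting of the quantities above, not an additional result from the paper):

```latex
% Sample-size bounds from the abstract, for k-sparse linear regression in
% dimension d with memory/communication budget B and \ell_2 error \varepsilon.
\[
  n \;=\; \Omega\!\left(\frac{kd}{B\varepsilon}\right)
  \qquad \text{(lower bound, any $B$-bit algorithm)}
\]
\[
  n \;=\; \tilde{O}\!\left(\frac{kd}{B\varepsilon^{2}}\right)
  \qquad \text{(upper bound, identity covariance, using $\tilde{O}(B+k)$ bits)}
\]
```

The two rates share the kd/B scaling; the remaining gap, visible directly from the formulas, is the 1/ε versus 1/ε² dependence, up to logarithmic factors.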
Cite
Steinhardt and Duchi. "Minimax Rates for Memory-Bounded Sparse Linear Regression." Annual Conference on Computational Learning Theory, 2015.

BibTeX
@inproceedings{steinhardt2015colt-minimax,
title = {{Minimax Rates for Memory-Bounded Sparse Linear Regression}},
author = {Steinhardt, Jacob and Duchi, John C.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2015},
pages = {1564-1587},
url = {https://mlanthology.org/colt/2015/steinhardt2015colt-minimax/}
}