Space Lower Bounds for Linear Prediction in the Streaming Model

Abstract

We show that fundamental learning tasks, such as finding an approximate linear separator or performing linear regression, require memory at least *quadratic* in the dimension, in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem, finding approximate null vectors, and utilize estimates on packings of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.
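To make the Θ(d²) memory barrier concrete, here is a minimal sketch (not the paper's construction) of the obvious streaming baseline for the approximate null vector problem: accumulate the d × d Gram matrix of the stream and return the eigenvector of its smallest eigenvalue. The function name and toy data below are illustrative assumptions; the point is that this natural one-pass algorithm already uses Θ(d²) memory, which the paper shows is essentially unavoidable in this setting.

```python
import numpy as np

def approx_null_vector_stream(rows, d):
    """Generic one-pass baseline (illustrative, not the paper's method):
    maintain the d x d Gram matrix G = sum_i a_i a_i^T, then return the
    unit eigenvector of G with the smallest eigenvalue, i.e. the direction
    v minimizing sum_i <a_i, v>^2. The state G costs Theta(d^2) memory."""
    G = np.zeros((d, d))
    for a in rows:              # single pass over the stream
        G += np.outer(a, a)     # O(d^2) state, O(d^2) update per row
    eigvals, eigvecs = np.linalg.eigh(G)  # eigenvalues in ascending order
    return eigvecs[:, 0]        # eigenvector of the smallest eigenvalue

# Toy usage: rows confined to a (d-1)-dimensional subspace have an exact
# null vector, and the baseline recovers it from one pass.
d = 5
rng = np.random.default_rng(0)
v_true = rng.normal(size=d)
v_true /= np.linalg.norm(v_true)
rows = [a - (a @ v_true) * v_true for a in rng.normal(size=(100, d))]
v_hat = approx_null_vector_stream(rows, d)
print(abs(v_hat @ v_true))  # close to 1: null direction recovered
```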

Cite

Text

Dagan et al. "Space Lower Bounds for Linear Prediction in the Streaming Model." Conference on Learning Theory, 2019.

Markdown

[Dagan et al. "Space Lower Bounds for Linear Prediction in the Streaming Model." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/dagan2019colt-space/)

BibTeX

@inproceedings{dagan2019colt-space,
  title     = {{Space Lower Bounds for Linear Prediction in the Streaming Model}},
  author    = {Dagan, Yuval and Kur, Gil and Shamir, Ohad},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {929--954},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/dagan2019colt-space/}
}