Memory, Communication, and Statistical Queries
Abstract
If a concept class can be represented with a certain amount of memory, can it be efficiently learned with the same amount of memory? What concepts can be efficiently learned by algorithms that extract only a few bits of information from each example? We introduce a formal framework for studying these questions, and investigate the relationship between the fundamental resources of memory or communication and the sample complexity of the learning task. We relate our memory-bounded and communication-bounded learning models to the well-studied statistical query model. This connection can be leveraged to obtain both upper and lower bounds: we show strong lower bounds on learning parity functions with bounded communication, as well as upper bounds on solving sparse linear regression problems with limited memory.
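The statistical query (SQ) connection mentioned in the abstract can be made concrete with a small sketch. The Python snippet below is an illustrative assumption, not code from the paper: an SQ learner never sees raw examples, but instead submits a bounded query q(x, y) ∈ [-1, 1] and receives its expectation up to a tolerance τ, so each example contributes only a few bits of information. The parity target and the tolerance handling here are simplified for illustration.

```python
# Minimal sketch of a statistical query (SQ) oracle, assuming a finite
# sample and a query q: (x, y) -> [-1, 1]. Names and structure are
# illustrative, not the paper's notation.
import random


def sq_oracle(examples, query, tau):
    """Answer a statistical query with the empirical mean of `query`
    over the sample, perturbed by noise of magnitude at most tau."""
    avg = sum(query(x, y) for x, y in examples) / len(examples)
    return avg + random.uniform(-tau, tau)


def make_parity_examples(n, d=8, bits=(0, 3)):
    """Toy target: y is the parity of two fixed coordinates of x.
    Parities are the class for which the abstract states
    communication lower bounds."""
    examples = []
    for _ in range(n):
        x = [random.randint(0, 1) for _ in range(d)]
        y = (x[bits[0]] + x[bits[1]]) % 2
        examples.append((x, y))
    return examples


if __name__ == "__main__":
    data = make_parity_examples(10_000)
    # Query the correlation of coordinate 0 with the label, both
    # mapped from {0, 1} to {-1, +1}.
    corr = sq_oracle(
        data,
        lambda x, y: (2 * x[0] - 1) * (2 * y - 1),
        tau=0.01,
    )
    print(f"estimated correlation of bit 0 with label: {corr:+.3f}")
```

Running this prints a correlation near zero: every single-coordinate query is uninformative about a parity target, which is the classical intuition behind SQ (and, via the paper's reduction, communication-bounded) lower bounds for learning parities.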
Cite
Text
Steinhardt et al. "Memory, Communication, and Statistical Queries." Annual Conference on Computational Learning Theory, 2016.
Markdown
[Steinhardt et al. "Memory, Communication, and Statistical Queries." Annual Conference on Computational Learning Theory, 2016.](https://mlanthology.org/colt/2016/steinhardt2016colt-memory/)
BibTeX
@inproceedings{steinhardt2016colt-memory,
  title     = {{Memory, Communication, and Statistical Queries}},
  author    = {Steinhardt, Jacob and Valiant, Gregory and Wager, Stefan},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2016},
  pages     = {1490--1516},
  url       = {https://mlanthology.org/colt/2016/steinhardt2016colt-memory/}
}