Open Problem: Efficient Online Sparse Regression
Abstract
In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
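The setting in the abstract can be made concrete with a small sketch: at each round the learner may observe only k of the d features of the example, predicts the label, and suffers squared loss. The baseline strategy below (a fixed feature subset with online gradient descent) and all names are illustrative assumptions for exposition, not the algorithm asked about in the open problem.

```python
import numpy as np

def online_sparse_regression(X, y, k, lr=0.1):
    """Naive baseline for the online sparse regression protocol:
    fix the first k features and run online gradient descent on them,
    returning the cumulative squared loss over all rounds.
    (Hypothetical illustration; the open problem asks for efficient
    algorithms with sublinear regret, which this does not claim.)"""
    T, d = X.shape
    S = np.arange(k)                 # fixed feature subset (an assumption)
    w = np.zeros(k)
    total_loss = 0.0
    for t in range(T):
        x_obs = X[t, S]              # learner sees only k of the d features
        y_hat = float(w @ x_obs)     # predict with current sparse weights
        err = y_hat - y[t]
        total_loss += err ** 2       # squared loss on this round
        w -= lr * 2 * err * x_obs    # OGD step on the squared loss
    return total_loss

rng = np.random.default_rng(0)
T, d, k = 200, 10, 3
X = rng.standard_normal((T, d))
y = X[:, 0] - 2 * X[:, 1]            # labels given by a 2-sparse predictor
loss = online_sparse_regression(X, y, k)
print(loss)
```

When the relevant features happen to lie in the chosen subset, as in this toy run, the cumulative loss is far below that of the always-zero predictor; the difficulty the open problem highlights is doing well against the best k-sparse predictor without knowing the right subset, efficiently.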
Cite
Kale. "Open Problem: Efficient Online Sparse Regression." Annual Conference on Computational Learning Theory, 2014.

BibTeX
@inproceedings{kale2014colt-open,
title = {{Open Problem: Efficient Online Sparse Regression}},
author = {Kale, Satyen},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2014},
pages = {1299--1301},
url = {https://mlanthology.org/colt/2014/kale2014colt-open/}
}