Efficient Sublinear-Regret Algorithms for Online Sparse Linear Regression with Limited Observation

Abstract

Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, under the resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that no polynomial-time sublinear-regret algorithm exists unless NP$\subseteq$BPP, and only an exponential-time sublinear-regret algorithm is known. In this paper, we introduce mild assumptions under which the problem becomes tractable. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.
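The protocol summarized in the abstract can be sketched as follows. This is a hypothetical illustration of the *setting* only (sequential examples, at most `k` of `d` features observable per round, squared prediction loss), not the paper's algorithm; the uniform-exploration learner, the function `run_protocol`, and all parameter names are assumptions introduced here for clarity.

```python
import numpy as np

# Sketch of the online sparse linear regression protocol with limited
# observation: each round, the learner picks at most k of the d feature
# coordinates to observe, predicts the label from those coordinates, and
# incurs squared loss. The naive learner below (uniform random feature
# choice + running least-squares estimate) is illustrative only; it is
# NOT the sublinear-regret algorithm from the paper.
rng = np.random.default_rng(0)

def run_protocol(T=200, d=10, k=3):
    theta = np.zeros(d)
    theta[:k] = 1.0                      # hypothetical sparse ground truth
    A = np.eye(d)                        # regularized second-moment matrix
    b = np.zeros(d)                      # accumulated feature-label products
    total_loss = 0.0
    for _ in range(T):
        x = rng.standard_normal(d)
        y = theta @ x + 0.1 * rng.standard_normal()
        obs = rng.choice(d, size=k, replace=False)  # limited observation
        est = np.linalg.solve(A, b)       # current weight estimate
        y_hat = est[obs] @ x[obs]         # predict from observed features only
        total_loss += (y - y_hat) ** 2
        # update statistics using only the observed coordinates
        A[np.ix_(obs, obs)] += np.outer(x[obs], x[obs])
        b[obs] += y * x[obs]
    return total_loss / T

print(run_protocol())
```

A sublinear-regret algorithm must drive the gap between its cumulative loss and that of the best `k`-sparse predictor to grow slower than `T`; the naive learner above generally does not achieve this, which is what makes the problem hard.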

Cite

Text

Ito et al. "Efficient Sublinear-Regret Algorithms for Online Sparse Linear Regression with Limited Observation." Neural Information Processing Systems, 2017.

Markdown

[Ito et al. "Efficient Sublinear-Regret Algorithms for Online Sparse Linear Regression with Limited Observation." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/ito2017neurips-efficient/)

BibTeX

@inproceedings{ito2017neurips-efficient,
  title     = {{Efficient Sublinear-Regret Algorithms for Online Sparse Linear Regression with Limited Observation}},
  author    = {Ito, Shinji and Hatano, Daisuke and Sumita, Hanna and Yabe, Akihiro and Fukunaga, Takuro and Kakimura, Naonori and Kawarabayashi, Ken-Ichi},
  booktitle = {Neural Information Processing Systems},
  year      = {2017},
  pages     = {4099--4108},
  url       = {https://mlanthology.org/neurips/2017/ito2017neurips-efficient/}
}