Parameter-Free Online Learning via Model Selection
Abstract
We introduce an efficient algorithmic framework for model selection in online learning, also known as parameter-free online learning. Departing from previous work, which has focused on highly structured function classes such as nested balls in Hilbert space, we propose a generic meta-algorithm framework that achieves online model selection oracle inequalities under minimal structural assumptions. We give the first computationally efficient parameter-free algorithms that work in arbitrary Banach spaces under mild smoothness assumptions; previous results applied only to Hilbert spaces. We further derive new oracle inequalities for matrix classes, non-nested convex sets, and $\mathbb{R}^{d}$ with generic regularizers. Finally, we generalize these results by providing oracle inequalities for arbitrary non-linear classes in the online supervised learning model. These results are all derived through a unified meta-algorithm scheme using a novel "multi-scale" algorithm for prediction with expert advice based on random playout, which may be of independent interest.
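For readers scanning the abstract, it may help to recall the generic shape of an online model selection oracle inequality. The display below is a rough sketch under assumed notation (a nested sequence of classes $\mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots$, losses $\ell_t$, predictions $\hat{f}_t$, and a benchmark regret rate $\mathrm{Reg}_T(\mathcal{F}_i)$), not the paper's exact statement:

$$\sum_{t=1}^{T} \ell_t(\hat{f}_t) \;-\; \inf_{f \in \mathcal{F}_i} \sum_{t=1}^{T} \ell_t(f) \;\le\; \mathrm{Reg}_T(\mathcal{F}_i) + O\!\left(\sqrt{T \log i}\right) \quad \text{simultaneously for all } i,$$

i.e., the learner competes with every class $\mathcal{F}_i$ at once, paying only a modest penalty over the regret achievable if the best index $i$ were known in advance.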
Cite
Text
Foster et al. "Parameter-Free Online Learning via Model Selection." Neural Information Processing Systems, 2017.

Markdown

[Foster et al. "Parameter-Free Online Learning via Model Selection." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/foster2017neurips-parameterfree/)

BibTeX
@inproceedings{foster2017neurips-parameterfree,
title = {{Parameter-Free Online Learning via Model Selection}},
author = {Foster, Dylan J and Kale, Satyen and Mohri, Mehryar and Sridharan, Karthik},
booktitle = {Neural Information Processing Systems},
year = {2017},
pages = {6020--6030},
url = {https://mlanthology.org/neurips/2017/foster2017neurips-parameterfree/}
}