Nonparametric Online Regression While Learning the Metric
Abstract
We study algorithms for online nonparametric regression that learn the directions along which the regression function is smoother. Our algorithm learns the Mahalanobis metric based on the gradient outer product matrix $\boldsymbol{G}$ of the regression function (automatically adapting to the effective rank of this matrix), while simultaneously bounding the regret ---on the same data sequence--- in terms of the spectrum of $\boldsymbol{G}$. As a preliminary step in our analysis, we extend a nonparametric online learning algorithm by Hazan and Megiddo enabling it to compete against functions whose Lipschitzness is measured with respect to an arbitrary Mahalanobis metric.
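For context, a standard way to write the two objects named in the abstract is sketched below: the gradient outer product matrix of a differentiable regression function $f$ (expectation taken over the instance distribution) and the Mahalanobis distance induced by a positive semidefinite matrix $\boldsymbol{M}$. These are the textbook definitions; the exact normalization and estimator used in the paper may differ.

$$\boldsymbol{G} \;=\; \mathbb{E}\big[\nabla f(X)\,\nabla f(X)^{\top}\big], \qquad \rho_{\boldsymbol{M}}(x, x') \;=\; \sqrt{(x - x')^{\top}\,\boldsymbol{M}\,(x - x')}.$$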
Cite
Text
Kuzborskij and Cesa-Bianchi. "Nonparametric Online Regression While Learning the Metric." Neural Information Processing Systems, 2017.
Markdown
[Kuzborskij and Cesa-Bianchi. "Nonparametric Online Regression While Learning the Metric." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/kuzborskij2017neurips-nonparametric/)
BibTeX
@inproceedings{kuzborskij2017neurips-nonparametric,
  title     = {{Nonparametric Online Regression While Learning the Metric}},
  author    = {Kuzborskij, Ilja and Cesa-Bianchi, Nicolò},
  booktitle = {Neural Information Processing Systems},
  year      = {2017},
  pages     = {667-676},
  url       = {https://mlanthology.org/neurips/2017/kuzborskij2017neurips-nonparametric/}
}