Regression-Tree Tuning in a Streaming Setting
Abstract
We consider the problem of maintaining the data structures of a partition-based regression procedure in a setting where the training data arrives sequentially over time. We prove that it is possible to maintain such a structure in time $O(\log n)$ at any time step $n$ while achieving a nearly-optimal regression rate of $\tilde{O}(n^{-2/(2+d)})$ in terms of the unknown metric dimension $d$. Finally, we prove a new regression lower bound which is independent of a given data size, and hence is more appropriate for the streaming setting.
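To make the setting concrete, the following is a minimal, self-contained sketch of a streaming partition-based regressor: an axis-aligned binary partition of $[0,1]^d$ is refined as labeled points arrive, each insertion costs time proportional to the tree depth, and a query is answered with the mean label of the deepest nonempty cell containing it. This is an illustrative toy, not the authors' algorithm; the split rule (fixed leaf capacity, cycling coordinates) and all class and parameter names are assumptions made here for exposition.

```python
import random


class Node:
    """A cell of an axis-aligned binary partition of [0,1]^d."""

    def __init__(self, lo, hi, depth=0):
        self.lo, self.hi = lo, hi       # per-coordinate cell bounds
        self.depth = depth
        self.points = []                # (x, y) pairs stored while a leaf
        self.sum_y, self.count = 0.0, 0
        self.left = self.right = None   # children after a split

    def is_leaf(self):
        return self.left is None


class StreamingPartitionRegressor:
    """Toy streaming regressor: grow a partition tree as (x, y) pairs
    arrive; predict with the mean label of the query's cell.
    Illustrative only -- the capacity-based split rule is an assumption,
    not the procedure analyzed in the paper."""

    def __init__(self, dim, leaf_capacity=8):
        self.dim = dim
        self.leaf_capacity = leaf_capacity
        self.root = Node([0.0] * dim, [1.0] * dim)

    def _child(self, node, x):
        axis = node.depth % self.dim    # cycle through coordinates
        mid = 0.5 * (node.lo[axis] + node.hi[axis])
        return node.left if x[axis] < mid else node.right

    def _split(self, node):
        axis = node.depth % self.dim
        mid = 0.5 * (node.lo[axis] + node.hi[axis])
        lhi, rlo = list(node.hi), list(node.lo)
        lhi[axis], rlo[axis] = mid, mid
        node.left = Node(node.lo, lhi, node.depth + 1)
        node.right = Node(rlo, node.hi, node.depth + 1)
        for px, py in node.points:      # redistribute stored points
            child = self._child(node, px)
            child.points.append((px, py))
            child.sum_y += py
            child.count += 1
        node.points = []

    def insert(self, x, y):
        """Route the new point to its leaf (cost = tree depth) and
        split the leaf once it exceeds its capacity."""
        node = self.root
        while not node.is_leaf():
            node.sum_y += y
            node.count += 1
            node = self._child(node, x)
        node.points.append((x, y))
        node.sum_y += y
        node.count += 1
        if len(node.points) > self.leaf_capacity:
            self._split(node)

    def predict(self, x):
        """Mean label of the deepest nonempty cell containing x."""
        node, best = self.root, None
        while node is not None:
            if node.count > 0:
                best = node
            node = None if node.is_leaf() else self._child(node, x)
        return best.sum_y / best.count if best else 0.0
```

Under a uniform stream the tree stays balanced, so each insertion touches $O(\log n)$ nodes at step $n$; the harder questions the paper addresses, such as adapting the cell diameter to the unknown intrinsic dimension $d$, are not captured by this fixed split rule.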
Cite
Text
Kpotufe and Orabona. "Regression-Tree Tuning in a Streaming Setting." Neural Information Processing Systems, 2013.
Markdown
[Kpotufe and Orabona. "Regression-Tree Tuning in a Streaming Setting." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/kpotufe2013neurips-regressiontree/)
BibTeX
@inproceedings{kpotufe2013neurips-regressiontree,
title = {{Regression-Tree Tuning in a Streaming Setting}},
author = {Kpotufe, Samory and Orabona, Francesco},
booktitle = {Neural Information Processing Systems},
year = {2013},
pages = {1788-1796},
url = {https://mlanthology.org/neurips/2013/kpotufe2013neurips-regressiontree/}
}