Heteroscedastic Sequences: Beyond Gaussianity
Abstract
We address the problem of sequential prediction in the heteroscedastic setting, where both the signal and its variance are assumed to depend on explanatory variables. By applying regret minimization techniques, we devise an efficient online learning algorithm for the problem, without assuming that the error terms comply with a specific distribution. We show that our algorithm can be adjusted to provide confidence bounds for its predictions, and provide an application to ARCH models. The theoretical results are corroborated by an empirical study.
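To make the setting concrete, the sketch below simulates a sequence whose conditional mean and variance both depend on explanatory variables, and fits both online with plain gradient descent on a Gaussian negative log-likelihood surrogate. This is only an illustrative assumption on our part, not the paper's algorithm: the paper relies on regret minimization and explicitly avoids a Gaussian error assumption, and the variable names (w, v, eta) are hypothetical.

```python
import numpy as np

# Hypothetical illustration of the heteroscedastic setting: at round t we observe
# explanatory variables x_t, predict the signal <w, x_t> and the variance
# exp(<v, x_t>), then update w and v by online gradient descent on the Gaussian
# negative log-likelihood. This is a generic OGD sketch, NOT the paper's
# regret-minimization algorithm or its confidence-bound construction.

rng = np.random.default_rng(0)
d, T, eta = 3, 2000, 0.05

w_true = np.array([1.0, -0.5, 0.3])   # true signal coefficients (synthetic)
v_true = np.array([0.2, 0.4, -0.1])   # true log-variance coefficients (synthetic)

w, v = np.zeros(d), np.zeros(d)       # online estimates
for t in range(T):
    x = rng.normal(size=d)
    sigma = np.exp(0.5 * x @ v_true)           # heteroscedastic noise scale
    y = x @ w_true + sigma * rng.normal()      # observed outcome

    mu_hat = x @ w                             # predicted signal
    var_hat = np.exp(x @ v)                    # predicted variance

    # Gradients of the per-round loss 0.5*log(var) + 0.5*(y - mu)^2 / var
    resid = y - mu_hat
    grad_w = -resid / var_hat * x
    grad_v = 0.5 * (1.0 - resid**2 / var_hat) * x

    w -= eta * grad_w
    v -= eta * grad_v

print("estimated signal coefficients:", np.round(w, 2))
print("estimated log-variance coefficients:", np.round(v, 2))
```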
Cite
Text
Anava and Mannor. "Heteroscedastic Sequences: Beyond Gaussianity." International Conference on Machine Learning, 2016.

Markdown

[Anava and Mannor. "Heteroscedastic Sequences: Beyond Gaussianity." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/anava2016icml-heteroscedastic/)

BibTeX
@inproceedings{anava2016icml-heteroscedastic,
  title = {{Heteroscedastic Sequences: Beyond Gaussianity}},
  author = {Anava, Oren and Mannor, Shie},
  booktitle = {International Conference on Machine Learning},
  year = {2016},
  pages = {755--763},
  volume = {48},
  url = {https://mlanthology.org/icml/2016/anava2016icml-heteroscedastic/}
}