Shifting Regret, Mirror Descent, and Matrices

Abstract

We consider the problem of online prediction in changing environments. In this framework the performance of a predictor is evaluated as the loss relative to an arbitrarily changing predictor, whose individual components come from a base class of predictors. Typical results in the literature consider different base classes (experts, linear predictors on the simplex, etc.) separately. Introducing an arbitrary mapping inside the mirror descent algorithm, we provide a framework that unifies and extends existing results. As an example, we prove new shifting regret bounds for matrix prediction problems.
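To illustrate the kind of algorithm the paper generalizes, the sketch below shows entropic mirror descent (exponentiated gradient) on the probability simplex combined with a fixed-share mixing step, a classical device for shifting regret due to Herbster and Warmuth. This is a standard instance for intuition only, not the paper's generalized mapping; the step size `eta` and mixing rate `alpha` are illustrative choices.

```python
import numpy as np

def fixed_share_eg(losses, eta=0.5, alpha=0.05):
    """Exponentiated gradient with fixed-share mixing on the simplex.

    A classical shifting-regret scheme, shown as a simple illustration
    (not the paper's algorithm). ``losses`` is a (T, n) array of linear
    loss vectors; ``alpha=0`` recovers plain exponentiated gradient.
    """
    n = losses.shape[1]
    w = np.full(n, 1.0 / n)                # start at the uniform distribution
    total = 0.0
    for loss_t in losses:
        total += float(w @ loss_t)         # incur the linear loss of the mixture
        w = w * np.exp(-eta * loss_t)      # entropic mirror-descent step
        w /= w.sum()                       # normalize back onto the simplex
        w = (1 - alpha) * w + alpha / n    # fixed-share mixing toward uniform
    return total, w

# A shifting environment: expert 0 is best for 50 rounds, then expert 1.
losses = np.vstack([np.tile([0.0, 1.0], (50, 1)),
                    np.tile([1.0, 0.0], (50, 1))])
total_fs, w_fs = fixed_share_eg(losses, alpha=0.05)
total_eg, w_eg = fixed_share_eg(losses, alpha=0.0)
```

On this sequence the fixed-share variant incurs much less total loss than plain exponentiated gradient, because the mixing step keeps every expert's weight bounded away from zero and so lets the algorithm recover quickly after the shift.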

Cite

Text

Gyorgy and Szepesvari. "Shifting Regret, Mirror Descent, and Matrices." International Conference on Machine Learning, 2016.

Markdown

[Gyorgy and Szepesvari. "Shifting Regret, Mirror Descent, and Matrices." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/gyorgy2016icml-shifting/)

BibTeX

@inproceedings{gyorgy2016icml-shifting,
  title     = {{Shifting Regret, Mirror Descent, and Matrices}},
  author    = {Gyorgy, Andras and Szepesvari, Csaba},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {2943--2951},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/gyorgy2016icml-shifting/}
}