Agnostic Online Learning

Abstract

We study the learnability of hypothesis classes in agnostic online prediction models. The analogous question in the PAC learning model [Valiant, 1984] was addressed by Haussler [1992] and others, who showed that the VC dimension characterization of the sample complexity of learnability extends to the agnostic (or "unrealizable") setting. In his influential work, Littlestone [1988] described a combinatorial characterization of hypothesis classes that are learnable in the online model. We extend Littlestone's results in two respects. First, while Littlestone dealt only with the realizable case, namely, assuming there exists a hypothesis in the class that perfectly explains the entire data, we derive results for the non-realizable (agnostic) case as well. In particular, we describe several models of non-realizable data and derive upper and lower bounds on the achievable regret. Second, we extend the theory to include margin-based hypothesis classes, in which the prediction of each hypothesis is accompanied by a confidence value. We demonstrate how the newly developed theory seamlessly yields novel online regret bounds for the important class of large margin linear separators.

Cite

Text

Ben-David et al. "Agnostic Online Learning." Annual Conference on Computational Learning Theory, 2009.

Markdown

[Ben-David et al. "Agnostic Online Learning." Annual Conference on Computational Learning Theory, 2009.](https://mlanthology.org/colt/2009/bendavid2009colt-agnostic/)

BibTeX

@inproceedings{bendavid2009colt-agnostic,
  title     = {{Agnostic Online Learning}},
  author    = {Ben-David, Shai and Pál, Dávid and Shalev-Shwartz, Shai},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2009},
  url       = {https://mlanthology.org/colt/2009/bendavid2009colt-agnostic/}
}