Learnability and Stability in the General Learning Setting

Abstract

We establish that stability is both necessary and sufficient for learnability, even in the General Learning Setting, where uniform convergence is not necessary for learning and where learning might only be possible with a non-ERM learning rule. This goes beyond previous work on the relationship between stability and learnability, which focused on supervised classification and regression, where learnability is equivalent to uniform convergence and it suffices to consider ERM learning rules.

Cite

Text

Shalev-Shwartz et al. "Learnability and Stability in the General Learning Setting." Annual Conference on Computational Learning Theory, 2009.

Markdown

[Shalev-Shwartz et al. "Learnability and Stability in the General Learning Setting." Annual Conference on Computational Learning Theory, 2009.](https://mlanthology.org/colt/2009/shalevshwartz2009colt-learnability/)

BibTeX

@inproceedings{shalevshwartz2009colt-learnability,
  title     = {{Learnability and Stability in the General Learning Setting}},
  author    = {Shalev-Shwartz, Shai and Shamir, Ohad and Srebro, Nathan and Sridharan, Karthik},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2009},
  url       = {https://mlanthology.org/colt/2009/shalevshwartz2009colt-learnability/}
}