Active Learning for Smooth Problems

Abstract

Recently it was shown that the true sample complexity of active learning is asymptotically better than that of passive learning. In many interesting cases, the improvement has been shown to be exponential [BHW08]; however, there are artificial examples in which the improvement is small. In this paper we provide a basis for exponential improvements in active learning. We show that exponential improvements arise when the underlying learning problem is "smooth," i.e., the hypothesis class, the instance space and the distribution can all be described by smooth functions. This provides a unified and simplified analysis for most known examples and significantly extends the class of learning problems that are "actively learnable at an exponential rate."
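The exponential improvement the abstract refers to can be illustrated with the classic one-dimensional threshold example (a standard textbook case, not code from the paper): to learn a threshold on [0, 1] to accuracy eps, passive learning needs on the order of 1/eps labeled samples, while an active learner can binary-search the decision boundary with only about log2(1/eps) label queries. The `label` oracle and function names below are hypothetical helpers for this sketch.

```python
def label(x, threshold):
    """Label oracle (hypothetical helper): true label of point x."""
    return 1 if x >= threshold else 0

def active_threshold_estimate(threshold, eps):
    """Binary-search the threshold to accuracy eps, counting label queries.

    Invariant: the true threshold stays in (lo, hi], and the interval
    halves with every query, so roughly log2(1/eps) queries suffice.
    """
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        queries += 1
        if label(mid, threshold) == 1:
            hi = mid  # threshold lies at or below mid
        else:
            lo = mid  # threshold lies above mid
    return (lo + hi) / 2, queries

estimate, n_queries = active_threshold_estimate(threshold=0.3141, eps=1e-6)
print(estimate, n_queries)  # 20 queries, vs. ~10^6 samples passively
```

The contrast between log2(1/eps) label queries here and the Omega(1/eps) labels required by any passive learner is exactly the kind of exponential-rate separation the paper generalizes to smooth problems.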

Cite

Text

Friedman. "Active Learning for Smooth Problems." Annual Conference on Computational Learning Theory, 2009.

Markdown

[Friedman. "Active Learning for Smooth Problems." Annual Conference on Computational Learning Theory, 2009.](https://mlanthology.org/colt/2009/friedman2009colt-active/)

BibTeX

@inproceedings{friedman2009colt-active,
  title     = {{Active Learning for Smooth Problems}},
  author    = {Friedman, Eric},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2009},
  url       = {https://mlanthology.org/colt/2009/friedman2009colt-active/}
}