Importance Weighted Active Learning
Abstract
We present a practical and statistically consistent scheme for actively learning binary classifiers under general loss functions. Our algorithm uses importance weighting to correct sampling bias, and by controlling the variance, we are able to give rigorous label complexity bounds for the learning process.
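The core idea in the abstract — label each point with some probability and reweight kept labels by the inverse of that probability so the empirical risk stays unbiased — can be sketched as follows. This is a minimal illustration of importance-weighted labeling, not the paper's exact rejection-threshold rule; `query_prob` and `request_label` are assumed callbacks supplied by the user.

```python
import random

def iwal_pass(stream, query_prob, request_label):
    """One pass over an unlabeled stream.

    Each point x is labeled with probability p = query_prob(x) (assumed
    to be in (0, 1]); labeled points are kept with importance weight 1/p,
    which corrects the sampling bias in expectation.
    """
    labeled = []
    n = 0
    for x in stream:
        n += 1
        p = query_prob(x)
        if random.random() < p:          # query the label oracle w.p. p
            y = request_label(x)
            labeled.append((x, y, 1.0 / p))
    return labeled, n

def iw_error(labeled, n, h):
    """Unbiased importance-weighted 0-1 error of hypothesis h,
    averaged over all n stream points (queried or not)."""
    return sum(w for x, y, w in labeled if h(x) != y) / n
```

With `query_prob` identically 1 this reduces to ordinary passive learning: every point is labeled with weight 1 and `iw_error` is the plain empirical error. Controlling how small the query probabilities may get is what bounds the variance of this estimator, which is the role of the variance control mentioned in the abstract.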
Cite
Text
Beygelzimer et al. "Importance Weighted Active Learning." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553381
Markdown
[Beygelzimer et al. "Importance Weighted Active Learning." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/beygelzimer2009icml-importance/) doi:10.1145/1553374.1553381
BibTeX
@inproceedings{beygelzimer2009icml-importance,
title = {{Importance Weighted Active Learning}},
author = {Beygelzimer, Alina and Dasgupta, Sanjoy and Langford, John},
booktitle = {International Conference on Machine Learning},
year = {2009},
pages = {49--56},
doi = {10.1145/1553374.1553381},
url = {https://mlanthology.org/icml/2009/beygelzimer2009icml-importance/}
}