Active Covering

Abstract

We analyze the problem of active covering, where the learner is given an unlabeled dataset and can sequentially query the labels of examples. The objective is to label all of the positive examples using the fewest total label queries. We show under standard non-parametric assumptions that a classical support estimator can be repurposed as an offline algorithm attaining an excess query cost of $\widetilde{\Theta}(n^{D/(D+1)})$ compared to the optimal learner, where $n$ is the number of datapoints and $D$ is the dimension. We then provide a simple active learning method that attains an improved excess query cost of $\widetilde{O}(n^{(D-1)/D})$ (an improvement since $(D-1)/D < D/(D+1)$ for every $D \geq 1$, as $(D-1)(D+1) = D^2 - 1 < D^2$). Furthermore, the proposed algorithms only require access to the positively labeled examples, which in certain settings provides additional computational and privacy benefits. Finally, we show that the active learning method consistently outperforms offline methods, as well as a variety of baselines, on a wide range of benchmark image-based datasets.
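The abstract describes the active method only at a high level. For intuition, below is a minimal sketch in Python of one plausible active covering loop, in which label queries expand outward from confirmed positives via nearest neighbours. The oracle callable, the seed index, and the k-NN expansion rule are illustrative assumptions, not the paper's algorithm; a method with the stated guarantees would also need a stopping rule that covers positives disconnected from the seed.

# A minimal sketch (not the paper's algorithm) of an active covering loop:
# expand label queries outward from confirmed positives via nearest
# neighbours. The oracle and the k-NN expansion rule are illustrative
# assumptions.
import numpy as np

def active_cover(X, oracle, seed_idx, k=10):
    """Greedy nearest-neighbour expansion from one known positive.

    X        : (n, D) array of unlabeled points.
    oracle   : callable idx -> bool; one label query per call.
    seed_idx : index of a known positive example to start from.
    k        : unqueried nearest neighbours to enqueue per positive.
    Returns the indices found positive and the number of queries spent.
    """
    n = len(X)
    queried = np.zeros(n, dtype=bool)
    positives, frontier = [], [seed_idx]
    while frontier:
        i = frontier.pop()
        if queried[i]:
            continue
        queried[i] = True
        if oracle(i):  # spend one label query
            positives.append(i)
            # Covering step: enqueue the k nearest unqueried neighbours of
            # a confirmed positive, keeping queries near the positive region.
            d = np.linalg.norm(X - X[i], axis=1)
            d[queried] = np.inf
            frontier.extend(np.argsort(d)[:k].tolist())
    # Note: positives disconnected from the seed would require a fallback
    # sweep over the remaining unqueried points; omitted in this sketch.
    return positives, int(queried.sum())

Note that the loop stores and expands from only the positive examples, mirroring the positives-only access property mentioned in the abstract.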

Cite

Text

Jiang and Rostamizadeh. "Active Covering." International Conference on Machine Learning, 2021.

Markdown

[Jiang and Rostamizadeh. "Active Covering." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/jiang2021icml-active/)

BibTeX

@inproceedings{jiang2021icml-active,
  title     = {{Active Covering}},
  author    = {Jiang, Heinrich and Rostamizadeh, Afshin},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {5013--5022},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/jiang2021icml-active/}
}