Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning
Abstract
We examine a simple stochastic strategy for adapting well-known single-point acquisition functions to allow batch active learning. Unlike acquiring the top-K points from the pool set, score- or rank-based sampling takes into account that acquisition scores change as new data are acquired. This simple strategy for adapting standard single-sample acquisition strategies can perform just as well as compute-intensive state-of-the-art batch acquisition functions, like BatchBALD or BADGE, while using orders of magnitude less compute. In addition to providing a practical option for machine learning practitioners, the surprising success of the proposed method in a wide range of experimental settings raises a difficult question for the field: when are these expensive batch acquisition methods pulling their weight?
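As a rough illustration of the idea (a minimal sketch, not the authors' reference implementation), score-based stochastic acquisition can be realized by perturbing acquisition scores with Gumbel noise and taking the top-K of the perturbed scores, which is equivalent to sampling K pool points without replacement with probabilities proportional to a softmax of the scores. The function names and the temperature parameter below are illustrative assumptions:

```python
import numpy as np

def stochastic_batch_acquisition(scores, batch_size, temperature=1.0, rng=None):
    """Sample an acquisition batch by perturbing scores with Gumbel noise.

    Taking the top-K of (scores / temperature + Gumbel noise) samples K
    indices without replacement with probabilities proportional to
    softmax(scores / temperature) -- the Gumbel-top-k trick.
    """
    rng = np.random.default_rng(rng)
    scores = np.asarray(scores, dtype=float)
    perturbed = scores / temperature + rng.gumbel(size=scores.shape)
    # Indices of the K largest perturbed scores, highest first.
    return np.argsort(perturbed)[-batch_size:][::-1]

def top_k_acquisition(scores, batch_size):
    """Deterministic top-K baseline: ignores that scores drift as data are acquired."""
    return np.argsort(np.asarray(scores))[-batch_size:][::-1]
```

In contrast to the deterministic top-K baseline, repeated calls with different random seeds yield diverse batches, which is what compensates for acquisition scores becoming stale as new points are added.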
Cite
Text

Kirsch et al. "Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning." Transactions on Machine Learning Research, 2023.

Markdown

[Kirsch et al. "Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/kirsch2023tmlr-stochastic/)

BibTeX
@article{kirsch2023tmlr-stochastic,
title = {{Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning}},
author = {Kirsch, Andreas and Farquhar, Sebastian and Atighehchian, Parmida and Jesson, Andrew and Branchaud-Charron, Frédéric and Gal, Yarin},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/kirsch2023tmlr-stochastic/}
}