Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds

Abstract

We design a new algorithm for batch active learning with deep neural network models. Our algorithm, Batch Active learning by Diverse Gradient Embeddings (BADGE), samples groups of points that are disparate and high-magnitude when represented in a hallucinated gradient space, a strategy designed to incorporate both predictive uncertainty and sample diversity into every selected batch. Crucially, BADGE trades off between diversity and uncertainty without requiring any hand-tuned hyperparameters. While other approaches sometimes succeed for particular batch sizes or architectures, BADGE consistently performs as well or better, making it a useful option for real-world active learning problems.
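The mechanism the abstract describes, embedding each unlabeled point as the gradient its predicted (hallucinated) label would induce on the last layer, then choosing a batch that is diverse and high-magnitude in that space, can be sketched as follows. This is a minimal NumPy illustration, not the authors' code; the function names and the uniform k-means++ initialization are assumptions.

```python
import numpy as np

def gradient_embeddings(probs, features):
    """Hallucinated last-layer gradient embeddings.

    probs:    (n, k) softmax outputs of the current model
    features: (n, d) penultimate-layer activations
    Returns an (n, k*d) array. Each row is the cross-entropy gradient
    w.r.t. the final linear layer, computed as if the model's own
    argmax prediction were the true label; its norm grows with
    predictive uncertainty.
    """
    y_hat = probs.argmax(axis=1)                  # hallucinated labels
    scale = probs.copy()
    scale[np.arange(len(probs)), y_hat] -= 1.0    # p - one_hot(y_hat)
    # Per-example outer product of the output error with the features,
    # flattened into a single embedding vector.
    return (scale[:, :, None] * features[:, None, :]).reshape(len(probs), -1)

def kmeanspp_select(emb, batch_size, rng):
    """k-means++ seeding over the embeddings: after a random first pick,
    repeatedly sample a point with probability proportional to its squared
    distance from the already-chosen set. Far-apart, high-magnitude points
    are favored, giving a batch that is both diverse and uncertain."""
    n = emb.shape[0]
    chosen = [int(rng.integers(n))]
    d2 = np.sum((emb - emb[chosen[0]]) ** 2, axis=1)
    while len(chosen) < batch_size:
        idx = int(rng.choice(n, p=d2 / d2.sum()))
        chosen.append(idx)
        # Distance to the chosen set is the min over all chosen points.
        d2 = np.minimum(d2, np.sum((emb - emb[idx]) ** 2, axis=1))
    return chosen
```

Because already-chosen points sit at squared distance zero, they are never re-sampled, so no explicit deduplication or uncertainty/diversity trade-off parameter is needed.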

Cite

Text

Ash et al. "Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds." International Conference on Learning Representations, 2020.

Markdown

[Ash et al. "Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds." International Conference on Learning Representations, 2020.](https://mlanthology.org/iclr/2020/ash2020iclr-deep/)

BibTeX

@inproceedings{ash2020iclr-deep,
  title     = {{Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds}},
  author    = {Ash, Jordan T. and Zhang, Chicheng and Krishnamurthy, Akshay and Langford, John and Agarwal, Alekh},
  booktitle = {International Conference on Learning Representations},
  year      = {2020},
  url       = {https://mlanthology.org/iclr/2020/ash2020iclr-deep/}
}