Active Labeling: Streaming Stochastic Gradients
Abstract
The workhorse of machine learning is stochastic gradient descent. To access stochastic gradients, it is common to iterate over input/output pairs of a training dataset. Interestingly, it appears that one does not need full supervision to access stochastic gradients, which is the main motivation of this paper. After formalizing the "active labeling" problem, which focuses on active learning with partial supervision, we provide a streaming technique that provably minimizes the ratio of generalization error over the number of samples. We illustrate our technique in depth for robust regression.
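To make the starting point concrete: the standard way to obtain stochastic gradients, which the abstract contrasts with its partially supervised setting, is to draw one fully labeled pair (x, y) per step. The sketch below shows plain SGD on least squares in this fully supervised regime; it is a generic illustration under that assumption, not the paper's active-labeling method.

```python
import numpy as np

def sgd_least_squares(X, y, lr=0.1, epochs=50, seed=0):
    """Plain SGD on the least-squares loss: each step uses one (x, y) pair,
    i.e. full supervision provides the stochastic gradient."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            # Stochastic gradient of 0.5 * (w . x_i - y_i)^2 with respect to w
            grad = (X[i] @ w - y[i]) * X[i]
            w -= lr * grad
    return w

# Recover a known linear model from noiseless labeled pairs
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = sgd_least_squares(X, y)
```

The paper's point of departure is that each step above consumes one exact label y_i; its streaming scheme instead builds an unbiased stochastic gradient from weaker, partial feedback about the label.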
Cite
Text
Cabannes et al. "Active Labeling: Streaming Stochastic Gradients." Neural Information Processing Systems, 2022.
Markdown
[Cabannes et al. "Active Labeling: Streaming Stochastic Gradients." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/cabannes2022neurips-active/)
BibTeX
@inproceedings{cabannes2022neurips-active,
title = {{Active Labeling: Streaming Stochastic Gradients}},
author = {Cabannes, Vivien and Bach, Francis R. and Perchet, Vianney and Rudi, Alessandro},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/cabannes2022neurips-active/}
}