The ALCHEmist: Automated Labeling 500x CHEaper than LLM Data Annotators
Abstract
Large pretrained models can be used as annotators, helping replace or augment crowdworkers and enabling the distillation of generalist models into smaller specialist models. Unfortunately, this comes at a cost: employing top-of-the-line models often requires paying thousands of dollars for API calls, while the resulting datasets are static and challenging to audit. To address these challenges, we propose a simple alternative: rather than directly querying labels from pretrained models, we task models to generate programs that can produce labels. These programs can be stored and applied locally, re-used and extended, and cost orders of magnitude less. Our system, $\textbf{Alchemist}$, obtains performance comparable to or better than large language model-based annotation across a range of tasks at a fraction of the cost: on average, performance improves by $\textbf{12.9}$% while the total labeling costs across all datasets are reduced by a factor of approximately $\textbf{500}\times$.
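As a rough illustration of the idea described in the abstract (not the authors' implementation), the Python sketch below replaces per-example LLM annotation with a single API call that synthesizes a labeling program, which is then run locally over the whole dataset. The prompt wording, model name, and helper functions are illustrative assumptions.

```python
# Minimal sketch, assuming the OpenAI chat API and a sentiment-labeling task.
# Direct annotation would call the API once per example; here one call yields
# a reusable program, so the API cost does not grow with dataset size.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = (
    "Write a Python function `label(text: str) -> int` that returns 1 if a "
    "movie review is positive and 0 otherwise, using simple heuristics such "
    "as keyword matching. Return only the code."
)


def synthesize_labeler() -> str:
    """One API call produces a labeling program that can be stored and audited."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": PROMPT}],
    )
    code = response.choices[0].message.content
    # Strip markdown fences the model may wrap around the code.
    return code.replace("```python", "").replace("```", "")


def run_labeler(code: str, texts: list[str]) -> list[int]:
    """Execute the generated program locally on any number of examples."""
    namespace: dict = {}
    exec(code, namespace)  # trusts model output; sandbox this in practice
    return [namespace["label"](t) for t in texts]


if __name__ == "__main__":
    reviews = ["A wonderful, moving film.", "Dull and far too long."]
    program = synthesize_labeler()
    print(run_labeler(program, reviews))
```

Because the synthesis call is made once (or a handful of times) rather than once per example, its cost is amortized across the entire dataset, which is the intuition behind the reported cost reduction.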
Cite
Text
Huang et al. "The ALCHEmist: Automated Labeling 500x CHEaper than LLM Data Annotators." Neural Information Processing Systems, 2024. doi:10.52202/079017-2003
Markdown
[Huang et al. "The ALCHEmist: Automated Labeling 500x CHEaper than LLM Data Annotators." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/huang2024neurips-alchemist/) doi:10.52202/079017-2003
BibTeX
@inproceedings{huang2024neurips-alchemist,
  title = {{The ALCHEmist: Automated Labeling 500x CHEaper than LLM Data Annotators}},
  author = {Huang, Tzu-Heng and Cao, Catherine and Bhargava, Vaishnavi and Sala, Frederic},
  booktitle = {Neural Information Processing Systems},
  year = {2024},
  doi = {10.52202/079017-2003},
  url = {https://mlanthology.org/neurips/2024/huang2024neurips-alchemist/}
}