An Information-Theoretic Framework for Unifying Active Learning Problems
Abstract
This paper presents an information-theoretic framework for unifying active learning problems: level set estimation (LSE), Bayesian optimization (BO), and their generalized variant. We first introduce a novel active learning criterion that subsumes an existing LSE algorithm and achieves state-of-the-art performance in LSE problems with a continuous input domain. Then, by exploiting the relationship between LSE and BO, we design a competitive information-theoretic acquisition function for BO that has interesting connections to upper confidence bound and max-value entropy search (MES). The latter connection reveals a drawback of MES that has important implications not only for MES but also for other MES-based acquisition functions. Finally, our unifying information-theoretic framework can be applied to solve a generalized problem of LSE and BO involving multiple level sets in a data-efficient manner. We empirically evaluate the performance of our proposed algorithms on synthetic benchmark functions, a real-world dataset, and hyperparameter tuning of machine learning models.
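To make the BO setting referenced in the abstract concrete, below is a minimal sketch of a generic acquisition-based BO loop with a Gaussian process surrogate and an upper confidence bound (UCB) acquisition, one of the baselines the paper's acquisition function is connected to. This is illustrative context only, not the paper's proposed information-theoretic criterion; the toy objective, the discretized candidate grid, and the trade-off parameter `beta` are assumptions made for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy 1-D objective to be maximized (an assumption for this sketch).
def objective(x):
    return np.sin(3.0 * x) + 0.5 * np.cos(5.0 * x)

rng = np.random.default_rng(0)
candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)  # discretized input domain

# Start from a few random observations.
X = rng.uniform(0.0, 2.0, size=(3, 1))
y = objective(X).ravel()

beta = 2.0  # UCB exploration-exploitation trade-off (assumed value)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-6)

for _ in range(15):
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    ucb = mu + np.sqrt(beta) * sigma        # upper confidence bound acquisition
    x_next = candidates[np.argmax(ucb)]     # query the acquisition maximizer
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best observed value:", y.max())
```

In an LSE variant of the same loop, the acquisition would instead prioritize inputs whose function value is uncertain relative to a target threshold (or multiple thresholds, in the generalized problem the paper addresses), rather than inputs likely to maximize the objective.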
Cite
Text
Nguyen et al. "An Information-Theoretic Framework for Unifying Active Learning Problems." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I10.17102
Markdown
[Nguyen et al. "An Information-Theoretic Framework for Unifying Active Learning Problems." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/nguyen2021aaai-information/) doi:10.1609/AAAI.V35I10.17102
BibTeX
@inproceedings{nguyen2021aaai-information,
title = {{An Information-Theoretic Framework for Unifying Active Learning Problems}},
author = {Nguyen, Quoc Phong and Low, Bryan Kian Hsiang and Jaillet, Patrick},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
  pages     = {9126--9134},
doi = {10.1609/AAAI.V35I10.17102},
url = {https://mlanthology.org/aaai/2021/nguyen2021aaai-information/}
}