PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning
Abstract
Hyperparameters of Deep Learning (DL) pipelines are crucial for their downstream performance. While a large number of methods for Hyperparameter Optimization (HPO) have been developed, the costs they incur are often untenable for modern DL. Consequently, manual experimentation is still the most prevalent approach to optimizing hyperparameters, relying on the researcher's intuition, domain knowledge, and cheap preliminary explorations. To resolve this misalignment between HPO algorithms and DL researchers, we propose PriorBand, an HPO algorithm tailored to DL that is able to utilize both expert beliefs and cheap proxy tasks. Empirically, we demonstrate PriorBand's efficiency across a range of DL benchmarks and show its gains under informative expert input as well as its robustness against poor expert beliefs.
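The abstract does not spell out the mechanism, but the two ingredients it names, expert beliefs and cheap proxy tasks, can be illustrated conceptually. The following is a minimal, hypothetical Python sketch, not the authors' implementation: candidate configurations are drawn from a mixture of an expert-specified prior and a uniform distribution, then filtered with cheap low-fidelity evaluations before any full-budget training. All function names, the toy objective, and the specific budgets are illustrative assumptions.

import math
import random

# Hypothetical sketch (not the authors' implementation): candidates are
# drawn from a mixture of an expert prior and a uniform distribution,
# then filtered with cheap low-fidelity runs before full training.

def sample_learning_rate(prior_weight: float = 0.5) -> float:
    """Sample a learning rate in [1e-5, 1e-1] (log scale); with
    probability `prior_weight`, sample near the expert's belief."""
    if random.random() < prior_weight:
        # Expert believes lr is around 1e-3: sample tightly around it,
        # clamped to the search range.
        return 10 ** min(-1.0, max(-5.0, random.gauss(-3.0, 0.5)))
    # Otherwise explore the full range uniformly in log space.
    return 10 ** random.uniform(-5.0, -1.0)

def proxy_loss(lr: float, epochs: int) -> float:
    """Toy stand-in for a low-fidelity training run: noisier when fewer
    epochs are used, minimized at lr = 1e-3."""
    return abs(math.log10(lr) + 3.0) + random.gauss(0.0, 1.0 / epochs)

# Successive-halving-style filtering: many cheap runs, few expensive ones.
candidates = [sample_learning_rate() for _ in range(27)]
for epochs in (1, 3, 9):  # increasing fidelity
    candidates.sort(key=lambda lr: proxy_loss(lr, epochs))
    candidates = candidates[: max(1, len(candidates) // 3)]

print(f"best learning rate found: {candidates[0]:.2e}")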
Cite
Text
Mallik et al. "PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning." Neural Information Processing Systems, 2023.

Markdown
[Mallik et al. "PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/mallik2023neurips-priorband/)

BibTeX
@inproceedings{mallik2023neurips-priorband,
  title     = {{PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning}},
  author    = {Mallik, Neeratyoy and Bergman, Edward and Hvarfner, Carl and Stoll, Danny and Janowski, Maciej and Lindauer, Marius and Nardi, Luigi and Hutter, Frank},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/mallik2023neurips-priorband/}
}