DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization

Abstract

Modern machine learning algorithms crucially rely on several design decisions to achieve strong performance, making the problem of Hyperparameter Optimization (HPO) more important than ever. Here, we combine the advantages of the popular bandit-based HPO method Hyperband (HB) and the evolutionary search approach of Differential Evolution (DE) to yield a new HPO method which we call DEHB. Comprehensive results on a very broad range of HPO problems, as well as a wide range of tabular benchmarks from neural architecture search, demonstrate that DEHB achieves strong performance far more robustly than all previous HPO methods we are aware of, especially for high-dimensional problems with discrete input dimensions. For example, DEHB is up to 1000x faster than random search. It is also efficient in computational time, conceptually simple and easy to implement, positioning it well to become a new default HPO method.
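To make the evolutionary half of the method concrete, below is a toy sketch of one generation of DE/rand/1/bin, the classic Differential Evolution scheme DEHB builds on. This is standard textbook DE, not the paper's implementation; the quadratic `objective`, the hyperparameters `f`/`cr`, and all function names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Toy stand-in for a validation loss over a normalized config in [0, 1]^d.
    return float(np.sum((x - 0.3) ** 2))

def de_step(pop, fitness, f=0.5, cr=0.9):
    """One DE/rand/1/bin generation: mutation, binomial crossover, greedy selection."""
    n, d = pop.shape
    new_pop, new_fit = pop.copy(), fitness.copy()
    for i in range(n):
        # Pick three distinct parents, none equal to the target index i.
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        # Mutation: perturb one parent by a scaled difference of two others.
        mutant = np.clip(pop[r1] + f * (pop[r2] - pop[r3]), 0.0, 1.0)
        # Binomial crossover: mix mutant and target, forcing at least one dimension.
        cross = rng.random(d) < cr
        cross[rng.integers(d)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection: keep the trial only if it is no worse.
        ft = objective(trial)
        if ft <= fitness[i]:
            new_pop[i], new_fit[i] = trial, ft
    return new_pop, new_fit

# Evolve a small population for a few generations.
pop = rng.random((10, 4))
fit = np.array([objective(x) for x in pop])
for _ in range(30):
    pop, fit = de_step(pop, fit)
```

In DEHB, generations like this run inside Hyperband's successive-halving brackets, so the evolutionary search is applied at increasing fidelity budgets rather than only at full budget.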

Cite

Text

Awad et al. "DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/296

Markdown

[Awad et al. "DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/awad2021ijcai-dehb/) doi:10.24963/IJCAI.2021/296

BibTeX

@inproceedings{awad2021ijcai-dehb,
  title     = {{DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization}},
  author    = {Awad, Noor H. and Mallik, Neeratyoy and Hutter, Frank},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {2147--2153},
  doi       = {10.24963/IJCAI.2021/296},
  url       = {https://mlanthology.org/ijcai/2021/awad2021ijcai-dehb/}
}