Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization

Abstract

Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While recent approaches use Bayesian optimization to adaptively select configurations, we focus on speeding up random search through adaptive resource allocation. We present Hyperband, a novel algorithm for hyperparameter optimization that is simple, flexible, and theoretically sound. Hyperband is a principled early-stopping method that adaptively allocates a predefined resource, e.g., iterations, data samples, or number of features, to randomly sampled configurations. We compare Hyperband with state-of-the-art Bayesian optimization methods on several hyperparameter optimization problems. We observe that Hyperband can provide more than an order-of-magnitude speedup over competitors on a variety of neural network and kernel-based learning problems.
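The abstract's core idea, repeatedly running successive halving with different trade-offs between the number of sampled configurations and the resource allocated to each, can be sketched as follows. This is a minimal illustration, not the authors' reference implementation; the callbacks `get_config` and `run_then_return_loss`, and the toy quadratic objective in the usage note, are hypothetical stand-ins for a real sampler and training routine.

```python
import math
import random

def hyperband(get_config, run_then_return_loss, max_resource=81, eta=3):
    """Sketch of Hyperband: loop over successive-halving brackets that
    trade off the number of configurations against resource per config.

    get_config: () -> configuration (randomly sampled)
    run_then_return_loss: (config, resource) -> validation loss
    """
    s_max = int(math.log(max_resource) / math.log(eta))
    budget = (s_max + 1) * max_resource  # total resource per bracket
    best_config, best_loss = None, float("inf")

    for s in reversed(range(s_max + 1)):
        # Initial number of configurations and resource per configuration
        # for this bracket (aggressive early stopping when s is large).
        n = int(math.ceil(budget / max_resource * eta**s / (s + 1)))
        r = max_resource * eta ** (-s)
        configs = [get_config() for _ in range(n)]

        # Successive halving: evaluate, keep the top 1/eta fraction,
        # and give the survivors eta times more resource.
        for i in range(s + 1):
            n_i = int(n * eta ** (-i))
            r_i = r * eta**i
            losses = [run_then_return_loss(c, r_i) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            if ranked[0][0] < best_loss:
                best_loss, best_config = ranked[0]
            configs = [c for _, c in ranked[: max(1, int(n_i / eta))]]

    return best_config, best_loss
```

As a toy usage example, minimizing a one-dimensional quadratic whose loss ignores the resource entirely (so every bracket simply rewards good random draws):

```python
random.seed(0)
cfg, loss = hyperband(lambda: random.random(),
                      lambda c, r: (c - 0.5) ** 2)
# cfg is close to 0.5, the minimizer of the toy objective
```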

Cite

Text

Li et al. "Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization." International Conference on Learning Representations, 2017.

Markdown

[Li et al. "Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization." International Conference on Learning Representations, 2017.](https://mlanthology.org/iclr/2017/li2017iclr-hyperband/)

BibTeX

@inproceedings{li2017iclr-hyperband,
  title     = {{Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization}},
  author    = {Li, Lisha and Jamieson, Kevin G. and DeSalvo, Giulia and Rostamizadeh, Afshin and Talwalkar, Ameet},
  booktitle = {International Conference on Learning Representations},
  year      = {2017},
  url       = {https://mlanthology.org/iclr/2017/li2017iclr-hyperband/}
}