Hyperparameter Transfer Learning with Adaptive Complexity
Abstract
Bayesian optimization (BO) is a data-efficient approach to automatically tune the hyperparameters of machine learning models. In practice, one frequently has to solve similar hyperparameter tuning problems sequentially. For example, one might have to tune the same type of neural network on a series of different classification problems. Recent work on multi-task BO exploits knowledge gained from previous hyperparameter tuning tasks to speed up a new tuning task. However, previous approaches do not account for the fact that BO is a sequential decision-making procedure. Hence, there is in general a mismatch between the number of evaluations collected in the current tuning task and the number of evaluations accumulated in all previously completed tasks. In this work, we enable multi-task BO to compensate for this mismatch, such that the transfer learning procedure is able to handle different data regimes in a principled way. We propose a new multi-task BO method that learns a set of ordered, non-linear basis functions of increasing complexity via nested drop-out and automatic relevance determination. Experiments on a variety of hyperparameter tuning problems show that our method improves the sample efficiency of recently published multi-task BO methods.
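The nested drop-out mentioned in the abstract zeroes out a contiguous *suffix* of units rather than independent units, which is what induces an ordering on the learned basis functions: earlier units survive more often and are pressured to capture the most important structure. A minimal sketch of this masking step is below; the function name, the geometric truncation distribution, and all parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def nested_dropout_mask(num_units, p=0.3, rng=None):
    """Sample a nested drop-out mask over `num_units` ordered units.

    A truncation index b is drawn from a geometric distribution
    truncated to {1, ..., num_units}; units 1..b are kept and the
    remaining suffix is zeroed. (Illustrative sketch, not the
    authors' code.)
    """
    rng = np.random.default_rng() if rng is None else rng
    # Truncated geometric distribution over the truncation index.
    probs = p * (1.0 - p) ** np.arange(num_units)
    probs /= probs.sum()
    b = rng.choice(num_units, p=probs) + 1  # keep the first b units
    mask = np.zeros(num_units)
    mask[:b] = 1.0
    return mask

# Apply the mask to one layer of basis-function activations.
rng = np.random.default_rng(0)
activations = rng.standard_normal(8)
masked = activations * nested_dropout_mask(8, p=0.3, rng=rng)
```

Because the mask is always a prefix of ones followed by zeros (never a scattered pattern, as in standard dropout), the surviving units form a nested family of models of increasing complexity.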
Cite

Text
Horváth et al. "Hyperparameter Transfer Learning with Adaptive Complexity." Artificial Intelligence and Statistics, 2021.

Markdown
[Horváth et al. "Hyperparameter Transfer Learning with Adaptive Complexity." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/horvath2021aistats-hyperparameter/)

BibTeX
@inproceedings{horvath2021aistats-hyperparameter,
title = {{Hyperparameter Transfer Learning with Adaptive Complexity}},
author = {Horváth, Samuel and Klein, Aaron and Richtarik, Peter and Archambeau, Cedric},
booktitle = {Artificial Intelligence and Statistics},
year = {2021},
pages = {1378-1386},
volume = {130},
url = {https://mlanthology.org/aistats/2021/horvath2021aistats-hyperparameter/}
}