Multi-Fidelity Black-Box Optimization with Hierarchical Partitions
Abstract
Motivated by settings such as hyper-parameter tuning and physical simulations, we consider the problem of black-box optimization of a function. Multi-fidelity techniques have become popular for applications where exact function evaluations are expensive, but coarse (biased) approximations are available at much lower cost. A canonical example is that of hyper-parameter selection in a learning algorithm. The learning algorithm can be trained for fewer iterations at lower cost, but its validation error is then only coarsely indicative of the error it would achieve if trained to completion. We incorporate the multi-fidelity setup into the powerful framework of black-box optimization through hierarchical partitioning. We develop tree-search based multi-fidelity algorithms with theoretical guarantees on simple regret. We finally demonstrate the performance gains of our algorithms on both real and synthetic datasets.
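To make the setup concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm: a DOO-style optimistic tree search over a hierarchical binary partition of [0, 1], where large cells are probed with a cheap biased surrogate and small, promising cells are evaluated with the expensive exact oracle. All function names, costs, the bias bound `zeta`, and the smoothness constant `lip` are assumptions made up for this toy example.

```python
import heapq
import math

def f_high(x):
    # hypothetical expensive "exact" objective (unit cost)
    return 1.0 - (x - 0.7) ** 2

def f_low(x):
    # hypothetical cheap surrogate: biased by at most zeta = 0.05, cost 0.1
    return f_high(x) + 0.05 * math.sin(20.0 * x)

def mf_tree_search(budget=20.0, lip=2.0, zeta=0.05):
    """Toy multi-fidelity optimistic tree search on [0, 1].

    Each cell's optimistic upper bound is value + lip * width, plus the
    bias bound zeta when the cheap fidelity was used.  Cells wider than
    0.1 are probed at low fidelity; smaller cells get the exact oracle.
    """
    heap = []                       # max-heap via negated upper bounds
    best_val, best_x = -math.inf, None

    def probe(lo, hi):
        nonlocal budget, best_val, best_x
        x = 0.5 * (lo + hi)
        high_fid = (hi - lo) < 0.1
        cost = 1.0 if high_fid else 0.1
        if budget < cost:
            return False            # evaluation budget exhausted
        budget -= cost
        val = f_high(x) if high_fid else f_low(x)
        if high_fid and val > best_val:
            best_val, best_x = val, x
        ucb = val + lip * (hi - lo) + (0.0 if high_fid else zeta)
        heapq.heappush(heap, (-ucb, lo, hi))
        return True

    probe(0.0, 1.0)
    while heap:
        # expand the cell with the largest optimistic bound
        _, lo, hi = heapq.heappop(heap)
        mid = 0.5 * (lo + hi)
        if not (probe(lo, mid) and probe(mid, hi)):
            break
    return best_x, best_val
```

Under this sketch, the cheap fidelity steers the partition toward the neighborhood of the optimum before any expensive evaluation is spent, which is the intuition behind the cost savings the abstract describes.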
Cite
Text
Sen et al. "Multi-Fidelity Black-Box Optimization with Hierarchical Partitions." International Conference on Machine Learning, 2018.
Markdown
[Sen et al. "Multi-Fidelity Black-Box Optimization with Hierarchical Partitions." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/sen2018icml-multifidelity/)
BibTeX
@inproceedings{sen2018icml-multifidelity,
title = {{Multi-Fidelity Black-Box Optimization with Hierarchical Partitions}},
author = {Sen, Rajat and Kandasamy, Kirthevasan and Shakkottai, Sanjay},
booktitle = {International Conference on Machine Learning},
year = {2018},
pages = {4538--4547},
volume = {80},
url = {https://mlanthology.org/icml/2018/sen2018icml-multifidelity/}
}