High Dimensional Bayesian Optimization Using Dropout
Abstract
Scaling Bayesian optimization to high dimensions is a challenging task, as the global optimization of a high-dimensional acquisition function can be expensive and often infeasible. Existing methods depend either on a limited set of active variables or on an additive form of the objective function. We propose a new method for high-dimensional Bayesian optimization that uses a dropout strategy to optimize only a subset of variables at each iteration. We derive theoretical bounds for the regret and show how they can inform the design of our algorithm. We demonstrate the efficacy of our algorithm on two benchmark functions and two real-world applications: training cascade classifiers and optimizing alloy composition.
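Below is a minimal sketch of the dropout idea described in the abstract, not the authors' implementation: at each iteration a random subset of coordinates is treated as active, a GP surrogate and acquisition function are fit and optimized only over that subspace, and the dropped-out coordinates are filled in from the best point observed so far. The scikit-learn GP surrogate, the lower-confidence-bound acquisition optimized by random candidate search, and the "copy from the incumbent" fill-in rule are illustrative assumptions, as are all function names and parameters.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def dropout_bo(objective, dim, n_iter=50, d_sub=5, n_candidates=1000, seed=0):
    """Dropout-style Bayesian optimization sketch (minimization in [0, 1]^dim).

    Each iteration optimizes only `d_sub` randomly chosen coordinates;
    the remaining coordinates are copied from the incumbent best point
    (one plausible fill-in strategy; assumed here for illustration).
    """
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, size=(5, dim))   # small initial design
    y = np.array([objective(x) for x in X])

    for _ in range(n_iter):
        best = X[np.argmin(y)]                  # incumbent best point
        active = rng.choice(dim, size=d_sub, replace=False)

        # Fit a GP surrogate on the active coordinates only.
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X[:, active], y)

        # Optimize a lower-confidence-bound acquisition over random
        # candidates in the low-dimensional active subspace.
        cand = rng.uniform(0.0, 1.0, size=(n_candidates, d_sub))
        mu, sigma = gp.predict(cand, return_std=True)
        x_active = cand[np.argmin(mu - 2.0 * sigma)]

        # Fill in the dropped-out coordinates from the incumbent.
        x_next = best.copy()
        x_next[active] = x_active
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))

    return X[np.argmin(y)], y.min()

# Usage: a 20-dimensional quadratic test function.
x_best, f_best = dropout_bo(lambda x: float(np.sum((x - 0.3) ** 2)), dim=20)
print(x_best, f_best)
```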
Cite
Text
Li et al. "High Dimensional Bayesian Optimization Using Dropout." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/291
Markdown
[Li et al. "High Dimensional Bayesian Optimization Using Dropout." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/li2017ijcai-high/) doi:10.24963/IJCAI.2017/291
BibTeX
@inproceedings{li2017ijcai-high,
title = {{High Dimensional Bayesian Optimization Using Dropout}},
author = {Li, Cheng and Gupta, Sunil and Rana, Santu and Nguyen, Vu and Venkatesh, Svetha and Shilton, Alistair},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2017},
pages = {2096--2102},
doi = {10.24963/IJCAI.2017/291},
url = {https://mlanthology.org/ijcai/2017/li2017ijcai-high/}
}