Optimistic Optimization of a Deterministic Function Without the Knowledge of Its Smoothness
Abstract
We consider a global optimization problem of a deterministic function f in a semi-metric space, given a finite budget of n evaluations. The function f is assumed to be locally smooth (around one of its global maxima) with respect to a semi-metric ℓ. We describe two algorithms based on optimistic exploration that use a hierarchical partitioning of the space at all scales. A first contribution is an algorithm, DOO, that requires the knowledge of ℓ. We report a finite-sample performance bound in terms of a measure of the quantity of near-optimal states. We then define a second algorithm, SOO, which does not require the knowledge of the semi-metric under which f is smooth, and whose performance is almost as good as that of DOO optimally fitted.
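The SOO idea can be illustrated with a short sketch (this is not the authors' code; the function `soo_maximize`, the trisection partitioning, and the 1-D domain [0, 1] are illustrative assumptions): the space is partitioned hierarchically, and in each sweep over depths the algorithm expands, at every depth, the best-valued leaf whose value exceeds that of every leaf already expanded at a shallower depth in the same sweep. No semi-metric enters the selection rule.

```python
# A minimal, illustrative sketch of SOO-style optimization on [0, 1]
# (not the paper's implementation): cells are trisected, and a cell is
# expanded only if its centre value beats every shallower cell expanded
# in the current sweep -- so it could optimistically contain the optimum.

def soo_maximize(f, budget=300, h_max=12):
    """Maximize f on [0, 1] with a budget of evaluations (sketch)."""
    # Each leaf cell is a tuple: (depth, left, right, f(centre)).
    root = (0, 0.0, 1.0, f(0.5))
    leaves = [root]                       # unexpanded cells of the tree
    evals = 1
    best_x, best_v = 0.5, root[3]
    while evals < budget:
        vmax = float("-inf")              # best value expanded this sweep
        for h in range(h_max + 1):
            at_h = [c for c in leaves if c[0] == h]
            if not at_h:
                continue
            cell = max(at_h, key=lambda c: c[3])
            if cell[3] > vmax:            # may still hide the global maximum
                vmax = cell[3]
                leaves.remove(cell)
                _, a, b, _ = cell
                w = (b - a) / 3.0
                for i in range(3):        # trisect and evaluate the children
                    lo, hi = a + i * w, a + (i + 1) * w
                    v = f((lo + hi) / 2.0)
                    evals += 1
                    leaves.append((h + 1, lo, hi, v))
                    if v > best_v:
                        best_x, best_v = (lo + hi) / 2.0, v
                if evals >= budget:
                    break
    return best_x, best_v
```

For instance, `soo_maximize(lambda x: -(x - 0.3) ** 2, budget=400)` returns a point close to the maximizer 0.3. One simplification in the sketch: the middle child shares its centre with the parent, so a careful implementation would reuse that evaluation instead of recomputing it.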
Cite
Text
Munos. "Optimistic Optimization of a Deterministic Function Without the Knowledge of Its Smoothness." Neural Information Processing Systems, 2011.

Markdown
[Munos. "Optimistic Optimization of a Deterministic Function Without the Knowledge of Its Smoothness." Neural Information Processing Systems, 2011.](https://mlanthology.org/neurips/2011/munos2011neurips-optimistic/)

BibTeX
@inproceedings{munos2011neurips-optimistic,
title = {{Optimistic Optimization of a Deterministic Function Without the Knowledge of Its Smoothness}},
author = {Munos, Rémi},
booktitle = {Neural Information Processing Systems},
year = {2011},
pages = {783--791},
url = {https://mlanthology.org/neurips/2011/munos2011neurips-optimistic/}
}