Convergence Rate of a Simulated Annealing Algorithm with Noisy Observations
Abstract
In this paper we propose a modified version of the simulated annealing algorithm for solving a stochastic global optimization problem. More precisely, we address the problem of finding a global minimizer of a function with noisy evaluations. We provide a rate of convergence and its optimized parametrization to ensure a minimal number of evaluations for a given accuracy and a confidence level close to 1. This work is completed with a set of numerical experiments that assess the practical performance both on benchmark test cases and on real-world examples.
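The setting of the abstract, simulated annealing where each function evaluation is corrupted by noise, can be sketched as follows. This is an illustrative toy implementation, not the paper's exact algorithm: the logarithmic cooling schedule, the Gaussian proposal, and the rule that averages more observations as the temperature drops are all assumed choices for the sketch.

```python
import math
import random

def noisy_eval(f, x, sigma, n_samples):
    """Estimate f(x) by averaging n_samples observations corrupted by Gaussian noise."""
    return sum(f(x) + random.gauss(0.0, sigma) for _ in range(n_samples)) / n_samples

def noisy_simulated_annealing(f, x0, n_iter=5000, sigma=0.1):
    """Toy simulated annealing when only noisy evaluations of f are available."""
    x = x0
    fx = noisy_eval(f, x, sigma, 10)
    best_x, best_f = x, fx
    for k in range(1, n_iter + 1):
        temp = 1.0 / math.log(k + 1)      # slow (logarithmic) cooling schedule
        n_samples = 10 + k // 100         # average more observations as temperature drops
        y = x + random.gauss(0.0, 1.0)    # Gaussian proposal around the current point
        fy = noisy_eval(f, y, sigma, n_samples)
        # Metropolis acceptance rule, applied to the noisy estimates
        if fy < fx or random.random() < math.exp(-(fy - fx) / temp):
            x, fx = y, fy
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

# Example: recover the minimizer of x^2 (at 0) from noisy observations.
random.seed(0)
x_star = noisy_simulated_annealing(lambda x: x * x, x0=3.0)
```

Averaging a growing number of observations per iteration is one natural way to make the noisy acceptance decision increasingly reliable as the temperature decreases; the paper's contribution is precisely to quantify how such parameters must be tuned to guarantee convergence at a given accuracy and confidence level.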
Cite
Text
Bouttier and Gavra. "Convergence Rate of a Simulated Annealing Algorithm with Noisy Observations." Journal of Machine Learning Research, 2019. https://mlanthology.org/jmlr/2019/bouttier2019jmlr-convergence/
BibTeX
@article{bouttier2019jmlr-convergence,
title = {{Convergence Rate of a Simulated Annealing Algorithm with Noisy Observations}},
author = {Bouttier, Clément and Gavra, Ioana},
journal = {Journal of Machine Learning Research},
year = {2019},
pages = {1--45},
volume = {20},
url = {https://mlanthology.org/jmlr/2019/bouttier2019jmlr-convergence/}
}