Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles

Abstract

We present a canonical way to turn any smooth parametric family of probability distributions on an arbitrary search space $X$ into a continuous-time black-box optimization method on $X$, the information-geometric optimization (IGO) method. Invariance as a major design principle keeps the number of arbitrary choices to a minimum. The resulting IGO flow is the flow of an ordinary differential equation conducting the natural gradient ascent of an adaptive, time-dependent transformation of the objective function. It makes no particular assumptions on the objective function to be optimized.
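To make the abstract's description concrete, here is a minimal, illustrative sketch (not the authors' reference implementation) of one sampled IGO-style step for a one-dimensional Gaussian search distribution N(m, v). The function name `igo_step`, the truncation weighting scheme, and the step size `dt` are assumptions chosen for illustration; the mean/variance update form follows the paper's natural-gradient recipe only in outline.

```python
import numpy as np

def igo_step(f, m, v, rng, n_samples=20, dt=0.1):
    """One sampled, natural-gradient (IGO-style) update of a N(m, v) search distribution."""
    # Draw a population from the current Gaussian.
    x = m + np.sqrt(v) * rng.standard_normal(n_samples)

    # Rank-based (quantile) selection weights: the best half of the samples share
    # total weight 1 uniformly, the rest get weight 0 (a simple truncation scheme).
    ranks = np.argsort(np.argsort(-f(x)))            # rank 0 = best sample under f
    w = np.where(ranks < n_samples // 2, 2.0 / n_samples, 0.0)

    # Natural-gradient step for the (mean, variance) parametrization: the Fisher
    # metric reduces the weighted log-likelihood gradients to these expressions.
    m_new = m + dt * np.sum(w * (x - m))
    v_new = v + dt * np.sum(w * ((x - m) ** 2 - v))
    return m_new, max(v_new, 1e-12)

# Toy usage: maximize f(x) = -(x - 3)^2; the mean should drift toward 3.
rng = np.random.default_rng(0)
m, v = 0.0, 1.0
for _ in range(300):
    m, v = igo_step(lambda x: -(x - 3.0) ** 2, m, v, rng)
print(m, v)
```

Because the update is driven only by the ranking of the sampled objective values, the sketch illustrates the invariance under monotone transformations of the objective that the paper emphasizes.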

Cite

Text

Ollivier et al. "Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles." Journal of Machine Learning Research, 18:1-65, 2017.

Markdown

[Ollivier et al. "Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles." Journal of Machine Learning Research, 18:1-65, 2017.](https://mlanthology.org/jmlr/2017/ollivier2017jmlr-informationgeometric/)

BibTeX

@article{ollivier2017jmlr-informationgeometric,
  title     = {{Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles}},
  author    = {Ollivier, Yann and Arnold, Ludovic and Auger, Anne and Hansen, Nikolaus},
  journal   = {Journal of Machine Learning Research},
  year      = {2017},
  pages     = {1--65},
  volume    = {18},
  url       = {https://mlanthology.org/jmlr/2017/ollivier2017jmlr-informationgeometric/}
}