Stochastic Search Using the Natural Gradient
Abstract
To optimize unknown `fitness' functions, we introduce Natural Search, a novel stochastic search method that constitutes a principled alternative to standard evolutionary methods. It maintains a multinormal distribution on the set of solution candidates. The natural gradient is used to update the distribution's parameters in the direction of higher expected fitness, by efficiently calculating the inverse of the exact Fisher information matrix, whereas previous methods had to rely on approximations. Other novel aspects of our method include optimal fitness baselines and importance mixing, a procedure that adjusts batches using a minimal number of fitness evaluations. The algorithm yields competitive results on a number of benchmarks.
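The idea sketched in the abstract can be illustrated with a heavily simplified natural-gradient search. The snippet below is a hypothetical separable variant with a diagonal Gaussian, not the authors' exact algorithm: it omits the full Fisher matrix, the optimal baselines, and importance mixing, and substitutes rank-based utilities as a crude stand-in for a fitness baseline. The function name `snes_sketch` and all learning rates are illustrative assumptions.

```python
import numpy as np

def snes_sketch(f, mu0, sigma0, batch=20, eta_mu=1.0, eta_sigma=0.2,
                iters=400, seed=0):
    """Maximize f via natural-gradient search with a diagonal Gaussian.

    Hypothetical simplified sketch, not the paper's exact method.
    """
    rng = np.random.default_rng(seed)
    mu = np.array(mu0, dtype=float)
    sigma = np.full_like(mu, sigma0)
    for _ in range(iters):
        # Sample a batch of candidates z = mu + sigma * s, s ~ N(0, I)
        s = rng.standard_normal((batch, mu.size))
        fitness = np.array([f(mu + sigma * si) for si in s])
        # Rank-based, zero-centered utilities (a robust baseline substitute)
        ranks = fitness.argsort().argsort()
        u = ranks / (batch - 1) - 0.5
        u /= np.abs(u).sum()
        # Natural-gradient updates for mean and per-dimension step sizes
        mu += eta_mu * sigma * (u @ s)
        sigma *= np.exp(eta_sigma * (u @ (s**2 - 1)))
    return mu
```

On a smooth low-dimensional objective such as the negative squared distance to a target point, the distribution's mean converges to the optimum while the step sizes shrink automatically.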
Cite
Text
Sun et al. "Stochastic Search Using the Natural Gradient." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553522
Markdown
[Sun et al. "Stochastic Search Using the Natural Gradient." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/sun2009icml-stochastic/) doi:10.1145/1553374.1553522
BibTeX
@inproceedings{sun2009icml-stochastic,
title = {{Stochastic Search Using the Natural Gradient}},
author = {Sun, Yi and Wierstra, Daan and Schaul, Tom and Schmidhuber, Jürgen},
booktitle = {International Conference on Machine Learning},
year = {2009},
pages = {1161-1168},
doi = {10.1145/1553374.1553522},
url = {https://mlanthology.org/icml/2009/sun2009icml-stochastic/}
}