Two Timescale Analysis of the Alopex Algorithm for Optimization
Abstract
Alopex is a correlation-based, gradient-free optimization technique useful in many learning problems. However, no analytical results on the asymptotic behavior of this algorithm are available. This article presents a new version of Alopex that can be analyzed using two-timescale stochastic approximation techniques. It is shown that the algorithm asymptotically behaves like a gradient-descent method, even though it neither needs nor estimates any gradient information. Simulations further show that the algorithm is quite effective.
Cite
Text
Sastry et al. "Two Timescale Analysis of the Alopex Algorithm for Optimization." Neural Computation, 2002. doi:10.1162/089976602760408044
Markdown
[Sastry et al. "Two Timescale Analysis of the Alopex Algorithm for Optimization." Neural Computation, 2002.](https://mlanthology.org/neco/2002/sastry2002neco-two/) doi:10.1162/089976602760408044
BibTeX
@article{sastry2002neco-two,
title = {{Two Timescale Analysis of the Alopex Algorithm for Optimization}},
author = {Sastry, P. S. and Magesh, M. and Unnikrishnan, K. P.},
journal = {Neural Computation},
year = {2002},
pages = {2729--2750},
doi = {10.1162/089976602760408044},
volume = {14},
url = {https://mlanthology.org/neco/2002/sastry2002neco-two/}
}