A Latent Variational Framework for Stochastic Optimization

Abstract

This paper provides a unifying theoretical framework for stochastic optimization algorithms by means of a latent stochastic variational problem. Using techniques from stochastic control, the solution to the variational problem is shown to be equivalent to that of a Forward-Backward Stochastic Differential Equation (FBSDE). By solving this FBSDE, we recover a variety of existing adaptive stochastic gradient descent methods. This framework establishes a direct connection between stochastic optimization algorithms and a secondary latent inference problem on gradients, where a prior measure on gradient observations determines the resulting algorithm.
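For readers less familiar with FBSDEs, the generic form referenced in the abstract is sketched below in LaTeX. This is the standard textbook formulation only; the coefficients b, \sigma, f and the terminal condition g are generic placeholders, not the specific drivers derived in the paper.

\begin{aligned}
X_t &= x_0 + \int_0^t b(s, X_s, Y_s)\,\mathrm{d}s + \int_0^t \sigma(s, X_s, Y_s)\,\mathrm{d}W_s && \text{(forward)} \\
Y_t &= g(X_T) + \int_t^T f(s, X_s, Y_s, Z_s)\,\mathrm{d}s - \int_t^T Z_s\,\mathrm{d}W_s && \text{(backward)}
\end{aligned}

The abstract's closing claim, that the prior measure on gradient observations determines the resulting algorithm, can be illustrated with a toy example: filtering noisy gradient observations under a simple Gaussian random-walk prior on the latent "true" gradient gives an exponential moving average as the (steady-state) posterior-mean estimate, and descending along that estimate yields a momentum-style update. The sketch below is a minimal illustration of this flavor of result under those toy assumptions, not the paper's derivation; the function name latent_gradient_sgd and all parameter values are invented for the example.

import numpy as np

def latent_gradient_sgd(grad_fn, x0, steps=100, lr=0.1, beta=0.9):
    """Toy sketch: SGD along a filtered estimate of a latent gradient.

    Under a simple Gaussian random-walk prior on the latent gradient,
    the steady-state posterior mean reduces to an exponential moving
    average of the noisy observations, recovering a momentum-style
    method. Illustrative only; not the paper's construction.
    """
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # filtered estimate of the latent gradient
    for _ in range(steps):
        g = grad_fn(x)                     # noisy gradient observation
        m = beta * m + (1.0 - beta) * g    # EMA = posterior mean under the toy prior
        x = x - lr * m                     # descend along the filtered gradient
    return x

# Usage: minimize a noisy quadratic, f(x) = 0.5 * ||x||^2.
rng = np.random.default_rng(0)
noisy_grad = lambda x: x + 0.1 * rng.standard_normal(x.shape)
print(latent_gradient_sgd(noisy_grad, np.ones(3)))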

Cite

Text

Casgrain. "A Latent Variational Framework for Stochastic Optimization." Neural Information Processing Systems, 2019.

Markdown

[Philippe Casgrain. "A Latent Variational Framework for Stochastic Optimization." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/casgrain2019neurips-latent/)

BibTeX

@inproceedings{casgrain2019neurips-latent,
  title     = {{A Latent Variational Framework for Stochastic Optimization}},
  author    = {Casgrain, Philippe},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {5646--5656},
  url       = {https://mlanthology.org/neurips/2019/casgrain2019neurips-latent/}
}