Mean-Field Langevin Dynamics : Exponential Convergence and Annealing
Abstract
Noisy particle gradient descent (NPGD) is an algorithm to minimize, over the space of probability measures, convex functions that include an entropy term. In the many-particle limit, this algorithm is described by a Mean-Field Langevin dynamics---a generalization of Langevin dynamics with a non-linear drift---which is our main object of study. Previous work has shown its convergence to the unique minimizer via non-quantitative arguments. We prove that this dynamics converges at an exponential rate, under the assumption that a certain family of Log-Sobolev inequalities holds. This assumption holds, for instance, for the minimization of the risk of certain two-layer neural networks, where NPGD is equivalent to standard noisy gradient descent. We also study the annealed dynamics and show that, for a noise level decaying at a logarithmic rate, the dynamics converges in value to the global minimizer of the unregularized objective function.
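For readers skimming the abstract, the standard mean-field Langevin formulation used in this line of work can be written compactly as follows. The notation here is generic and may differ from the paper's:

```latex
% Entropy-regularized objective over probability measures; tau > 0 is the
% noise level and G is a convex functional. Generic notation, not
% necessarily the paper's.
\[
  F(\mu) \;=\; G(\mu) \;+\; \tau \int \mu \log \mu .
\]
% Mean-Field Langevin dynamics: a Langevin diffusion whose drift involves
% the first variation of G at the current law \mu_t, hence depends
% non-linearly on \mu_t.
\[
  \mathrm{d}X_t \;=\; -\,\nabla \frac{\delta G}{\delta \mu}[\mu_t](X_t)\,\mathrm{d}t
  \;+\; \sqrt{2\tau}\,\mathrm{d}B_t ,
  \qquad \mu_t = \operatorname{Law}(X_t).
\]
```

When G is linear in μ, the drift reduces to a fixed gradient field and one recovers classical Langevin dynamics, whose exponential convergence follows from a single log-Sobolev inequality; the non-linear drift is what motivates the family of such inequalities assumed in the paper.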
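Below is a minimal numerical sketch of NPGD itself, under illustrative assumptions: a toy two-layer network with tanh units and squared loss, mean-field scaling 1/m, and arbitrary hyperparameters. None of these choices come from the paper; the sketch only shows the algorithmic pattern the abstract describes (a gradient step plus Gaussian noise for each particle, with an optional logarithmic annealing schedule for the noise level):

```python
import numpy as np

# Toy setup (illustrative, not the paper's experiments):
# two-layer network f(x) = (1/m) * sum_i a_i * tanh(w_i . x),
# particles theta_i = (a_i, w_i), squared loss on a toy regression task.
rng = np.random.default_rng(0)
d, m, n = 2, 200, 100                      # input dim, particles, samples
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0])                        # toy regression target

A = rng.normal(size=m)                     # output weights a_i
W = rng.normal(size=(m, d))                # hidden weights w_i

eta, tau0, steps = 0.1, 1e-3, 2000         # step size, base noise, iterations
anneal = False                             # True: decay tau at a log rate

for k in range(steps):
    H = np.tanh(X @ W.T)                   # activations, shape (n, m)
    r = H @ A / m - y                      # residuals, shape (n,)
    # Per-particle drift = gradient of the first-variation potential
    # V[mu](a, w) = (1/n) sum_j r_j * a * tanh(w . x_j) at each particle.
    drift_A = H.T @ r / n                                         # (m,)
    drift_W = A[:, None] * (((1 - H**2) * r[:, None]).T @ X) / n  # (m, d)
    tau = tau0 / np.log(k + 2) if anneal else tau0   # logarithmic annealing
    # NPGD step: gradient descent plus Gaussian noise of variance 2*eta*tau.
    A += -eta * drift_A + np.sqrt(2 * eta * tau) * rng.normal(size=m)
    W += -eta * drift_W + np.sqrt(2 * eta * tau) * rng.normal(size=(m, d))

print("final loss:", 0.5 * np.mean((np.tanh(X @ W.T) @ A / m - y) ** 2))
```

With `anneal=True`, the noise level follows the logarithmic decay the abstract refers to, under which the dynamics is shown to converge in value to the global minimizer of the unregularized objective.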
Cite
Text
Chizat. "Mean-Field Langevin Dynamics : Exponential Convergence and Annealing." Transactions on Machine Learning Research, 2022.Markdown
[Chizat. "Mean-Field Langevin Dynamics : Exponential Convergence and Annealing." Transactions on Machine Learning Research, 2022.](https://mlanthology.org/tmlr/2022/chizat2022tmlr-meanfield/)BibTeX
@article{chizat2022tmlr-meanfield,
  title   = {{Mean-Field Langevin Dynamics : Exponential Convergence and Annealing}},
  author  = {Chizat, Lénaïc},
  journal = {Transactions on Machine Learning Research},
  year    = {2022},
  url     = {https://mlanthology.org/tmlr/2022/chizat2022tmlr-meanfield/}
}