A Variational Perspective on High-Resolution ODEs
Abstract
We consider unconstrained minimization of smooth convex functions. We propose a novel variational perspective using the forced Euler-Lagrange equation that allows for studying high-resolution ODEs. Through this, we obtain a faster convergence rate for gradient norm minimization using Nesterov's accelerated gradient method. Additionally, we show that Nesterov's method can be interpreted as a rate-matching discretization of an appropriately chosen high-resolution ODE. Finally, using the results from the new variational perspective, we propose a stochastic method for noisy gradients. Several numerical experiments compare our stochastic algorithm with state-of-the-art methods and illustrate its behavior.
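To make the setting concrete, below is a minimal sketch of the standard Nesterov accelerated gradient (NAG) iteration for smooth convex minimization, which the paper analyzes through its high-resolution ODE. This is not the paper's specific algorithm or its stochastic variant; the quadratic test function, step size, and iteration count are illustrative assumptions.

```python
# Sketch: standard Nesterov accelerated gradient for a smooth convex objective.
# Assumed example problem: f(x) = 0.5 * x^T A x - b^T x (convex quadratic).
import numpy as np

def nesterov_agm(grad, x0, step, iters):
    """Run the NAG iteration; `grad` maps x to the gradient of f."""
    x_prev = x0.copy()
    y = x0.copy()
    for k in range(iters):
        x = y - step * grad(y)                 # gradient step at extrapolated point
        y = x + (k / (k + 3)) * (x - x_prev)   # momentum / extrapolation step
        x_prev = x
    return x

A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x_final = nesterov_agm(grad, x0=np.zeros(2), step=0.1, iters=200)  # step = 1/L
print("gradient norm at final iterate:", np.linalg.norm(grad(x_final)))
```

The printed gradient norm is the quantity whose decay rate the gradient-norm-minimization results concern.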
Cite
Text
Maskan et al. "A Variational Perspective on High-Resolution ODEs." Neural Information Processing Systems, 2023.
Markdown
[Maskan et al. "A Variational Perspective on High-Resolution ODEs." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/maskan2023neurips-variational/)
BibTeX
@inproceedings{maskan2023neurips-variational,
title = {{A Variational Perspective on High-Resolution ODEs}},
author = {Maskan, Hoomaan and Zygalakis, Konstantinos and Yurtsever, Alp},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/maskan2023neurips-variational/}
}