Follow the Flow: Proximal Flow Inspired Multi-Step Methods

Abstract

We investigate a family of multi-step proximal point methods inspired by the Backward Differentiation Formulas, an implicit linear multi-step discretization of gradient flow. The resulting methods have a per-update computational cost similar to that of the proximal point method. We explore several optimization settings in which applying an approximate multi-step proximal point method yields improved convergence behavior, and we argue that this improvement stems from the reduced truncation error in approximating gradient flow.
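As a rough illustration of the idea (not the authors' implementation), the sketch below compares the standard proximal point method, which is implicit Euler applied to gradient flow, against a BDF2-style two-step proximal update on the 1-D quadratic f(x) = a·x²/2. For this f the proximal operator has a closed form, prox_{tf}(v) = v / (1 + t·a), and gradient flow has the exact solution x(t) = x₀·e^{-at}, so truncation error can be measured directly. The step size h and the BDF2 coefficients (x_{k+1} = prox_{(2h/3)f}((4x_k − x_{k−1})/3)) follow the standard BDF2 formula; all variable names are illustrative.

```python
import math

a = 1.0       # curvature of f(x) = a * x^2 / 2
h = 0.1       # step size of the discretization
n = 10        # number of steps, so final time t = n * h
x0 = 1.0

def prox(t, v):
    """Closed-form prox of f(x) = a*x^2/2: argmin_x f(x) + (x - v)^2 / (2t)."""
    return v / (1.0 + t * a)

# Proximal point method = implicit (backward) Euler on x' = -grad f(x).
x_pp = x0
for _ in range(n):
    x_pp = prox(h, x_pp)

# BDF2-style two-step proximal method:
#   (3/2) x_{k+1} - 2 x_k + (1/2) x_{k-1} = -h * grad f(x_{k+1})
# which rearranges to x_{k+1} = prox_{(2h/3) f}( (4 x_k - x_{k-1}) / 3 ).
x_prev, x_cur = x0, prox(h, x0)   # bootstrap the first step with implicit Euler
for _ in range(n - 1):
    x_prev, x_cur = x_cur, prox(2.0 * h / 3.0, (4.0 * x_cur - x_prev) / 3.0)

exact = x0 * math.exp(-a * n * h)  # exact gradient-flow solution at t = n*h
err_pp = abs(x_pp - exact)
err_bdf2 = abs(x_cur - exact)
print(f"exact {exact:.6f}  prox-point err {err_pp:.6f}  BDF2 err {err_bdf2:.6f}")
```

On this example the two-step BDF2 iterate tracks the gradient-flow trajectory with roughly an order of magnitude less error than plain proximal point at the same step size, consistent with the truncation-error argument in the abstract.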

Cite

Text

Huang and Sun. "Follow the Flow: Proximal Flow Inspired Multi-Step Methods." NeurIPS 2023 Workshops: OPT, 2023.

Markdown

[Huang and Sun. "Follow the Flow: Proximal Flow Inspired Multi-Step Methods." NeurIPS 2023 Workshops: OPT, 2023.](https://mlanthology.org/neuripsw/2023/huang2023neuripsw-follow/)

BibTeX

@inproceedings{huang2023neuripsw-follow,
  title     = {{Follow the Flow: Proximal Flow Inspired Multi-Step Methods}},
  author    = {Huang, Yushen and Sun, Yifan},
  booktitle = {NeurIPS 2023 Workshops: OPT},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/huang2023neuripsw-follow/}
}