Efficient Quadratic Corrections for Frank-Wolfe Algorithms

Abstract

We develop a Frank-Wolfe algorithm with corrective steps, generalizing previous algorithms including Blended Conditional Gradients, Blended Pairwise Conditional Gradients, and Fully-Corrective Frank-Wolfe. For this generalized algorithm, we prove tight convergence guarantees together with an optimal face identification property. Furthermore, we propose two highly efficient corrective steps for convex quadratic objectives based on linear optimization or linear system solving, akin to Wolfe's Minimum-Norm Point algorithm, and prove finite-time convergence under suitable conditions. Beyond optimization problems that are directly quadratic, we revisit two algorithms, Split Conditional Gradient and Second-Order Conditional Gradient Sliding, which can leverage quadratic corrections to accelerate the solution of their quadratic subproblems. We show improved convergence rates for the former and prove broader applicability for the latter. Finally, we demonstrate substantial computational speedups for Frank-Wolfe-based algorithms with quadratic corrections across the considered problem classes.
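The paper's corrective steps are not reproduced here. As a rough illustration of the general idea only, the sketch below runs vanilla Frank-Wolfe with exact line search on a convex quadratic over the probability simplex and, after each step, attempts a simplified correction: it re-optimizes the quadratic over the affine hull of the currently active vertices by solving one small KKT linear system, keeping the result only if it stays feasible and improves the objective. The choice of feasible region, the function names, and the skip-on-infeasibility rule are illustrative assumptions, not the algorithm from the paper.

```python
import numpy as np


def quadratic(Q, b):
    """Return f(x) = 0.5 x^T Q x + b^T x and its gradient Q x + b."""
    f = lambda x: 0.5 * x @ Q @ x + b @ x
    grad = lambda x: Q @ x + b
    return f, grad


def fw_with_quadratic_correction(Q, b, n_iters=200, tol=1e-10):
    """Frank-Wolfe over the probability simplex for a convex quadratic,
    with a simplified active-set correction (illustrative sketch only)."""
    n = Q.shape[0]
    f, grad = quadratic(Q, b)
    x = np.zeros(n)
    x[0] = 1.0  # start at a vertex of the simplex
    for _ in range(n_iters):
        g = grad(x)
        i = int(np.argmin(g))  # LMO over the simplex: best vertex e_i
        d = -x.copy()
        d[i] += 1.0            # direction v - x
        gap = -g @ d           # Frank-Wolfe gap <g, x - v>
        if gap <= tol:
            break
        curv = d @ Q @ d
        gamma = 1.0 if curv <= 0 else min(1.0, gap / curv)  # exact line search
        x = x + gamma * d

        # Simplified quadratic correction: re-optimize over the affine hull
        # of the active vertices via one KKT linear system, then accept the
        # point only if it remains in the simplex and does not increase f.
        S = np.flatnonzero(x > 1e-12)
        K = np.zeros((len(S) + 1, len(S) + 1))
        K[:-1, :-1] = Q[np.ix_(S, S)]
        K[:-1, -1] = 1.0
        K[-1, :-1] = 1.0
        rhs = np.concatenate([-b[S], [1.0]])
        try:
            lam = np.linalg.solve(K, rhs)[:-1]
        except np.linalg.LinAlgError:
            continue  # singular KKT system: skip this correction
        if np.all(lam >= 0):
            x_new = np.zeros(n)
            x_new[S] = lam
            if f(x_new) <= f(x):
                x = x_new
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    Q = A.T @ A + np.eye(5)  # strongly convex quadratic (hypothetical data)
    b = rng.standard_normal(5)
    x_star = fw_with_quadratic_correction(Q, b)
    print(x_star, x_star.sum())
```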

Cite

Text

Halbey et al. "Efficient Quadratic Corrections for Frank-Wolfe Algorithms." Advances in Neural Information Processing Systems, 2025.

Markdown

[Halbey et al. "Efficient Quadratic Corrections for Frank-Wolfe Algorithms." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/halbey2025neurips-efficient/)

BibTeX

@inproceedings{halbey2025neurips-efficient,
  title     = {{Efficient Quadratic Corrections for Frank-Wolfe Algorithms}},
  author    = {Halbey, Jannis and Rakotomandimby, Seta and Besançon, Mathieu and Designolle, Sébastien and Pokutta, Sebastian},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/halbey2025neurips-efficient/}
}