Boosting Frank-Wolfe by Chasing Gradients
Abstract
The Frank-Wolfe algorithm has become a popular first-order optimization algorithm because it is simple and projection-free, and it has been successfully applied to a variety of real-world problems. Its main drawback, however, lies in its convergence rate, which can be excessively slow due to naive descent directions. We propose to speed up the Frank-Wolfe algorithm by better aligning the descent direction with the negative gradient via a subroutine. This subroutine chases the negative gradient direction in a matching-pursuit style while still preserving the projection-free property. Although the approach is quite natural, it yields significant improvements. We derive convergence rates of our method ranging from $\mathcal{O}(1/t)$ to $\mathcal{O}(e^{-\omega t})$, and we demonstrate its competitive advantage, both per iteration and in CPU time, over the state of the art in a series of computational experiments.
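To make the mechanism concrete, below is a minimal Python sketch of the idea described in the abstract: a matching-pursuit-style subroutine that combines several linear-minimization-oracle (LMO) calls to build a descent direction better aligned with the negative gradient, instantiated here on the probability simplex. The names and parameters (chase_gradient, lmo_simplex, max_rounds, delta) and the exact stopping rule are illustrative assumptions, not the algorithm as published in the paper; the sketch is only meant to convey the structure of the boosting subroutine.

import numpy as np

def lmo_simplex(c):
    # Linear minimization oracle over the probability simplex:
    # argmin_{v in simplex} <c, v> is the vertex e_i with i = argmin_i c_i.
    v = np.zeros_like(c)
    v[np.argmin(c)] = 1.0
    return v

def chase_gradient(x, grad, lmo, max_rounds=10, delta=1e-3):
    # Sketch of a matching-pursuit-style subroutine: build a direction d that is
    # better aligned with -grad than the plain Frank-Wolfe direction, using only
    # LMO calls so the method stays projection-free. The returned d is rescaled
    # so that x + gamma * d remains feasible for gamma in [0, 1].
    d = np.zeros_like(x)
    total_weight = 0.0           # sum of pursuit coefficients, used to rescale d
    align_prev = -np.inf
    norm_g = np.linalg.norm(grad)
    if norm_g < 1e-12:
        return d
    for _ in range(max_rounds):
        r = -grad - d                          # residual still to be matched
        v = lmo(-r)                            # vertex maximizing <r, v>
        u = v - x                              # feasible vertex-minus-iterate direction
        norm_u = np.linalg.norm(u)
        if norm_u < 1e-12:
            break
        lam = np.dot(r, u) / norm_u**2         # least-squares coefficient on u
        if lam <= 0:
            break
        d_new = d + lam * u
        align_new = np.dot(d_new, -grad) / (np.linalg.norm(d_new) * norm_g)
        if align_new - align_prev < delta:     # stop once alignment stops improving enough
            break
        d, align_prev = d_new, align_new
        total_weight += lam
    if total_weight > 0:
        d = d / total_weight                   # rescale: update stays a convex combination
    return d

def boosted_frank_wolfe(f_grad, x0, lmo, n_iters=100):
    # Sketch of the boosted Frank-Wolfe loop: chase the negative gradient with the
    # subroutine above, then take a short step along the resulting direction.
    x = np.array(x0, dtype=float)
    for t in range(n_iters):
        g = f_grad(x)
        d = chase_gradient(x, g, lmo)
        if not np.any(d):                      # fall back to the vanilla FW direction
            d = lmo(g) - x
        gamma = 2.0 / (t + 2)                  # standard open-loop step size
        x = x + gamma * d
    return x

# Hypothetical usage: minimize 0.5 * ||x - y||^2 over the probability simplex.
y = np.array([0.1, 0.5, 0.4, 0.0])
x_star = boosted_frank_wolfe(lambda x: x - y, np.ones(4) / 4, lmo_simplex)

Rescaling d by the sum of the pursuit coefficients keeps the update a convex combination of vertex directions, which is what preserves feasibility without any projection step in this sketch.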
Cite

Text
Combettes and Pokutta. "Boosting Frank-Wolfe by Chasing Gradients." International Conference on Machine Learning, 2020.

Markdown
[Combettes and Pokutta. "Boosting Frank-Wolfe by Chasing Gradients." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/combettes2020icml-boosting/)

BibTeX
@inproceedings{combettes2020icml-boosting,
  title     = {{Boosting Frank-Wolfe by Chasing Gradients}},
  author    = {Combettes, Cyrille and Pokutta, Sebastian},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {2111--2121},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/combettes2020icml-boosting/}
}