Linearly Convergent Frank-Wolfe with Backtracking Line-Search
Abstract
Structured constraints in Machine Learning have recently brought the Frank-Wolfe (FW) family of algorithms back into the spotlight. While the classical FW algorithm has poor local convergence properties, the Away-steps and Pairwise FW variants have emerged as improved variants with faster convergence. However, these improved variants suffer from two practical limitations: they require solving a 1-dimensional minimization problem at each iteration to set the step-size, and they require the Frank-Wolfe linear subproblems to be solved exactly. In this paper we propose variants of Away-steps and Pairwise FW that lift both restrictions simultaneously. The proposed methods set the step-size based on a sufficient decrease condition, and do not require prior knowledge of the objective. Furthermore, they inherit all the favorable convergence properties of the exact line-search version, including linear convergence for strongly convex functions over polytopes. Benchmarks on different machine learning problems illustrate large performance gains of the proposed variants.
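To make the step-size rule concrete, below is a minimal sketch of the *vanilla* Frank-Wolfe algorithm with a backtracking step-size of the kind the abstract describes: a local Lipschitz-type estimate `L` is adapted so that a quadratic sufficient-decrease condition holds, with no prior knowledge of the objective's smoothness constant. This is an illustrative sketch only, not the paper's Away-steps or Pairwise variants; all names, constants, and the example problem are assumptions.

```python
import numpy as np

def fw_backtracking(grad_f, f, lmo, x0, max_iter=2000,
                    L0=1.0, tau=2.0, eta=0.9, tol=1e-8):
    """Frank-Wolfe with a backtracking (sufficient-decrease) step-size.

    Illustrative sketch: L is an adaptive estimate of a local smoothness
    constant; gamma is accepted only when the quadratic upper bound
    f(x + gamma*d) <= f(x) + gamma*<grad, d> + (L/2) * gamma^2 * ||d||^2
    holds, so no global Lipschitz constant is needed up front.
    """
    x, L = x0.copy(), L0
    for _ in range(max_iter):
        g = grad_f(x)
        s = lmo(g)              # linear minimization oracle over the feasible set
        d = s - x               # Frank-Wolfe direction
        gap = -g @ d            # Frank-Wolfe gap, used as stopping criterion
        if gap <= tol:
            break
        L *= eta                # optimistically shrink the estimate first
        while True:
            gamma = min(gap / (L * (d @ d)), 1.0)   # closed-form trial step
            # sufficient-decrease check against the quadratic upper bound
            if f(x + gamma * d) <= f(x) - gamma * gap + 0.5 * L * gamma**2 * (d @ d):
                break
            L *= tau            # backtrack: the estimate was too small
        x = x + gamma * d
    return x

# Assumed toy problem: minimize 0.5 * ||x - b||^2 over the probability simplex,
# whose LMO returns the vertex minimizing the linear objective.
b = np.array([0.2, 0.3, 0.5])
f = lambda x: 0.5 * np.sum((x - b) ** 2)
grad_f = lambda x: x - b
lmo = lambda g: np.eye(3)[np.argmin(g)]
x_star = fw_backtracking(grad_f, f, lmo, np.eye(3)[0])
```

Since `b` lies in the interior of the simplex, the iterates should approach `b` itself; the backtracking loop terminates as soon as `L` exceeds the true smoothness constant (here 1).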
Cite
Text
Pedregosa et al. "Linearly Convergent Frank-Wolfe with Backtracking Line-Search." Artificial Intelligence and Statistics, 2020.

Markdown
[Pedregosa et al. "Linearly Convergent Frank-Wolfe with Backtracking Line-Search." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/pedregosa2020aistats-linearly/)

BibTeX
@inproceedings{pedregosa2020aistats-linearly,
title = {{Linearly Convergent Frank-Wolfe with Backtracking Line-Search}},
author = {Pedregosa, Fabian and Negiar, Geoffrey and Askari, Armin and Jaggi, Martin},
booktitle = {Artificial Intelligence and Statistics},
year = {2020},
pages = {1--10},
volume = {108},
url = {https://mlanthology.org/aistats/2020/pedregosa2020aistats-linearly/}
}