Decomposition-Invariant Conditional Gradient for General Polytopes with Line Search
Abstract
Frank-Wolfe (FW) algorithms with linear convergence rates have recently achieved great efficiency in many applications. Garber and Meshi (2016) designed a new decomposition-invariant pairwise FW variant with favorable dependency on the domain geometry. Unfortunately, it applies only to a restricted class of polytopes and cannot achieve theoretical and practical efficiency at the same time. In this paper, we show that by employing an away-step update, similar rates can be generalized to arbitrary polytopes with strong empirical performance. A new "condition number" of the domain is introduced, which allows the sparsity of the solution to be leveraged. We apply the method to a reformulation of SVM, and for the first time the linear convergence rate depends on the number of support vectors.
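As a rough illustration of the away-step mechanism the abstract refers to (not the paper's decomposition-invariant variant), here is a minimal away-step Frank-Wolfe sketch with exact line search, minimizing a quadratic over the probability simplex. All names and the toy objective are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def away_step_fw(b, n_iter=200):
    """Away-step Frank-Wolfe for min_x ||x - b||^2 over the probability
    simplex. Illustrative sketch only -- not the paper's algorithm."""
    d = len(b)
    x = np.zeros(d)
    x[0] = 1.0                              # start at a vertex of the simplex
    alpha = {0: 1.0}                        # active set: vertex -> weight
    for _ in range(n_iter):
        g = 2.0 * (x - b)                   # gradient of the quadratic
        s = int(np.argmin(g))               # FW vertex: best simplex corner
        v = max(alpha, key=lambda j: g[j])  # away vertex: worst active corner
        d_fw = -x.copy(); d_fw[s] += 1.0    # FW direction e_s - x
        d_aw = x.copy();  d_aw[v] -= 1.0    # away direction x - e_v
        if g @ d_fw <= g @ d_aw:            # FW step promises more descent
            dir_, gmax, toward, away = d_fw, 1.0, s, None
        else:                               # away step, limited by weight of v
            gmax = alpha[v] / (1.0 - alpha[v]) if alpha[v] < 1.0 else np.inf
            dir_, toward, away = d_aw, None, v
        denom = 2.0 * (dir_ @ dir_)
        if denom == 0.0:
            break
        # exact line search for the quadratic, clipped to the feasible range
        gamma = min(max(-(g @ dir_) / denom, 0.0), gmax)
        x = x + gamma * dir_
        if toward is not None:              # FW step: shrink all, grow s
            for j in alpha:
                alpha[j] *= (1.0 - gamma)
            alpha[toward] = alpha.get(toward, 0.0) + gamma
        else:                               # away step: grow all, shrink v
            for j in alpha:
                alpha[j] *= (1.0 + gamma)
            alpha[away] -= gamma
        alpha = {j: a for j, a in alpha.items() if a > 1e-12}
    return x
```

The away direction lets the iterate move mass off a bad active vertex, which is what breaks the sublinear zig-zagging of vanilla FW and enables linear rates on polytopes.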
Cite

Bashiri and Zhang. "Decomposition-Invariant Conditional Gradient for General Polytopes with Line Search." Neural Information Processing Systems, 2017.

BibTeX
@inproceedings{bashiri2017neurips-decompositioninvariant,
title = {{Decomposition-Invariant Conditional Gradient for General Polytopes with Line Search}},
author = {Bashiri, Mohammad Ali and Zhang, Xinhua},
booktitle = {Neural Information Processing Systems},
year = {2017},
pages = {2690-2700},
url = {https://mlanthology.org/neurips/2017/bashiri2017neurips-decompositioninvariant/}
}