Linearization Algorithms for Fully Composite Optimization
Abstract
This paper studies first-order algorithms for solving fully composite optimization problems over convex and compact sets. We leverage the structure of the objective by handling its differentiable and non-differentiable components separately, linearizing only the smooth parts. This yields new generalizations of the classical Frank-Wolfe method and the Conditional Gradient Sliding algorithm that cater to a subclass of non-differentiable problems. Our algorithms rely on a stronger version of the linear minimization oracle, which can be efficiently implemented in several practical applications. We provide the basic version of our method with an affine-invariant analysis and prove global convergence rates for both convex and non-convex objectives. Furthermore, in the convex case, we propose an accelerated method with correspondingly improved complexity. Finally, we provide illustrative experiments to support our theoretical results.
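For context, a minimal sketch of the classical Frank-Wolfe method that the paper generalizes: at each step the smooth objective is linearized at the current iterate, a linear minimization oracle (LMO) is called over the feasible set, and the iterate moves toward the oracle's answer. The example below is illustrative only (it is not the paper's algorithm or its stronger oracle) and uses the probability simplex, where the LMO reduces to picking the vertex with the smallest gradient coordinate.

```python
import numpy as np

def lmo_simplex(grad):
    """LMO over the probability simplex: argmin_{s in simplex} <grad, s>
    is the vertex at the coordinate with the smallest gradient entry."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(grad_f, x0, steps=500):
    """Classical Frank-Wolfe: linearize f at x_t, call the LMO,
    and take a convex combination with the standard open-loop step."""
    x = x0.copy()
    for t in range(steps):
        s = lmo_simplex(grad_f(x))
        gamma = 2.0 / (t + 2)  # step size 2/(t+2), giving an O(1/t) gap
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy problem: minimize f(x) = ||x - b||^2 over the simplex,
# with b chosen inside the simplex so the minimizer is b itself.
b = np.array([0.1, 0.7, 0.2])
x_star = frank_wolfe(lambda x: 2.0 * (x - b), np.ones(3) / 3)
```

Each iterate stays feasible because it is a convex combination of feasible points, so no projection is ever needed; this is the projection-free property the paper's methods inherit.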
Cite
Text
Vladarean et al. "Linearization Algorithms for Fully Composite Optimization." Conference on Learning Theory, 2023.
Markdown
[Vladarean et al. "Linearization Algorithms for Fully Composite Optimization." Conference on Learning Theory, 2023.](https://mlanthology.org/colt/2023/vladarean2023colt-linearization/)
BibTeX
@inproceedings{vladarean2023colt-linearization,
title = {{Linearization Algorithms for Fully Composite Optimization}},
author = {Vladarean, Maria-Luiza and Doikov, Nikita and Jaggi, Martin and Flammarion, Nicolas},
booktitle = {Conference on Learning Theory},
year = {2023},
pages = {3669-3695},
volume = {195},
url = {https://mlanthology.org/colt/2023/vladarean2023colt-linearization/}
}