Composite Self-Concordant Minimization
Abstract
We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function, endowed with an easily computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new set of analytic step-size selection and correction procedures based on the structure of the problem. We describe concrete algorithmic instances of our framework for several interesting applications and demonstrate them numerically on both synthetic and real data.
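The abstract points to proximal-Newton-type instances of the framework with analytic step sizes. Below is a minimal sketch, not the authors' reference implementation, of one such instance under the assumption that the damped step is the classical analytic choice alpha_k = 1/(1 + lambda_k), with lambda_k the proximal-Newton decrement. The function name prox_newton_l1, the log-barrier choice of smooth part f(x) = -sum_i log(b_i - a_i^T x) (a standard self-concordant function), the l1 regularizer, and all parameter values are illustrative assumptions.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_l1(A, b, rho, x0, outer_iters=50, inner_iters=200, tol=1e-8):
    # Damped proximal Newton for min_x -sum(log(b - A x)) + rho*||x||_1.
    # No Lipschitz-gradient assumption or line search is used; the analytic
    # damped step relies only on f being standard self-concordant.
    x = x0.copy()
    for _ in range(outer_iters):
        r = b - A @ x                          # residuals; must stay positive
        grad = A.T @ (1.0 / r)                 # gradient of f at x
        H = A.T @ ((1.0 / r**2)[:, None] * A)  # Hessian of f at x
        L = np.linalg.eigvalsh(H)[-1]          # Lipschitz constant of the model

        # Inner loop: solve the prox-Newton subproblem
        #   min_u grad^T(u - x) + 0.5*(u - x)^T H (u - x) + rho*||u||_1
        # by ISTA (any subproblem solver would do here).
        u = x.copy()
        for _ in range(inner_iters):
            u = soft_threshold(u - (grad + H @ (u - x)) / L, rho / L)

        d = u - x
        lam = np.sqrt(d @ (H @ d))             # proximal-Newton decrement
        if lam < tol:
            break
        x = x + d / (1.0 + lam)                # analytic damped step
    return x

# Tiny synthetic instance: x0 = 0 is feasible because b > 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = np.abs(A @ rng.standard_normal(10)) + 1.0
x_hat = prox_newton_l1(A, b, rho=0.1, x0=np.zeros(10))
print("solution:", np.round(x_hat, 4))

Because f is standard self-concordant, the scaled step length satisfies ||alpha*d||_x = lambda/(1 + lambda) < 1, so the update stays inside the Dikin ellipsoid and hence inside the open domain {x : b - A x > 0}; this is why no backtracking line search or global Lipschitz constant appears above.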
Cite
Text
Tran-Dinh et al. "Composite Self-Concordant Minimization." Journal of Machine Learning Research, 2015.
Markdown
[Tran-Dinh et al. "Composite Self-Concordant Minimization." Journal of Machine Learning Research, 2015.](https://mlanthology.org/jmlr/2015/trandinh2015jmlr-composite/)
BibTeX
@article{trandinh2015jmlr-composite,
title = {{Composite Self-Concordant Minimization}},
author = {Tran-Dinh, Quoc and Kyrillidis, Anastasios and Cevher, Volkan},
journal = {Journal of Machine Learning Research},
year = {2015},
pages = {371--416},
volume = {16},
url = {https://mlanthology.org/jmlr/2015/trandinh2015jmlr-composite/}
}