Convex Optimization Based on Global Lower Second-Order Models
Abstract
In this work, we present new second-order algorithms for composite convex optimization, called Contracting-domain Newton methods. These algorithms are affine-invariant and based on a global second-order lower approximation of the smooth component of the objective. Our approach can be interpreted both as a second-order generalization of the conditional gradient method and as a variant of a trust-region scheme. Under the assumption that the problem domain is bounded, we prove an $O(1/k^2)$ global rate of convergence in the functional residual, where $k$ is the iteration counter, for minimizing convex functions with Lipschitz continuous Hessian. This significantly improves on the previously known bound of $O(1/k)$ for this type of algorithm. Additionally, we propose a stochastic extension of our method and present computational results for solving the empirical risk minimization problem.
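To make the phrase "global second-order lower approximation" concrete, the following is a minimal sketch of one standard bound of this kind, under the assumption that the smooth component is denoted $f$ and has an $L$-Lipschitz continuous Hessian (the notation $f$, $L$, $x$, $y$ is introduced here for illustration; the specific model constructed in the paper may differ):
$$
f(y) \;\ge\; f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{1}{2}\langle \nabla^2 f(x)(y - x),\, y - x \rangle - \tfrac{L}{6}\,\|y - x\|^3, \qquad \forall\, x, y.
$$
A second-order Taylor model corrected by a cubic term of this form holds globally, which is what allows such methods to certify progress over the whole (bounded) domain rather than only in a local neighborhood.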
Cite
Text
Doikov and Nesterov. "Convex Optimization Based on Global Lower Second-Order Models." Neural Information Processing Systems, 2020.
Markdown
[Doikov and Nesterov. "Convex Optimization Based on Global Lower Second-Order Models." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/doikov2020neurips-convex/)
BibTeX
@inproceedings{doikov2020neurips-convex,
title = {{Convex Optimization Based on Global Lower Second-Order Models}},
author = {Doikov, Nikita and Nesterov, Yurii},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/doikov2020neurips-convex/}
}