The Common-Directions Method for Regularized Empirical Risk Minimization
Abstract
State-of-the-art first- and second-order optimization methods can achieve either fast global linear convergence rates or local quadratic convergence, but not both. In this work, we propose an interpolation between first- and second-order methods for regularized empirical risk minimization that exploits the problem structure to efficiently combine multiple update directions. Our method attains both the optimal global linear convergence rate of first-order methods and local quadratic convergence. Experimental results show that our method outperforms state-of-the-art first- and second-order optimization methods in terms of the number of data accesses, while remaining competitive in training time.
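To make the abstract's idea of "combining multiple update directions" concrete, below is a minimal sketch (not the authors' exact algorithm) of one such step for L2-regularized logistic regression. Each iteration minimizes the second-order model of the objective over the span of a small direction matrix P; maintaining X @ P lets the restricted Hessian be formed in a single pass over the data, which is the kind of structure exploitation the abstract refers to. The function names, the two-direction choice (current gradient plus previous step), and the simple backtracking line search are illustrative assumptions; the paper's method combines a larger set of directions per iteration.

```python
import numpy as np

def sigmoid(z):
    # Numerically stable logistic function.
    return 0.5 * (1.0 + np.tanh(0.5 * z))

def f_value(w, X, y, C):
    # f(w) = 0.5 ||w||^2 + C * sum_i log(1 + exp(-y_i x_i^T w)), y_i in {-1, +1}.
    z = y * (X @ w)
    return 0.5 * (w @ w) + C * np.sum(np.logaddexp(0.0, -z))

def common_directions_step(w, prev_step, X, y, C):
    """One illustrative iteration: minimize the second-order model of f over
    the span of a small set of directions (here only the current gradient and
    the previous step; the paper's method uses more directions)."""
    z = y * (X @ w)
    g = w - C * (X.T @ (y * sigmoid(-z)))            # gradient of f
    P = g[:, None] if prev_step is None else np.column_stack([g, prev_step])
    XP = X @ P                                       # one pass over the data
    # Restricted Hessian: P^T H P = P^T P + C (XP)^T D (XP),
    # where H = I + C X^T D X and D_ii = sigmoid(z_i) * sigmoid(-z_i).
    D = sigmoid(z) * sigmoid(-z)
    H_sub = P.T @ P + C * (XP.T @ (D[:, None] * XP))
    g_sub = P.T @ g
    # Small k-by-k Newton system for the direction-combination weights.
    t = np.linalg.solve(H_sub + 1e-12 * np.eye(P.shape[1]), -g_sub)
    step = P @ t
    # Backtracking line search for sufficient decrease (a common safeguard).
    alpha, f0, slope = 1.0, f_value(w, X, y, C), g @ step
    while f_value(w + alpha * step, X, y, C) > f0 + 1e-4 * alpha * slope:
        alpha *= 0.5
    return w + alpha * step, alpha * step

# Toy usage on random data with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.sign(X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(200))
w, step = np.zeros(10), None
for _ in range(30):
    w, step = common_directions_step(w, step, X, y, C=1.0)
print("final objective:", f_value(w, X, y, C=1.0))
```

Note that the Newton system solved here is only k-by-k for k combined directions, so mixing directions adds negligible cost beyond the data pass that forms X @ P; keeping more past directions in P is what lets the method approach second-order behavior.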
Cite

Wang, Po-Wei, Ching-pei Lee, and Chih-Jen Lin. "The Common-Directions Method for Regularized Empirical Risk Minimization." Journal of Machine Learning Research, 20:1-49, 2019. https://mlanthology.org/jmlr/2019/wang2019jmlr-commondirections/

BibTeX:
@article{wang2019jmlr-commondirections,
  title   = {{The Common-Directions Method for Regularized Empirical Risk Minimization}},
  author  = {Wang, Po-Wei and Lee, Ching-pei and Lin, Chih-Jen},
  journal = {Journal of Machine Learning Research},
  year    = {2019},
  pages   = {1-49},
  volume  = {20},
  url     = {https://mlanthology.org/jmlr/2019/wang2019jmlr-commondirections/}
}