A Simple Analysis for Exp-Concave Empirical Minimization with Arbitrary Convex Regularizer
Abstract
In this paper, we present a simple analysis of {\bf fast rates} with {\it high probability} for {\bf empirical minimization} in {\it stochastic composite optimization} over a finite-dimensional bounded convex set with exponentially concave (exp-concave) loss functions and an arbitrary convex regularizer. To the best of our knowledge, this result is the first of its kind. As a byproduct, we directly obtain fast rates with {\it high probability} for exp-concave empirical risk minimization with and without a convex regularizer, which not only extends existing results on empirical risk minimization but also provides a unified framework for analyzing exp-concave empirical risk minimization with {\it any} convex regularizer or none at all. Our proof is very simple, exploiting only the covering number of a finite-dimensional bounded set and a concentration inequality for random vectors.
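For context, here is a minimal sketch of the standard setup the abstract refers to (standard textbook definitions; the symbols $\beta$, $\ell$, $r$, $\mathcal{W}$, and $\widehat{w}$ are our notation for illustration, not necessarily the paper's):

% A loss \ell(\cdot, z) is \beta-exp-concave on \mathcal{W} if exp(-\beta \ell(w, z)) is concave in w.
% \widehat{w} denotes the regularized empirical minimizer over n i.i.d. samples z_1, ..., z_n.
\begin{align*}
  &\text{exp-concavity:} && w \mapsto \exp\big(-\beta\, \ell(w, z)\big) \ \text{is concave on } \mathcal{W}, \\
  &\text{empirical minimization:} && \widehat{w} \in \arg\min_{w \in \mathcal{W}} \ \frac{1}{n} \sum_{i=1}^{n} \ell(w, z_i) + r(w),
\end{align*}

where $r$ is an arbitrary convex regularizer and $\mathcal{W}$ is a bounded convex subset of $\mathbb{R}^d$. In this literature, a "fast rate" typically refers to an excess-risk bound of order $O(d/n)$ (up to logarithmic factors) rather than the slow rate $O(1/\sqrt{n})$.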
Cite
Text
Yang et al. "A Simple Analysis for Exp-Concave Empirical Minimization with Arbitrary Convex Regularizer." International Conference on Artificial Intelligence and Statistics, 2018.Markdown
[Yang et al. "A Simple Analysis for Exp-Concave Empirical Minimization with Arbitrary Convex Regularizer." International Conference on Artificial Intelligence and Statistics, 2018.](https://mlanthology.org/aistats/2018/yang2018aistats-simple/)BibTeX
@inproceedings{yang2018aistats-simple,
title = {{A Simple Analysis for Exp-Concave Empirical Minimization with Arbitrary Convex Regularizer}},
author = {Yang, Tianbao and Li, Zhe and Zhang, Lijun},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2018},
pages = {445--453},
url = {https://mlanthology.org/aistats/2018/yang2018aistats-simple/}
}