A Stochastic Prox-Linear Method for CVaR Minimization

Abstract

We develop an instance of the stochastic prox-linear method for minimizing the Conditional Value-at-Risk (CVaR) objective. CVaR is a risk measure that focuses on worst-case performance, defined as the average of the top quantile of the losses. In machine learning, such a risk measure is useful for training more robust models. Although the stochastic subgradient method (SGM) is a natural choice for minimizing the CVaR objective, we show that the prox-linear algorithm can better exploit the structure of the objective while still providing a convenient closed-form update. We then specialize a general convergence theorem for the prox-linear method to our setting, and show that it allows for a wider selection of step sizes compared to SGM. We support this theoretical finding experimentally by showing that the performance of the stochastic prox-linear method is more robust to the choice of step size compared to SGM.
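As a point of reference for the baseline mentioned in the abstract, the sketch below applies the stochastic subgradient method (SGM) to the standard Rockafellar-Uryasev formulation of CVaR, CVaR_alpha(l) = min_eta { eta + (1/alpha) E[(l - eta)_+] }. The function name cvar_sgm, the squared-loss linear model, and all hyperparameters are illustrative assumptions; this is not the closed-form prox-linear update developed in the paper.

import numpy as np

# Minimal sketch (assumed setup): minimize CVaR_alpha of squared losses of a
# linear model with the plain stochastic subgradient method (SGM).
def cvar_sgm(X, y, alpha=0.1, lr=0.01, n_iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)   # model parameters
    eta = 0.0         # auxiliary variable (Value-at-Risk estimate)
    for _ in range(n_iters):
        i = rng.integers(n)
        residual = X[i] @ w - y[i]
        loss = 0.5 * residual ** 2
        # Rockafellar-Uryasev objective: eta + (1/alpha) * E[(loss - eta)_+]
        if loss > eta:
            # subgradient when the hinge term is active
            g_w = (1.0 / alpha) * residual * X[i]
            g_eta = 1.0 - 1.0 / alpha
        else:
            g_w = np.zeros(d)
            g_eta = 1.0
        w -= lr * g_w
        eta -= lr * g_eta
    return w, eta

The auxiliary variable eta converges to the Value-at-Risk at level alpha, so only samples whose loss exceeds eta (the top quantile) contribute to the parameter update, which is what makes SGM a natural, if step-size-sensitive, baseline for this objective.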

Cite

Text

Meng et al. "A Stochastic Prox-Linear Method for CVaR Minimization." NeurIPS 2022 Workshops: OPT, 2022.

Markdown

[Meng et al. "A Stochastic Prox-Linear Method for CVaR Minimization." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/meng2022neuripsw-stochastic/)

BibTeX

@inproceedings{meng2022neuripsw-stochastic,
  title     = {{A Stochastic Prox-Linear Method for CVaR Minimization}},
  author    = {Meng, Si Yi and Charisopoulos, Vasileios and Gower, Robert M.},
  booktitle = {NeurIPS 2022 Workshops: OPT},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/meng2022neuripsw-stochastic/}
}