Distributionally Robust Optimization via Ball Oracle Acceleration
Abstract
We develop and analyze algorithms for distributionally robust optimization (DRO) of convex losses. In particular, we consider group-structured and bounded $f$-divergence uncertainty sets. Our approach relies on an accelerated method that queries a ball optimization oracle, i.e., a subroutine that minimizes the objective within a small ball around the query point. Our main contribution is efficient implementations of this oracle for DRO objectives. For DRO with $N$ non-smooth loss functions, the resulting algorithms find an $\epsilon$-accurate solution with $\widetilde{O}\left(N\epsilon^{-2/3} + \epsilon^{-2}\right)$ first-order oracle queries to individual loss functions. Compared to existing algorithms for this problem, we improve complexity by a factor of up to $\epsilon^{-4/3}$.
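To make the notion of a ball optimization oracle concrete, below is a minimal illustrative sketch for a group-structured DRO objective (worst-group average loss): it approximately minimizes the objective over a small Euclidean ball of radius r around the query point via projected subgradient descent. The function names, the subgradient scheme, and the step-size rule are illustrative assumptions and are not the paper's oracle implementation or complexity-optimal procedure.

```python
import numpy as np

def group_dro_objective(x, losses, groups):
    """Worst-group average loss: max_g mean_{i in g} loss_i(x)."""
    return max(np.mean([losses[i](x) for i in g]) for g in groups)

def ball_oracle(x_bar, r, losses, grads, groups, steps=200):
    """Approximately minimize the group-DRO objective over the ball
    ||x - x_bar|| <= r using projected subgradient descent.
    Illustrative sketch only (not the paper's oracle implementation)."""
    x = x_bar.copy()
    best_x, best_val = x.copy(), group_dro_objective(x, losses, groups)
    for t in range(steps):
        # Subgradient of the max: gradient of the currently worst group's average loss.
        vals = [np.mean([losses[i](x) for i in g]) for g in groups]
        worst_group = groups[int(np.argmax(vals))]
        sub = np.mean([grads[i](x) for i in worst_group], axis=0)
        # Simple diminishing step size scaled to the ball radius (a heuristic choice).
        eta = r / ((np.linalg.norm(sub) + 1e-12) * np.sqrt(t + 1))
        x = x - eta * sub
        # Project back onto the ball of radius r around the query point x_bar.
        d = x - x_bar
        dist = np.linalg.norm(d)
        if dist > r:
            x = x_bar + r * d / dist
        val = group_dro_objective(x, losses, groups)
        if val < best_val:
            best_x, best_val = x.copy(), val
    return best_x

# Example usage with linear losses loss_i(x) = <a_i, x> (purely synthetic data).
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 4))
losses = [lambda x, a=a: float(a @ x) for a in A]
grads = [lambda x, a=a: a for a in A]
groups = [[0, 1, 2], [3, 4, 5]]
x_query = np.zeros(4)
x_ball = ball_oracle(x_query, r=0.1, losses=losses, grads=grads, groups=groups)
```

In the accelerated outer method, such an oracle would be queried repeatedly at successive points; the paper's contribution is implementing the oracle for DRO objectives efficiently enough that the overall first-order complexity bound above holds.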
Cite
Text
Carmon and Hausler. "Distributionally Robust Optimization via Ball Oracle Acceleration." Neural Information Processing Systems, 2022.
Markdown
[Carmon and Hausler. "Distributionally Robust Optimization via Ball Oracle Acceleration." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/carmon2022neurips-distributionally/)
BibTeX
@inproceedings{carmon2022neurips-distributionally,
  title = {{Distributionally Robust Optimization via Ball Oracle Acceleration}},
  author = {Carmon, Yair and Hausler, Danielle},
  booktitle = {Neural Information Processing Systems},
  year = {2022},
  url = {https://mlanthology.org/neurips/2022/carmon2022neurips-distributionally/}
}