Stochastic Constrained DRO with a Complexity Independent of Sample Size
Abstract
Distributionally Robust Optimization (DRO), a popular method for training models that are robust to distribution shift between the training and test sets, has received tremendous attention in recent years. In this paper, we propose and analyze stochastic algorithms that apply to both non-convex and convex losses for solving the Kullback–Leibler (KL) divergence constrained DRO problem. Compared with existing methods for this problem, our stochastic algorithms not only enjoy a competitive, if not better, complexity that is independent of the sample size but also require only a constant batch size at every iteration, which is more practical for broad applications. We establish a nearly optimal complexity bound for finding an $\epsilon$-stationary solution for non-convex losses and an optimal complexity for finding an $\epsilon$-optimal solution for convex losses. Empirical studies demonstrate the effectiveness of the proposed algorithms on non-convex and convex constrained DRO problems.
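To make the problem setting concrete: the standard KL-constrained DRO objective is $\min_{\mathbf{w}} \max_{\mathbf{p}:\, \mathrm{KL}(\mathbf{p}\,\|\,\mathbf{1}/n) \le \rho} \sum_{i=1}^n p_i \ell(\mathbf{w}; z_i)$, whose inner maximization admits the well-known dual $\min_{\lambda > 0}\, \lambda\rho + \lambda \log\big(\tfrac{1}{n}\sum_{i=1}^n e^{\ell_i/\lambda}\big)$. The sketch below (a hypothetical helper, not the authors' algorithm from the paper) evaluates this robust loss for a fixed set of per-sample losses by approximately minimizing the convex dual over a log-spaced grid of $\lambda$ values:

```python
import math

def kl_dro_loss(losses, rho, lam_grid=None):
    """Approximate the KL-constrained robust loss
        sup_{p : KL(p || uniform) <= rho} sum_i p_i * losses[i]
    via its dual:
        min_{lam > 0} lam * rho + lam * log(mean(exp(losses / lam))).
    The dual is convex in lam; here it is minimized on a coarse
    log-spaced grid for illustration (an approximation, not a solver).
    """
    if lam_grid is None:
        # log-spaced grid from 1e-3 to 1e6
        lam_grid = [10 ** (k / 10.0) for k in range(-30, 61)]
    n = len(losses)

    def dual(lam):
        # lam*rho + lam*log-mean-exp(losses/lam), computed stably
        # by shifting out the max exponent m before exponentiating.
        m = max(losses) / lam
        s = sum(math.exp(l / lam - m) for l in losses)
        return lam * rho + lam * (m + math.log(s / n))

    return min(dual(lam) for lam in lam_grid)

# Example: the robust loss lies between the mean loss (rho = 0)
# and the worst-case loss (rho -> infinity).
robust = kl_dro_loss([1.0, 2.0, 3.0], rho=0.1)
```

As `rho` grows, the feasible set of reweightings `p` expands, so the robust loss increases monotonically from the empirical mean toward the maximum per-sample loss; the stochastic algorithms analyzed in the paper tackle the much harder problem of minimizing this objective over model parameters with constant-size mini-batches.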
Cite
Text
Qi et al. "Stochastic Constrained DRO with a Complexity Independent of Sample Size." Transactions on Machine Learning Research, 2023.

Markdown
[Qi et al. "Stochastic Constrained DRO with a Complexity Independent of Sample Size." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/qi2023tmlr-stochastic/)

BibTeX
@article{qi2023tmlr-stochastic,
title = {{Stochastic Constrained DRO with a Complexity Independent of Sample Size}},
author = {Qi, Qi and Lyu, Jiameng and Chan, Kung-Sik and Bai, Er-Wei and Yang, Tianbao},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/qi2023tmlr-stochastic/}
}