Non-Asymptotic and Non-Lipschitzian Bounds on Optimal Values in Stochastic Optimization Under Heavy Tails
Abstract
This paper focuses on non-asymptotic confidence bounds (CBs) for the optimal values of stochastic optimization (SO) problems. Existing approaches often rely on two conditions that may be restrictive: the availability of a global Lipschitz constant and the assumption of light-tailed distributions. Beyond either of these conditions, it remains largely unknown whether computable CBs can be constructed. In view of this gap in the literature, we provide three key findings: (i) based on the conventional formulation of sample average approximation (SAA), we derive non-Lipschitzian CBs for convex SO problems under heavy tails; (ii) we explore diametrical risk minimization (DRM), a recently introduced modification to SAA, and obtain non-Lipschitzian CBs for nonconvex SO problems in light-tailed settings; and (iii) we extend our analysis of DRM to handle heavy-tailed randomness by utilizing properties of formulations for training over-parameterized classification models.
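To make the SAA setup concrete, the following is a minimal Python sketch of the classical replication-based lower confidence bound on an SO problem's optimal value (in the spirit of the standard argument that the expected SAA optimal value lower-bounds the true optimum). This illustrates the baseline construction the paper improves upon, not the paper's own non-asymptotic, heavy-tailed bounds; the toy loss f, the Student-t noise, and the one-sided normal approximation are all illustrative assumptions.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def f(x, xi):
    # Toy convex loss; xi ~ Student-t(3) is a simple heavy-tailed example.
    return (x - xi) ** 2

def saa_optimal_value(sample):
    # Solve the sample average approximation: min_x (1/n) sum_i f(x, xi_i).
    res = minimize_scalar(lambda x: np.mean(f(x, sample)))
    return res.fun

# M independent replications of an n-sample SAA problem.
M, n = 30, 200
vals = np.array([saa_optimal_value(rng.standard_t(df=3, size=n))
                 for _ in range(M)])

# Since E[SAA optimal value] <= true optimal value, a one-sided
# normal-approximation bound on the mean of `vals` gives a statistical
# lower bound on the true optimum.
z = 1.645  # approx. 95% one-sided normal quantile
lcb = vals.mean() - z * vals.std(ddof=1) / np.sqrt(M)
print(f"approx. 95% lower confidence bound on optimal value: {lcb:.3f}")

Note that the normal approximation here is only asymptotically valid and can be unreliable under heavy tails, which is precisely the gap the paper's non-asymptotic analysis targets.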
Cite
Text
Tong et al. "Non-Asymptotic and Non-Lipschitzian Bounds on Optimal Values in Stochastic Optimization Under Heavy Tails." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown
[Tong et al. "Non-Asymptotic and Non-Lipschitzian Bounds on Optimal Values in Stochastic Optimization Under Heavy Tails." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/tong2025icml-nonasymptotic/)

BibTeX
@inproceedings{tong2025icml-nonasymptotic,
title = {{Non-Asymptotic and Non-Lipschitzian Bounds on Optimal Values in Stochastic Optimization Under Heavy Tails}},
author = {Tong, Jindong and Liu, Hongcheng and Royset, Johannes O.},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {59812--59828},
volume = {267},
url = {https://mlanthology.org/icml/2025/tong2025icml-nonasymptotic/}
}