New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming
Abstract
This paper studies sample average approximation (SAA) and a simple regularized variant of it for solving convex and strongly convex stochastic programming problems. Under heavy-tailed assumptions and regularity conditions comparable to those in the typical SAA literature, we show, perhaps for the first time, that the sample complexity can be completely free of any complexity measure (e.g., the logarithm of the covering number) of the feasible region. As a result, our new bounds can be more advantageous than the state-of-the-art in terms of their dependence on the problem dimensionality.
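For readers unfamiliar with the setup, the following toy sketch illustrates what SAA and its regularized variant mean in one dimension; the objective, the Pareto sampling, and all names here are illustrative choices of this page, not taken from the paper. SAA replaces the expectation E[(x - ξ)²] by its empirical average over N samples, whose minimizer has a closed form, and the regularized variant adds a quadratic penalty:

```python
import random

def saa_solution(samples, lam=0.0):
    """Minimize the SAA objective (1/N) * sum((x - xi)**2) + lam * x**2.

    For this quadratic toy problem the minimizer is available in closed
    form: x* = mean(samples) / (1 + lam); lam = 0 recovers plain SAA.
    """
    n = len(samples)
    return sum(samples) / n / (1.0 + lam)

# Draw N samples of a heavy-tailed random variable (Pareto with tail
# index 3, so the variance is finite but higher moments are not).
random.seed(0)
xi = [random.paretovariate(3.0) for _ in range(10_000)]

x_saa = saa_solution(xi)           # plain SAA minimizer: the sample mean
x_reg = saa_solution(xi, lam=0.1)  # regularized variant, shrunk toward 0
```

Here the plain SAA solution is just the sample mean, and the regularizer shrinks it toward zero; the paper's contribution concerns how many samples N are needed for such empirical minimizers to be near-optimal, not the toy computation itself.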
Cite
Text
Liu and Tong. "New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming." International Conference on Machine Learning, 2024.
Markdown
[Liu and Tong. "New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/liu2024icml-new/)
BibTeX
@inproceedings{liu2024icml-new,
title = {{New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming}},
author = {Liu, Hongcheng and Tong, Jindong},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {31933--31948},
volume = {235},
url = {https://mlanthology.org/icml/2024/liu2024icml-new/}
}