Reproducibility in Optimization: Theoretical Framework and Limits
Abstract
We initiate a formal study of reproducibility in optimization. We define a quantitative measure of the reproducibility of optimization procedures in the face of noisy or error-prone operations, such as inexact or stochastic gradient computations or inexact initialization. We then analyze several convex optimization settings of interest, including smooth, non-smooth, and strongly convex objective functions, and establish tight bounds on the limits of reproducibility in each setting. Our analysis reveals a fundamental trade-off between computation and reproducibility: more computation is necessary (and sufficient) for better reproducibility.
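The measure studied in the paper compares the outputs of two independent executions of the same procedure under independent noise. As a rough illustration only (a minimal sketch, not the paper's formal definition; the quadratic objective, step size, noise level, and function names below are all assumptions chosen for demonstration), one can run gradient descent twice with independently perturbed gradients and record how far apart the two outputs land:

```python
import numpy as np

def gd_with_inexact_gradients(grad, x0, steps, lr, noise_std, rng):
    """Gradient descent where each gradient query is perturbed by
    independent Gaussian noise (an 'inexact gradient oracle' setting)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + noise_std * rng.standard_normal(x.shape)
        x = x - lr * g
    return x

# Smooth convex objective: f(x) = 0.5 * ||x||^2, so grad f(x) = x.
grad = lambda x: x

rng = np.random.default_rng(0)
x0 = np.ones(10)

# Two independent runs of the same procedure on the same objective,
# same initialization; only the gradient noise differs between runs.
x_a = gd_with_inexact_gradients(grad, x0, steps=200, lr=0.1, noise_std=0.5, rng=rng)
x_b = gd_with_inexact_gradients(grad, x0, steps=200, lr=0.1, noise_std=0.5, rng=rng)

# Squared deviation between the two outputs: an empirical proxy for
# (ir)reproducibility under this noise model.
print("squared deviation:", np.sum((x_a - x_b) ** 2))
```

Averaging this squared deviation over many pairs of runs gives an empirical estimate of the kind of quantity whose fundamental limits the paper bounds in each convex setting.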
Cite
Text
Ahn et al. "Reproducibility in Optimization: Theoretical Framework and Limits." Neural Information Processing Systems, 2022.
Markdown
[Ahn et al. "Reproducibility in Optimization: Theoretical Framework and Limits." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/ahn2022neurips-reproducibility/)
BibTeX
@inproceedings{ahn2022neurips-reproducibility,
title = {{Reproducibility in Optimization: Theoretical Framework and Limits}},
author = {Ahn, Kwangjun and Jain, Prateek and Ji, Ziwei and Kale, Satyen and Netrapalli, Praneeth and Shamir, Gil I.},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/ahn2022neurips-reproducibility/}
}