Information Complexity of Stochastic Convex Optimization: Applications to Generalization, Memorization, and Tracing
Abstract
In this work, we investigate the interplay between memorization and learning in the context of stochastic convex optimization (SCO). We define memorization via the information a learning algorithm reveals about its training data points, and we quantify this information using the conditional mutual information (CMI) framework proposed by Steinke and Zakynthinou (2020). Our main result is a precise characterization of the tradeoff between the accuracy of a learning algorithm and its CMI, answering an open question posed by Livni (2023). We show that, in the $L^2$ Lipschitz-bounded setting and under strong convexity, every learner with excess error $\epsilon$ has CMI bounded below by $\Omega(1/\epsilon^2)$ and $\Omega(1/\epsilon)$, respectively. We further demonstrate the essential role of memorization in SCO by designing an adversary capable of accurately identifying (i.e., tracing) a significant fraction of the training samples in specific SCO problems. Finally, we enumerate several implications of our results, such as a limitation of generalization bounds based on CMI and the incompressibility of samples in SCO problems.
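For context, the display below recalls the CMI quantity the abstract refers to, in a standard formulation of Steinke and Zakynthinou's (2020) definition; the notation ($\tilde{Z}$, $U$, $\mathcal{D}$) is ours for illustration and may differ from the paper's, and the second display simply restates the tradeoff claimed above.

```latex
% Conditional mutual information (CMI) of an algorithm A, after Steinke and
% Zakynthinou (2020). \tilde{Z} \in \mathcal{Z}^{n \times 2} holds 2n i.i.d.
% draws from the data distribution \mathcal{D}, arranged in n rows of two;
% U \sim \mathrm{Unif}(\{0,1\}^n) selects one entry per row as the training set.
\mathrm{CMI}_{\mathcal{D}}(A)
  = I\bigl(A(\tilde{Z}_U);\, U \mid \tilde{Z}\bigr),
\qquad
\tilde{Z}_U := \bigl(\tilde{Z}_{1,U_1}, \dots, \tilde{Z}_{n,U_n}\bigr).

% The tradeoff stated in the abstract: any learner with excess error \epsilon
% satisfies
\mathrm{CMI}_{\mathcal{D}}(A) =
\begin{cases}
  \Omega(1/\epsilon^2) & \text{($L^2$ Lipschitz-bounded setting)},\\[2pt]
  \Omega(1/\epsilon)   & \text{(strongly convex setting)}.
\end{cases}
```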
Cite

Text

Attias et al. "Information Complexity of Stochastic Convex Optimization: Applications to Generalization, Memorization, and Tracing." International Conference on Machine Learning, 2024.

Markdown

[Attias et al. "Information Complexity of Stochastic Convex Optimization: Applications to Generalization, Memorization, and Tracing." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/attias2024icml-information/)

BibTeX
@inproceedings{attias2024icml-information,
  title     = {{Information Complexity of Stochastic Convex Optimization: Applications to Generalization, Memorization, and Tracing}},
  author    = {Attias, Idan and Dziugaite, Gintare Karolina and Haghifam, Mahdi and Livni, Roi and Roy, Daniel M.},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {2035--2068},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/attias2024icml-information/}
}