The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets
Abstract
We study the parameter complexity of robust memorization for ReLU networks: the number of parameters required to interpolate any dataset with $\epsilon$-separation between differently labeled points, while ensuring predictions remain consistent within a $\mu$-ball around each training example. We establish upper and lower bounds on the parameter count as a function of the robustness ratio $\rho = \mu / \epsilon$. Unlike prior work, we provide a fine-grained analysis across the entire range $\rho \in (0,1)$ and obtain tighter upper and lower bounds that improve upon existing results. Our findings reveal that the parameter complexity of robust memorization matches that of non-robust memorization when $\rho$ is small, but grows with increasing $\rho$. As a special case, when the input dimension is comparable to or exceeds the dataset size, our bounds become tight (up to logarithmic factors) across the entire range of $\rho$.
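For concreteness, the setting described above admits the following minimal formalization (a sketch of the standard definitions; the paper's exact norm and separation conventions may differ). A dataset $\{(x_i, y_i)\}_{i=1}^N$ is $\epsilon$-separated if differently labeled points are at least $\epsilon$ apart, and a network $f$ is a $\mu$-robust memorizer if it predicts the correct label on the entire $\mu$-ball around each training point:
$$
\|x_i - x_j\| \ge \epsilon \ \text{ whenever } y_i \ne y_j, \qquad f(x) = y_i \ \text{ for all } x \text{ with } \|x - x_i\| \le \mu, \qquad \rho = \frac{\mu}{\epsilon} \in (0,1).
$$
The bounds in the paper are stated in terms of the robustness ratio $\rho$, which quantifies how much of the separation margin the robustness requirement consumes.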
Cite
Text
Kim et al. "The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets." Advances in Neural Information Processing Systems, 2025.
Markdown
[Kim et al. "The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/kim2025neurips-cost/)
BibTeX
@inproceedings{kim2025neurips-cost,
title = {{The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets}},
author = {Kim, Yujun and Moon, Chaewon and Yun, Chulhee},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/kim2025neurips-cost/}
}