Curse of Dimensionality on Randomized Smoothing for Certifiable Robustness

Abstract

Randomized smoothing, using just a simple isotropic Gaussian distribution, has been shown to produce good robustness guarantees against $\ell_2$-norm bounded adversaries. In this work, we show that extending the smoothing technique to defend against other attack models can be challenging, especially in the high-dimensional regime. In particular, for a vast class of i.i.d. smoothing distributions, we prove that the largest $\ell_p$-radius that can be certified decreases as $O(1/d^{\frac{1}{2} - \frac{1}{p}})$ with dimension $d$ for $p > 2$. Notably, for $p \geq 2$, this dependence on $d$ is no better than that of the $\ell_p$-radius that can be certified using isotropic Gaussian smoothing, essentially putting a matching lower bound on the robustness radius. When restricted to \emph{generalized} Gaussian smoothing, these two bounds can be shown to be within a constant factor of each other in an asymptotic sense, establishing that Gaussian smoothing provides the best possible results, up to a constant factor, when $p \geq 2$. We present experimental results on CIFAR to validate our theory. For other smoothing distributions, such as a uniform distribution within an $\ell_1$ or an $\ell_\infty$-norm ball, we show upper bounds of the form $O(1 / d)$ and $O(1 / d^{1 - \frac{1}{p}})$ respectively, which have an even worse dependence on $d$.
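
For intuition on where the dimension dependence comes from, the sketch below combines the standard Gaussian-smoothing $\ell_2$ certificate, $R = \sigma \, \Phi^{-1}(p_A)$ (Cohen et al., 2019), with the norm inequality $\|\delta\|_2 \leq d^{\frac{1}{2} - \frac{1}{p}} \|\delta\|_p$ for $p \geq 2$, which turns an $\ell_2$ certificate of radius $R$ into an $\ell_p$ certificate of radius $R / d^{\frac{1}{2} - \frac{1}{p}}$. This is an illustrative sketch only; the function names and example numbers are assumptions and are not taken from the paper's code or experiments.

```python
from scipy.stats import norm

def gaussian_l2_radius(p_lower: float, sigma: float) -> float:
    """Certified l2 radius of Gaussian smoothing with noise level sigma,
    given a lower bound p_lower on the smoothed classifier's top-class
    probability: R = sigma * Phi^{-1}(p_lower) (Cohen et al., 2019)."""
    return sigma * norm.ppf(p_lower)

def implied_lp_radius(l2_radius: float, d: int, p: float) -> float:
    """For p >= 2, ||delta||_2 <= d^(1/2 - 1/p) * ||delta||_p, so an l2
    certificate of radius R implies an lp certificate of radius
    R / d^(1/2 - 1/p) -- the dimension dependence discussed in the abstract."""
    return l2_radius / d ** (0.5 - 1.0 / p)

# Illustrative numbers (not from the paper): CIFAR-10 input dimension,
# sigma = 0.5, and a top-class probability lower bound of 0.9.
d = 3 * 32 * 32
r2 = gaussian_l2_radius(p_lower=0.9, sigma=0.5)
print(f"certified l2 radius:  {r2:.4f}")
print(f"implied l_inf radius: {implied_lp_radius(r2, d, p=float('inf')):.5f}")
```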

Cite

Text

Kumar et al. "Curse of Dimensionality on Randomized Smoothing for Certifiable Robustness." International Conference on Machine Learning, 2020.

Markdown

[Kumar et al. "Curse of Dimensionality on Randomized Smoothing for Certifiable Robustness." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/kumar2020icml-curse/)

BibTeX

@inproceedings{kumar2020icml-curse,
  title     = {{Curse of Dimensionality on Randomized Smoothing for Certifiable Robustness}},
  author    = {Kumar, Aounon and Levine, Alexander and Goldstein, Tom and Feizi, Soheil},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {5458--5467},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/kumar2020icml-curse/}
}