Universal Approximation with Certified Networks

Abstract

Training neural networks to be certifiably robust is critical to ensure their safety against adversarial attacks. However, it is currently very difficult to train a neural network that is both accurate and certifiably robust. In this work we take a step towards addressing this challenge. We prove that for every continuous function $f$, there exists a network $n$ such that: (i) $n$ approximates $f$ arbitrarily closely, and (ii) simple interval bound propagation of a region $B$ through $n$ yields a result that is arbitrarily close to the optimal output of $f$ on $B$. Our result can be seen as a Universal Approximation Theorem for interval-certified ReLU networks. To the best of our knowledge, this is the first work to prove the existence of accurate, interval-certified networks.
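To make the certification procedure in point (ii) concrete, the following is a minimal sketch of interval bound propagation (IBP) through a ReLU network, written with NumPy. It is illustrative only and is not the paper's construction: an input box $[l, u]$ is pushed through each affine layer via its center and radius, and through each ReLU by clipping the bounds, yielding sound (if loose) output bounds. All function and variable names here are our own.

```python
import numpy as np

def ibp_affine(l, u, W, b):
    """Propagate the box [l, u] through the affine map x -> W x + b.

    Uses the standard center/radius form: the center maps through the
    layer exactly, and the radius is scaled by |W| elementwise.
    """
    c, r = (l + u) / 2.0, (u - l) / 2.0
    c_out = W @ c + b
    r_out = np.abs(W) @ r
    return c_out - r_out, c_out + r_out

def ibp_relu(l, u):
    """ReLU is monotone, so the box maps to [max(l, 0), max(u, 0)]."""
    return np.maximum(l, 0.0), np.maximum(u, 0.0)

def ibp_forward(l, u, layers):
    """Propagate [l, u] through a ReLU network given as (W, b) pairs.

    A ReLU follows every layer except the last (the output layer).
    Returns sound lower/upper bounds on the network's output over the box.
    """
    for i, (W, b) in enumerate(layers):
        l, u = ibp_affine(l, u, W, b)
        if i < len(layers) - 1:
            l, u = ibp_relu(l, u)
    return l, u
```

Soundness is easy to check empirically: every network output on a point sampled from the input box must lie inside the propagated bounds. The paper's result says these bounds need not be loose, i.e. some network approximating $f$ makes them arbitrarily tight.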

Cite

Text

Baader et al. "Universal Approximation with Certified Networks." International Conference on Learning Representations, 2020.

Markdown

[Baader et al. "Universal Approximation with Certified Networks." International Conference on Learning Representations, 2020.](https://mlanthology.org/iclr/2020/baader2020iclr-universal/)

BibTeX

@inproceedings{baader2020iclr-universal,
  title     = {{Universal Approximation with Certified Networks}},
  author    = {Baader, Maximilian and Mirman, Matthew and Vechev, Martin},
  booktitle = {International Conference on Learning Representations},
  year      = {2020},
  url       = {https://mlanthology.org/iclr/2020/baader2020iclr-universal/}
}