Boosting Verified Training for Robust Image Classifications via Abstraction

Abstract

This paper proposes a novel abstraction-based certified training method for robust image classifiers. Via abstraction, all perturbed images are mapped into intervals before being fed into neural networks for training. Because all perturbed images that map to the same interval are classified with the same label, training on intervals reduces the variance of the training set and smooths the loss landscape of the models. Consequently, our approach significantly improves the robustness of trained models. Thanks to the abstraction, our training method also enables a sound and complete black-box verification approach that is orthogonal and scalable to arbitrary neural networks, regardless of their size and architecture. We evaluate our method on a wide range of benchmarks of different scales. The experimental results show that our method outperforms the state of the art by (i) reducing the verified errors of trained models by up to 95.64%; (ii) achieving up to a 602.50x overall speedup; and (iii) scaling to larger models with up to 138 million trainable parameters. The demo is available at https://github.com/zhangzhaodi233/ABSCERT.git.
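The core idea of mapping perturbed images into intervals can be illustrated with a minimal sketch. This is an assumption-laden illustration of the abstraction step only, not the paper's actual implementation: each pixel's epsilon-ball is snapped outward to enclosing grid intervals of a fixed width, so every perturbation landing in the same intervals yields one abstract input.

```python
import math

def abstract_to_intervals(pixels, epsilon, width):
    """Map perturbed pixel values in [0, 1] onto a fixed interval grid.

    Illustrative sketch (not the paper's exact algorithm): each pixel's
    [p - epsilon, p + epsilon] range is widened outward to the enclosing
    grid intervals of the given width, producing a lower/upper bound pair
    that the network would be trained on instead of concrete pixels.
    """
    lower, upper = [], []
    for p in pixels:
        lo = max(p - epsilon, 0.0)
        hi = min(p + epsilon, 1.0)
        # Snap outward: floor the lower bound and ceil the upper bound
        # to multiples of the interval width.
        lower.append(math.floor(lo / width) * width)
        upper.append(min(math.ceil(hi / width) * width, 1.0))
    return lower, upper

# All perturbed images that map to the same (lower, upper) pairs form one
# abstract training input, so they are classified identically by construction.
lower, upper = abstract_to_intervals([0.42, 0.17, 0.90], epsilon=0.02, width=0.1)
```

The interval width and perturbation radius here are hypothetical values chosen for illustration; in the paper these are hyperparameters of the training and verification pipeline.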

Cite

Text

Zhang et al. "Boosting Verified Training for Robust Image Classifications via Abstraction." Conference on Computer Vision and Pattern Recognition, 2023. doi:10.1109/CVPR52729.2023.01559

Markdown

[Zhang et al. "Boosting Verified Training for Robust Image Classifications via Abstraction." Conference on Computer Vision and Pattern Recognition, 2023.](https://mlanthology.org/cvpr/2023/zhang2023cvpr-boosting/) doi:10.1109/CVPR52729.2023.01559

BibTeX

@inproceedings{zhang2023cvpr-boosting,
  title     = {{Boosting Verified Training for Robust Image Classifications via Abstraction}},
  author    = {Zhang, Zhaodi and Xue, Zhiyi and Chen, Yang and Liu, Si and Zhang, Yueling and Liu, Jing and Zhang, Min},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2023},
  pages     = {16251--16260},
  doi       = {10.1109/CVPR52729.2023.01559},
  url       = {https://mlanthology.org/cvpr/2023/zhang2023cvpr-boosting/}
}