Towards Scalable Complete Verification of ReLU Neural Networks via Dependency-Based Branching

Abstract

We introduce an efficient method for the complete verification of ReLU-based feed-forward neural networks. The method implements branching on the ReLU states on the basis of a notion of dependency between the nodes. This divides the original verification problem into a set of sub-problems whose MILP formulations require fewer integrality constraints. We evaluate the method on all of the ReLU-based fully connected networks from the first competition for neural network verification. The experimental results show 145% performance gains over the present state-of-the-art in complete verification.
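To see why branching reduces integrality constraints, recall that in the standard big-M MILP encoding of a ReLU network, only *unstable* nodes (pre-activation bounds straddling zero) require a binary variable; stably active or inactive nodes are purely linear. Fixing a node's state by branching removes its binary variable in the resulting sub-problem. The following minimal sketch (illustrative names and bounds, not from the paper) counts the binary variables needed before and after branching:

```python
def relu_phase(lb, ub):
    """Classify a ReLU node from its pre-activation bounds [lb, ub]."""
    if lb >= 0:
        return "active"    # y = x holds, purely linear
    if ub <= 0:
        return "inactive"  # y = 0 holds, purely linear
    return "unstable"      # needs one binary variable in the MILP

def binary_vars_needed(bounds, fixed=frozenset()):
    """Count integrality constraints, given node indices fixed by branching."""
    return sum(
        1
        for i, (lb, ub) in enumerate(bounds)
        if i not in fixed and relu_phase(lb, ub) == "unstable"
    )

# Hypothetical pre-activation bounds for four ReLU nodes.
bounds = [(-1.0, 2.0), (0.5, 3.0), (-2.0, -0.1), (-0.5, 0.5)]

print(binary_vars_needed(bounds))             # nodes 0 and 3 are unstable -> 2
print(binary_vars_needed(bounds, fixed={0}))  # branching on node 0 leaves -> 1
```

Each sub-problem produced by a branch therefore yields a strictly easier MILP; the paper's contribution is choosing *which* node to branch on using dependencies between nodes, so that fixing one state implies the states of others.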

Cite

Text

Kouvaros and Lomuscio. "Towards Scalable Complete Verification of ReLU Neural Networks via Dependency-Based Branching." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/ijcai.2021/364

Markdown

[Kouvaros and Lomuscio. "Towards Scalable Complete Verification of ReLU Neural Networks via Dependency-Based Branching." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/kouvaros2021ijcai-scalable/) doi:10.24963/ijcai.2021/364

BibTeX

@inproceedings{kouvaros2021ijcai-scalable,
  title     = {{Towards Scalable Complete Verification of ReLU Neural Networks via Dependency-Based Branching}},
  author    = {Kouvaros, Panagiotis and Lomuscio, Alessio},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {2643--2650},
  doi       = {10.24963/ijcai.2021/364},
  url       = {https://mlanthology.org/ijcai/2021/kouvaros2021ijcai-scalable/}
}