Iteratively Enhanced Semidefinite Relaxations for Efficient Neural Network Verification
Abstract
We propose an enhanced semidefinite program (SDP) relaxation for tight and efficient verification of neural networks (NNs). The improvement in tightness is achieved by adding a nonlinear constraint to existing SDP relaxations proposed for NN verification. The efficiency stems from the iterative nature of the proposed algorithm, which solves the resulting non-convex SDP by recursively solving auxiliary convex, layer-based SDP problems. We show formally that the solution generated by our algorithm is tighter than state-of-the-art SDP-based solutions. We also show that the solution sequence converges to the optimal solution of the non-convex enhanced SDP relaxation. Experimental results on standard benchmarks show that our algorithm achieves state-of-the-art performance whilst maintaining an acceptable computational cost.
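To make the setting concrete, the sketch below shows a basic convex SDP relaxation for a single ReLU layer, the kind of convex, layer-based subproblem the abstract alludes to. It follows the standard moment-lifting construction from prior SDP-based NN verification work, not the paper's enhanced relaxation or iterative algorithm; the function relu_layer_sdp_bound and all variable names are illustrative assumptions, built with CVXPY.

# Hedged sketch (assumption): a standard convex SDP relaxation for one
# ReLU layer z = relu(W x + b) over a box lb <= x <= ub. This is NOT
# the paper's enhanced relaxation; it only illustrates the kind of
# layer-based convex SDP subproblem the abstract refers to.
import numpy as np
import cvxpy as cp

def relu_layer_sdp_bound(W, b, c, lb, ub):
    """Upper-bound c^T relu(W x + b) over lb <= x <= ub via an SDP."""
    m, n = W.shape
    d = 1 + n + m
    # Moment matrix P models [1; x; z][1; x; z]^T, relaxed to P >> 0.
    P = cp.Variable((d, d), symmetric=True)
    x = P[0, 1:1 + n]          # first-order moments of the input
    z = P[0, 1 + n:]           # first-order moments of the ReLU output
    Pxx = P[1:1 + n, 1:1 + n]  # second-order input moments
    Pxz = P[1:1 + n, 1 + n:]   # cross moments E[x z^T]
    Pzz = P[1 + n:, 1 + n:]    # second-order output moments

    cons = [P >> 0, P[0, 0] == 1]
    # Linear part of the exact ReLU encoding: z >= 0 and z >= W x + b.
    cons += [z >= 0, z >= W @ x + b]
    # Complementarity z_i (z_i - (W x)_i - b_i) = 0, lifted to moments.
    cons += [cp.diag(Pzz) == cp.diag(W @ Pxz) + cp.multiply(b, z)]
    # Box constraint (x_j - lb_j)(ub_j - x_j) >= 0, lifted to moments.
    cons += [cp.diag(Pxx) <= cp.multiply(lb + ub, x) - lb * ub]

    prob = cp.Problem(cp.Maximize(c @ z), cons)
    prob.solve(solver=cp.SCS)
    return prob.value

# Example: certify an output upper bound for a tiny random layer.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))
b = rng.standard_normal(3)
c = np.ones(3)
print(relu_layer_sdp_bound(W, b, c, lb=-np.ones(2), ub=np.ones(2)))

Each such subproblem is a convex SDP; the paper's contribution, per the abstract, is to tighten this kind of relaxation with an added nonlinear constraint and to handle the resulting non-convexity by iterating over auxiliary convex layer-based problems of this form.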
Cite
Text
Lan et al. "Iteratively Enhanced Semidefinite Relaxations for Efficient Neural Network Verification." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I12.26744
Markdown
[Lan et al. "Iteratively Enhanced Semidefinite Relaxations for Efficient Neural Network Verification." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/lan2023aaai-iteratively/) doi:10.1609/AAAI.V37I12.26744
BibTeX
@inproceedings{lan2023aaai-iteratively,
title = {{Iteratively Enhanced Semidefinite Relaxations for Efficient Neural Network Verification}},
author = {Lan, Jianglin and Zheng, Yang and Lomuscio, Alessio},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {14937--14945},
doi = {10.1609/AAAI.V37I12.26744},
url = {https://mlanthology.org/aaai/2023/lan2023aaai-iteratively/}
}