Abstraction Based Output Range Analysis for Neural Networks

Abstract

In this paper, we consider the problem of output range analysis for feed-forward neural networks. Current approaches reduce the problem to satisfiability or optimization solving, which are NP-hard and whose computational complexity increases with the number of neurons in the network. We present a novel abstraction technique that constructs a simpler neural network with fewer neurons, albeit with interval weights, called an interval neural network (INN), which over-approximates the output range of the given neural network. We reduce output range analysis on INNs to solving a mixed integer linear programming problem. Our experimental results highlight the trade-off between the computation time and the precision of the computed output range.
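
For intuition only, the sketch below propagates an input box through a toy network with interval weights using plain interval arithmetic, which yields a sound but generally looser over-approximation of the output range than the MILP encoding described in the abstract. The function names, layer shapes, and interval weight values (interval_matvec, inn_output_range, the 2-2-1 layers) are illustrative assumptions, not the authors' implementation.

import numpy as np

def interval_matvec(W_lo, W_hi, x_lo, x_hi):
    # Interval product W_ij * x_j: the min/max lie among the four endpoint
    # products; summing per row gives the interval dot product.
    prods = np.stack([W_lo * x_lo, W_lo * x_hi, W_hi * x_lo, W_hi * x_hi])
    return prods.min(axis=0).sum(axis=1), prods.max(axis=0).sum(axis=1)

def inn_output_range(layers, x_lo, x_hi):
    # Over-approximate the output range of an INN with ReLU hidden layers
    # over the input box [x_lo, x_hi]. Each layer is (W_lo, W_hi, b_lo, b_hi).
    lo, hi = np.asarray(x_lo, float), np.asarray(x_hi, float)
    for i, (W_lo, W_hi, b_lo, b_hi) in enumerate(layers):
        lo, hi = interval_matvec(W_lo, W_hi, lo, hi)
        lo, hi = lo + b_lo, hi + b_hi
        if i < len(layers) - 1:  # ReLU on hidden layers only
            lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)
    return lo, hi

# Toy 2-2-1 INN; the interval weights are made up for illustration.
layers = [
    (np.array([[0.9, -0.5], [0.2, 0.8]]),   # hidden weights, lower bounds
     np.array([[1.1, -0.3], [0.4, 1.0]]),   # hidden weights, upper bounds
     np.zeros(2), np.zeros(2)),
    (np.array([[1.0, -1.0]]), np.array([[1.0, -1.0]]),
     np.zeros(1), np.zeros(1)),
]
print(inn_output_range(layers, x_lo=[-1.0, -1.0], x_hi=[1.0, 1.0]))

The printed box contains every output reachable for any weight choice within the intervals and any input in the box; the paper's MILP formulation would typically return a tighter interval for the same INN.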

Cite

Text

Prabhakar and Afzal. "Abstraction Based Output Range Analysis for Neural Networks." Neural Information Processing Systems, 2019.

Markdown

[Prabhakar and Afzal. "Abstraction Based Output Range Analysis for Neural Networks." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/prabhakar2019neurips-abstraction/)

BibTeX

@inproceedings{prabhakar2019neurips-abstraction,
  title     = {{Abstraction Based Output Range Analysis for Neural Networks}},
  author    = {Prabhakar, Pavithra and Afzal, Zahra Rahimi},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {15788--15798},
  url       = {https://mlanthology.org/neurips/2019/prabhakar2019neurips-abstraction/}
}