Tessellation-Filtering ReLU Neural Networks
Abstract
We identify tessellation-filtering ReLU neural networks that, when composed with another ReLU network, keep its non-redundant tessellation unchanged or reduce it. The additional network complexity modifies the shape of the decision surface without increasing the number of linear regions. We provide a mathematical understanding of the resulting additional expressiveness by means of a novel measure of shape complexity that counts deviations from convexity, which yields a Boolean algebraic characterization of this special class. A local representation theorem gives rise to novel approaches for pruning and decision-surface analysis.
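The abstract refers to the tessellation of input space into linear regions induced by a ReLU network. As an illustrative sketch (not the paper's method), the distinct ReLU activation patterns observed on a sample grid give a lower bound on the number of linear regions of a small network with hypothetical random weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# A one-hidden-layer ReLU network on R^2 with 8 hidden units
# (hypothetical weights, for illustration only).
W1 = rng.normal(size=(8, 2))
b1 = rng.normal(size=8)

def activation_pattern(x):
    """Binary pattern of which hidden ReLUs are active at x."""
    return tuple((W1 @ x + b1 > 0).astype(int))

# Each distinct pattern corresponds to one cell of the tessellation of
# the input space into linear regions; sampling a grid yields a lower
# bound on the region count, since the grid may miss small cells.
xs = np.linspace(-3, 3, 200)
patterns = {activation_pattern(np.array([x, y])) for x in xs for y in xs}
print(len(patterns))
```

With 8 hyperplanes in the plane, the number of regions is at most 1 + 8 + C(8,2) = 37, so the printed count falls between 1 and 37.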
Cite
Text
Moser et al. "Tessellation-Filtering ReLU Neural Networks." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/463
Markdown
[Moser et al. "Tessellation-Filtering ReLU Neural Networks." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/moser2022ijcai-tessellation/) doi:10.24963/IJCAI.2022/463
BibTeX
@inproceedings{moser2022ijcai-tessellation,
title = {{Tessellation-Filtering ReLU Neural Networks}},
author = {Moser, Bernhard Alois and Lewandowski, Michal and Kargaran, Somayeh and Zellinger, Werner and Biggio, Battista and Koutschan, Christoph},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2022},
pages = {3335--3341},
doi = {10.24963/IJCAI.2022/463},
url = {https://mlanthology.org/ijcai/2022/moser2022ijcai-tessellation/}
}