ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology
Abstract
A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph. We show that while this dual graph is a coarse quantization of input space, it is sufficiently robust that it can be combined with persistent homology to detect homological signals of manifolds in the input space from samples. This property holds for a wide range of networks trained for a wide range of purposes that have nothing to do with this topological application. We found this feature to be surprising and interesting; we hope it will also be useful.
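The following is a minimal illustrative sketch (not the authors' code) of the ingredient the abstract describes: each input point is mapped to the binary activation pattern of a toy ReLU network, which labels the polyhedral cell of the input-space decomposition containing that point. Here, Hamming distance between activation patterns is used as one plausible coarse proxy for proximity in the dual graph (the paper's exact construction may differ), and the resulting distance matrix could be handed to a persistent-homology package. The network weights, sample manifold, and the optional use of ripser are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: two hidden ReLU layers with random weights.
W1, b1 = rng.normal(size=(16, 2)), rng.normal(size=16)
W2, b2 = rng.normal(size=(16, 16)), rng.normal(size=16)

def activation_pattern(x):
    """Return the concatenated 0/1 activation pattern of x across both ReLU layers.

    Points with the same pattern lie in the same polyhedral cell of the
    decomposition of input space induced by the network.
    """
    z1 = W1 @ x + b1
    a1 = np.maximum(z1, 0.0)
    z2 = W2 @ a1 + b2
    return np.concatenate([(z1 > 0).astype(int), (z2 > 0).astype(int)])

# Sample points from a circle, a manifold whose H_1 signal one would hope to detect.
theta = rng.uniform(0.0, 2 * np.pi, size=200)
samples = np.stack([np.cos(theta), np.sin(theta)], axis=1)

patterns = np.array([activation_pattern(x) for x in samples])

# Hamming distance between activation patterns: a coarse proxy for how many
# cell walls of the polyhedral decomposition separate two sample points.
D = (patterns[:, None, :] != patterns[None, :, :]).sum(axis=2)

# This distance matrix could then be passed to a persistent-homology tool, e.g.:
#   from ripser import ripser
#   diagrams = ripser(D.astype(float), distance_matrix=True)["dgms"]
print(D.shape, D.max())
```

A trained network would replace the random weights above; the point of the sketch is only that the activation pattern is a cheap, discrete quantization of input space on which distance-based persistent homology can be run.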
Cite
Text
Liu et al. "ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology." ICML 2023 Workshops: TAGML, 2023.
Markdown
[Liu et al. "ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology." ICML 2023 Workshops: TAGML, 2023.](https://mlanthology.org/icmlw/2023/liu2023icmlw-relu/)
BibTeX
@inproceedings{liu2023icmlw-relu,
  title = {{ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology}},
  author = {Liu, Yajing and Cole, Christina M and Peterson, Chris and Kirby, Michael},
  booktitle = {ICML 2023 Workshops: TAGML},
  year = {2023},
  url = {https://mlanthology.org/icmlw/2023/liu2023icmlw-relu/}
}