Homological Neural Networks: A Sparse Architecture for Multivariate Complexity

Abstract

The rapid progress of Artificial Intelligence research has come with the development of increasingly complex deep learning models, leading to growing challenges in computational complexity, energy efficiency, and interpretability. In this study, we apply advanced network-based information filtering techniques to design a novel deep neural network unit characterized by a sparse higher-order graphical architecture built over the homological structure of the underlying data. We demonstrate its effectiveness in two application domains that are traditionally challenging for deep learning: tabular data and time series regression problems. Results show the advantages of this novel design, which can match or surpass state-of-the-art machine learning and deep learning models using only a fraction of the parameters.
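
The sparse, graph-constrained unit described in the abstract can be pictured with a minimal PyTorch sketch, under the assumption that the data's homological/dependency structure has already been summarised as an adjacency matrix. The class name GraphMaskedLinear, the toy chain graph, and the masking scheme below are illustrative assumptions for exposition, not the authors' implementation.

# Illustrative sketch only (not the authors' code): a dense linear layer whose
# weight matrix is masked by the adjacency structure of a fixed sparse graph,
# so each unit only receives input from its graph neighbours. In the paper the
# graph would come from an information-filtering network built on the data's
# dependency structure; here the adjacency matrix is a toy placeholder.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphMaskedLinear(nn.Module):
    """Linear layer restricted to the edges of a fixed sparse graph (hypothetical helper)."""

    def __init__(self, adjacency: torch.Tensor):
        super().__init__()
        n = adjacency.shape[0]
        self.linear = nn.Linear(n, n)
        # Keep self-connections plus graph edges; all other weights are zeroed.
        mask = adjacency.bool() | torch.eye(n, dtype=torch.bool)
        self.register_buffer("mask", mask.float())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Re-apply the mask on every forward pass so pruned weights stay zero
        # even after gradient updates touch the underlying dense parameter.
        return F.linear(x, self.linear.weight * self.mask, self.linear.bias)


if __name__ == "__main__":
    # Toy 4-variable chain graph 0-1-2-3; in practice the graph would be
    # inferred from the data rather than hand-written.
    adj = torch.tensor([[0, 1, 0, 0],
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [0, 0, 1, 0]], dtype=torch.float32)
    layer = GraphMaskedLinear(adj)
    out = layer(torch.randn(8, 4))   # batch of 8 samples, 4 variables
    print(out.shape)                 # torch.Size([8, 4])

The point of the sketch is only the parameter count: because the mask zeroes all non-edge weights, the effective number of trainable connections scales with the number of graph edges rather than with the square of the input dimension, which is the sparsity argument made in the abstract.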

Cite

Text

Wang et al. "Homological Neural Networks: A Sparse Architecture for Multivariate Complexity." ICML 2023 Workshops: TAGML, 2023.

Markdown

[Wang et al. "Homological Neural Networks: A Sparse Architecture for Multivariate Complexity." ICML 2023 Workshops: TAGML, 2023.](https://mlanthology.org/icmlw/2023/wang2023icmlw-homological/)

BibTeX

@inproceedings{wang2023icmlw-homological,
  title     = {{Homological Neural Networks: A Sparse Architecture for Multivariate Complexity}},
  author    = {Wang, Yuanrong and Briola, Antonio and Aste, Tomaso},
  booktitle = {ICML 2023 Workshops: TAGML},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/wang2023icmlw-homological/}
}