Machines of Finite Depth: Towards a Formalization of Neural Networks

Abstract

We provide a unifying framework where artificial neural networks and their architectures can be formally described as particular cases of a general mathematical construction---machines of finite depth. Unlike neural networks, machines have a precise definition, from which several properties follow naturally. Machines of finite depth are modular (they can be combined), efficiently computable, and differentiable. The backward pass of a machine is again a machine and can be computed without overhead using the same procedure as the forward pass. We prove this statement theoretically and practically via a unified implementation that generalizes several classical architectures---dense, convolutional, and recurrent neural networks with a rich shortcut structure---and their respective backpropagation rules.
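The abstract's claim that "the backward pass of a machine is again a machine" can be illustrated with a minimal, hypothetical sketch (not the paper's actual implementation): for an affine layer, the vector-Jacobian product is itself an affine map, so one `apply` routine serves both the forward and backward pass. The `AffineMachine` class and its method names below are illustrative assumptions, not part of the paper's formalism.

```python
import numpy as np

# Illustrative sketch only: we model a "machine" as a state-update map.
# For a dense layer this is an affine map x -> W x + b. The backward
# pass (vector-Jacobian product) of that map is g -> W^T g, which is
# again an affine map, so the same apply() procedure runs both passes.

class AffineMachine:
    def __init__(self, W, b):
        self.W, self.b = W, b

    def apply(self, x):
        return self.W @ x + self.b

    def backward_machine(self):
        # The VJP of x -> W x + b is g -> W^T g: another AffineMachine.
        return AffineMachine(self.W.T, np.zeros(self.W.shape[1]))

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
b = rng.normal(size=3)
fwd = AffineMachine(W, b)
bwd = fwd.backward_machine()   # backward pass, same machinery

x = rng.normal(size=4)
g = rng.normal(size=3)         # upstream gradient
y = fwd.apply(x)               # forward pass
grad_x = bwd.apply(g)          # equals W.T @ g
```

Because the backward object is an instance of the same class, composing machines composes their backward machines as well, which is one way to read the abstract's "without overhead" claim.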

Cite

Text

Vertechi and Bergomi. "Machines of Finite Depth: Towards a Formalization of Neural Networks." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/aaai.v37i8.26199

Markdown

[Vertechi and Bergomi. "Machines of Finite Depth: Towards a Formalization of Neural Networks." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/vertechi2023aaai-machines/) doi:10.1609/aaai.v37i8.26199

BibTeX

@inproceedings{vertechi2023aaai-machines,
  title     = {{Machines of Finite Depth: Towards a Formalization of Neural Networks}},
  author    = {Vertechi, Pietro and Bergomi, Mattia G.},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {10061--10068},
  doi       = {10.1609/aaai.v37i8.26199},
  url       = {https://mlanthology.org/aaai/2023/vertechi2023aaai-machines/}
}