On the Universality of Invariant Networks

Abstract

Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received very little attention to date: Can these networks approximate any (continuous) invariant function? We tackle the rather general case of $G\leq S_n$ (an arbitrary subgroup of the symmetric group) acting on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: First, $G$-invariant networks are universal if high-order tensors are allowed. Second, there are groups $G$ for which higher-order tensors are unavoidable for obtaining universality. $G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors. Lastly, we propose a conjecture stating that this condition is also sufficient.
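To make the setting concrete, the following is a minimal sketch (not from the paper) of a first-order invariant network for the special case $G = S_n$: linear layers constrained to commute with coordinate permutations, followed by invariant pooling and an MLP. The two-parameter form of the equivariant layer is the well-known DeepSets parameterization; all class and parameter names here are illustrative.

```python
import torch
import torch.nn as nn

class SnEquivariantLinear(nn.Module):
    """Linear map on (n, d_in) features that commutes with permuting the n coordinates."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.alpha = nn.Linear(d_in, d_out, bias=True)   # acts on each coordinate separately
        self.beta = nn.Linear(d_in, d_out, bias=False)   # acts on the coordinate average

    def forward(self, x):                                # x: (batch, n, d_in)
        pooled = x.mean(dim=1, keepdim=True)             # permutation-invariant summary
        return self.alpha(x) + self.beta(pooled)         # permutation-equivariant output

class SnInvariantNet(nn.Module):
    """Equivariant layers -> invariant (sum) pooling -> MLP; the whole map is S_n-invariant."""
    def __init__(self, d_in=1, d_hidden=64, d_out=1):
        super().__init__()
        self.equivariant = nn.Sequential(
            SnEquivariantLinear(d_in, d_hidden), nn.ReLU(),
            SnEquivariantLinear(d_hidden, d_hidden), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Linear(d_hidden, d_hidden), nn.ReLU(),
            nn.Linear(d_hidden, d_out),
        )

    def forward(self, x):                                # x: (batch, n, d_in)
        h = self.equivariant(x)
        return self.head(h.sum(dim=1))                   # sum over coordinates is G-invariant

# Sanity check: the output is unchanged under a random permutation of the coordinates.
if __name__ == "__main__":
    net = SnInvariantNet()
    x = torch.randn(2, 10, 1)
    perm = torch.randperm(10)
    assert torch.allclose(net(x), net(x[:, perm, :]), atol=1e-5)
```

This first-order architecture is exactly the regime addressed by the paper's necessary condition and conjecture; the universality results for general $G$ rely on layers operating on higher-order tensors, which this sketch does not include.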

Cite

Text

Maron et al. "On the Universality of Invariant Networks." International Conference on Machine Learning, 2019.

Markdown

[Maron et al. "On the Universality of Invariant Networks." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/maron2019icml-universality/)

BibTeX

@inproceedings{maron2019icml-universality,
  title     = {{On the Universality of Invariant Networks}},
  author    = {Maron, Haggai and Fetaya, Ethan and Segol, Nimrod and Lipman, Yaron},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {4363--4371},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/maron2019icml-universality/}
}