Separation Power of Equivariant Neural Networks

Abstract

The separation power of a machine learning model refers to its ability to distinguish between different inputs and is often used as a proxy for its expressivity. Indeed, knowing the separation power of a family of models is a necessary condition for obtaining fine-grained universality results. In this paper, we analyze the separation power of equivariant neural networks, such as convolutional and permutation-invariant networks. We first present a complete characterization of the inputs that are indistinguishable by models derived from a given architecture. From these results, we derive how separability is influenced by hyperparameters and architectural choices—such as activation functions, depth, hidden layer width, and representation types. Notably, all non-polynomial activations, including ReLU and sigmoid, are equivalent in expressivity and reach maximum separation power. Depth improves separation power up to a threshold, after which further increases have no effect. Adding invariant features to hidden representations does not impact separation power. Finally, block decomposition of hidden representations affects separability, with minimal components forming a hierarchy in separation power that provides a straightforward method for comparing the separation power of models.
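To make the abstract's setting concrete, here is a minimal sketch (not taken from the paper) of a permutation-invariant network of the kind it analyzes: permutation-equivariant linear layers with a non-polynomial activation, followed by sum pooling. The function names `equivariant_layer` and `invariant_net` are illustrative assumptions. The final assertion illustrates the baseline fact that inputs in the same group orbit can never be separated by such models; the paper characterizes exactly which further inputs remain indistinguishable.

```python
import numpy as np

rng = np.random.default_rng(0)

def equivariant_layer(x, a, b):
    """Permutation-equivariant linear map: x -> a*x + b*mean(x)
    (the DeepSets form; hypothetical name, for illustration only)."""
    return a * x + b * x.mean(axis=0, keepdims=True)

def invariant_net(x, params):
    """Stack equivariant layers with ReLU (a non-polynomial activation,
    as in the abstract), then sum-pool to obtain permutation invariance."""
    for a, b in params:
        x = np.maximum(equivariant_layer(x, a, b), 0.0)
    return x.sum(axis=0)  # pooling over the set dimension

params = [(rng.standard_normal(), rng.standard_normal()) for _ in range(3)]
x = rng.standard_normal((5, 4))      # a set of 5 feature vectors
x_perm = x[rng.permutation(5)]       # the same set, reordered

# Inputs related by a group action are indistinguishable by construction:
assert np.allclose(invariant_net(x, params), invariant_net(x_perm, params))
```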

Cite

Text

Pacini et al. "Separation Power of Equivariant Neural Networks." International Conference on Learning Representations, 2025.

Markdown

[Pacini et al. "Separation Power of Equivariant Neural Networks." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/pacini2025iclr-separation/)

BibTeX

@inproceedings{pacini2025iclr-separation,
  title     = {{Separation Power of Equivariant Neural Networks}},
  author    = {Pacini, Marco and Dong, Xiaowen and Lepri, Bruno and Santin, Gabriele},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/pacini2025iclr-separation/}
}