How Jellyfish Characterise Alternating Group Equivariant Neural Networks
Abstract
We provide a full characterisation of all possible alternating group ($A_n$) equivariant neural networks whose layers are some tensor power of $\mathbb{R}^{n}$. In particular, we find a basis of matrices for the learnable, linear, $A_n$–equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^{n}$. We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
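To make the equivariance condition concrete, here is a minimal numerical sketch (not from the paper) for the simplest case of first-order tensor powers: a linear layer $W : \mathbb{R}^n \to \mathbb{R}^n$ is $A_n$-equivariant iff $WP = PW$ for every even permutation matrix $P$. The function names and the brute-force enumeration are illustrative assumptions; the paper's characterisation works via a diagrammatic basis rather than enumeration.

```python
import itertools
import numpy as np

def even_permutation_matrices(n):
    """Yield the permutation matrices of the even permutations (A_n)."""
    for perm in itertools.permutations(range(n)):
        # Parity via inversion count: even number of inversions => even permutation.
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        if inversions % 2 == 0:
            yield np.eye(n)[list(perm)]

def is_An_equivariant(W, n, tol=1e-10):
    """Check that W commutes with every even permutation matrix."""
    return all(np.allclose(W @ P, P @ W, atol=tol)
               for P in even_permutation_matrices(n))

n = 4
I, J = np.eye(n), np.ones((n, n))
# a*I + b*J commutes with all permutation matrices, so it lies in the
# equivariant space; a generic diagonal matrix does not.
print(is_An_equivariant(2.0 * I + 3.0 * J, n))              # True
print(is_An_equivariant(np.diag(np.arange(n, dtype=float)), n))  # False
```

For higher tensor powers the same commutation condition applies to the induced action of $A_n$ on $(\mathbb{R}^n)^{\otimes k}$, where the basis of equivariant maps is what the paper characterises.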
Cite
Text
Pearce-Crump. "How Jellyfish Characterise Alternating Group Equivariant Neural Networks." International Conference on Machine Learning, 2023.
Markdown
[Pearce-Crump. "How Jellyfish Characterise Alternating Group Equivariant Neural Networks." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/pearcecrump2023icml-jellyfish/)
BibTeX
@inproceedings{pearcecrump2023icml-jellyfish,
title = {{How Jellyfish Characterise Alternating Group Equivariant Neural Networks}},
author = {Pearce-Crump, Edward},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {27483--27495},
volume = {202},
url = {https://mlanthology.org/icml/2023/pearcecrump2023icml-jellyfish/}
}