Towards Architectural Optimization of Equivariant Neural Networks over Subgroups

Abstract

Incorporating equivariance to symmetry groups in artificial neural networks (ANNs) can improve performance on tasks exhibiting those symmetries, but such symmetries are often only approximate and not explicitly known. This motivates algorithmically optimizing the architectural constraints imposed by equivariance. We propose the equivariance relaxation morphism, which preserves functionality while reparameterizing a group-equivariant layer to operate with equivariance constraints on only a subgroup, and the $[G]$-mixed equivariant layer, which mixes operations that are each constrained to equivariance to a different group, enabling within-layer equivariance optimization. These two architectural tools can be used within neural architecture search (NAS) algorithms for equivariance-aware architectural optimization.
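
To make the second tool concrete, below is a minimal PyTorch sketch of a $[G]$-mixed equivariant layer in the spirit of differentiable NAS: candidate operations, each constrained to equivariance to a different group, are combined with softmax-normalized architecture weights. The names (GMixedEquivariantLayer, C2Conv, alpha) and the 180-degree filter symmetrization are illustrative assumptions for this sketch, not the paper's implementation.

import torch
import torch.nn as nn


class GMixedEquivariantLayer(nn.Module):
    """Sketch: mix per-group equivariant ops with learned architecture weights."""

    def __init__(self, candidate_ops):
        super().__init__()
        self.candidate_ops = nn.ModuleList(candidate_ops)
        # One architecture parameter per candidate group-constrained op.
        self.alpha = nn.Parameter(torch.zeros(len(candidate_ops)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        # Weighted sum of the candidates; all must map x to the same shape,
        # so gradient descent on alpha can select the layer's equivariance.
        return sum(w * op(x) for w, op in zip(weights, self.candidate_ops))


class C2Conv(nn.Module):
    """Toy convolution equivariant to 180-degree rotation (up to boundary effects)."""

    def __init__(self, in_ch, out_ch, k):
        super().__init__()
        self.weight = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, k, k))

    def forward(self, x):
        # Symmetrizing the filter under 180-degree rotation makes the
        # convolution commute with that rotation of the input.
        w = 0.5 * (self.weight + torch.rot90(self.weight, 2, dims=(-2, -1)))
        return nn.functional.conv2d(x, w, padding="same")


# Usage: mix an unconstrained conv (trivial group) with the C2-constrained one.
layer = GMixedEquivariantLayer(
    [nn.Conv2d(3, 8, 3, padding="same"), C2Conv(3, 8, 3)]
)
out = layer(torch.randn(1, 3, 32, 32))

A relaxation morphism in this setting would analogously re-express a filter tied across a full group as several independent filters tied only across a subgroup, initialized so the layer computes the same function as before; the sketch above is only a hedged approximation of the paper's constructions.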

Cite

Text

Maile et al. "Towards Architectural Optimization of Equivariant Neural Networks over Subgroups." NeurIPS 2022 Workshops: NeurReps, 2022.

Markdown

[Maile et al. "Towards Architectural Optimization of Equivariant Neural Networks over Subgroups." NeurIPS 2022 Workshops: NeurReps, 2022.](https://mlanthology.org/neuripsw/2022/maile2022neuripsw-architectural/)

BibTeX

@inproceedings{maile2022neuripsw-architectural,
  title     = {{Towards Architectural Optimization of Equivariant Neural Networks over Subgroups}},
  author    = {Maile, Kaitlin and Wilson, Dennis George and Forré, Patrick},
  booktitle = {NeurIPS 2022 Workshops: NeurReps},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/maile2022neuripsw-architectural/}
}