Permutation Tree Invariant Neural Architectures

Abstract

Exploiting symmetry as an inductive bias has become a fundamental technique in deep learning to improve generalization and sample efficiency. We investigate the design of models that are invariant to subgroups of the symmetric group defined by hierarchical structures. We propose permutation trees, which represent permutations by the ordering of their leaves and allow the reordering of siblings depending on the type of their parent, generalizing PQ-trees. We characterize the permutation trees that represent permutation groups and derive invariant neural architectures from them in a bottom-up fashion. We show that our approach learns faster with less data and achieves improved prediction performance on a synthetic dataset.
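
To illustrate the bottom-up construction the abstract refers to, here is a minimal sketch (not the authors' implementation) of an encoder over a PQ-tree-like permutation tree: P-nodes, whose children may be arbitrarily permuted, are pooled with a symmetric sum, while Q-nodes, whose children may only be reversed, combine forward and reversed position-weighted sums. All names, dimensions, and weight matrices below are hypothetical placeholders for learned parameters.

```python
# A sketch of a bottom-up invariant encoder over a PQ-tree-like
# permutation tree. Assumption: leaves carry feature vectors; a P-node's
# children may be permuted arbitrarily, a Q-node's children only reversed.
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding dimension (hypothetical)

# Fixed random projections standing in for learned weights.
W_P = rng.standard_normal((D, D))
W_Q = rng.standard_normal((D, D))


def encode(node):
    """Recursively encode a permutation tree node into a vector.

    node is either ("leaf", x) with x a length-D feature vector,
    or (kind, children) with kind in {"P", "Q"}.
    """
    kind, payload = node
    if kind == "leaf":
        return payload
    child_vecs = [encode(c) for c in payload]
    if kind == "P":
        # Sum pooling is invariant to any permutation of the children.
        pooled = np.sum(child_vecs, axis=0)
        return np.tanh(W_P @ pooled)
    if kind == "Q":
        # Combining position-weighted sums of the forward and reversed
        # orderings symmetrically yields invariance to reversal only.
        pos = np.arange(1, len(child_vecs) + 1, dtype=float)[:, None]
        fwd = np.sum(pos * np.stack(child_vecs), axis=0)
        bwd = np.sum(pos * np.stack(child_vecs[::-1]), axis=0)
        return np.tanh(W_Q @ (fwd + bwd))
    raise ValueError(f"unknown node kind: {kind}")


# Tiny check: the encoding is unchanged when P-children are shuffled
# and Q-children are reversed.
a, b, c = (rng.standard_normal(D) for _ in range(3))
tree = ("Q", [("P", [("leaf", a), ("leaf", b)]), ("leaf", c)])
same = ("Q", [("leaf", c), ("P", [("leaf", b), ("leaf", a)])])
assert np.allclose(encode(tree), encode(same))
```

The sketch only shows how node types induce different symmetry constraints on the aggregation; the paper's actual architectures, training setup, and the characterization of which permutation trees represent permutation groups are described in the publication itself.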

Cite

Text

Urban et al. "Permutation Tree Invariant Neural Architectures." ICML 2024 Workshops: GRaM, 2024.

Markdown

[Urban et al. "Permutation Tree Invariant Neural Architectures." ICML 2024 Workshops: GRaM, 2024.](https://mlanthology.org/icmlw/2024/urban2024icmlw-permutation/)

BibTeX

@inproceedings{urban2024icmlw-permutation,
  title     = {{Permutation Tree Invariant Neural Architectures}},
  author    = {Urban, Johannes and Tschiatschek, Sebastian and Kriege, Nils Morten},
  booktitle = {ICML 2024 Workshops: GRaM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/urban2024icmlw-permutation/}
}