GLGENN: A Novel Parameter-Light Equivariant Neural Networks Architecture Based on Clifford Geometric Algebras
Abstract
We propose and implement a new architecture of equivariant neural networks based on geometric (Clifford) algebras, Generalized Lipschitz Group Equivariant Neural Networks (GLGENN), and compare it with competing models. These networks are equivariant to all pseudo-orthogonal transformations, including rotations and reflections, of a vector space with any non-degenerate or degenerate symmetric bilinear form. We propose a weight-sharing parametrization technique that takes into account the fundamental structures and operations of geometric algebras. Due to this technique, the GLGENN architecture is parameter-light and less prone to overfitting than baseline equivariant models. GLGENN outperforms or matches competitors on several benchmark equivariant tasks, including estimation of an equivariant function and a convex hull experiment, while using significantly fewer optimizable parameters.
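To make the equivariance property concrete: a map f is equivariant to a group of transformations if transforming the input and then applying f gives the same result as applying f and then transforming the output, i.e. f(Rx) = R f(x) for every orthogonal R. The sketch below checks this numerically for a toy O(n)-equivariant map; it is purely illustrative and is not the GLGENN architecture itself.

```python
import numpy as np

# Toy O(n)-equivariant map: f(x) = ||x|| * x. Since ||R x|| = ||x||
# for any orthogonal R, we get f(R x) = R f(x). (Illustrative only,
# not GLGENN's parametrization.)
def f(x):
    return np.linalg.norm(x) * x

rng = np.random.default_rng(0)
x = rng.standard_normal(3)

# Random orthogonal matrix (rotation/reflection) via QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

lhs = f(Q @ x)   # transform the input, then apply the map
rhs = Q @ f(x)   # apply the map, then transform the output
assert np.allclose(lhs, rhs)
```

GLGENN extends this idea to all pseudo-orthogonal transformations, including those preserving degenerate bilinear forms, by building the layers from Clifford geometric algebra operations.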
Cite
Text
Filimoshina and Shirokov. "GLGENN: A Novel Parameter-Light Equivariant Neural Networks Architecture Based on Clifford Geometric Algebras." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Filimoshina and Shirokov. "GLGENN: A Novel Parameter-Light Equivariant Neural Networks Architecture Based on Clifford Geometric Algebras." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/filimoshina2025icml-glgenn/)
BibTeX
@inproceedings{filimoshina2025icml-glgenn,
title = {{GLGENN: A Novel Parameter-Light Equivariant Neural Networks Architecture Based on Clifford Geometric Algebras}},
author = {Filimoshina, Ekaterina and Shirokov, Dmitry},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
  pages = {17153--17188},
volume = {267},
url = {https://mlanthology.org/icml/2025/filimoshina2025icml-glgenn/}
}