Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics
Abstract
Extracting scientific understanding from particle-physics experiments requires solving diverse learning problems with high precision and good data efficiency. We propose the Lorentz Geometric Algebra Transformer (L-GATr), a new multi-purpose architecture for high-energy physics. L-GATr represents high-energy data in a geometric algebra over four-dimensional space-time and is equivariant under Lorentz transformations, the symmetry group of relativistic kinematics. At the same time, the architecture is a Transformer, which makes it versatile and scalable to large systems. L-GATr is first demonstrated on regression and classification tasks from particle physics. We then construct the first Lorentz-equivariant generative model: a continuous normalizing flow based on an L-GATr network, trained with Riemannian flow matching. Across our experiments, L-GATr is on par with or outperforms strong domain-specific baselines.
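As a minimal illustration of the symmetry the abstract refers to (not the paper's implementation), the following sketch applies a Lorentz boost along the z-axis to a particle four-momentum and checks that the Minkowski inner product, the quantity a Lorentz-equivariant architecture must respect, is unchanged:

```python
import math

def boost_z(p, rapidity):
    """Apply a Lorentz boost along the z-axis to a four-vector p = (E, px, py, pz)."""
    E, px, py, pz = p
    ch, sh = math.cosh(rapidity), math.sinh(rapidity)
    return (ch * E + sh * pz, px, py, sh * E + ch * pz)

def minkowski_norm2(p):
    """Minkowski inner product p.p with metric signature (+, -, -, -)."""
    E, px, py, pz = p
    return E**2 - px**2 - py**2 - pz**2

# A particle's invariant mass squared must survive any boost.
p = (5.0, 1.0, 2.0, 3.0)
p_boosted = boost_z(p, rapidity=0.7)
assert abs(minkowski_norm2(p) - minkowski_norm2(p_boosted)) < 1e-9
```

The rapidity value and four-momentum here are arbitrary illustrative choices; L-GATr builds this invariance into the network itself by operating in the geometric algebra of space-time.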
Cite
Text

Spinner et al. "Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics." Neural Information Processing Systems, 2024. doi:10.52202/079017-0699

Markdown

[Spinner et al. "Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/spinner2024neurips-lorentzequivariant/) doi:10.52202/079017-0699

BibTeX
@inproceedings{spinner2024neurips-lorentzequivariant,
  title     = {{Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics}},
  author    = {Spinner, Jonas and Bresó, Victor and de Haan, Pim and Plehn, Tilman and Thaler, Jesse and Brehmer, Johann},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0699},
  url       = {https://mlanthology.org/neurips/2024/spinner2024neurips-lorentzequivariant/}
}