Neural Networks on Symmetric Spaces of Noncompact Type
Abstract
Recent works have demonstrated promising performance of neural networks on hyperbolic spaces and symmetric positive definite (SPD) manifolds. These spaces belong to a family of Riemannian manifolds referred to as symmetric spaces of noncompact type. In this paper, we propose a novel approach for developing neural networks on such spaces. Our approach relies on a unified formulation of the distance from a point to a hyperplane on the considered spaces. We show that some existing formulations of the point-to-hyperplane distance can be recovered by our approach under specific settings. Furthermore, we derive a closed-form expression for the point-to-hyperplane distance in higher-rank symmetric spaces of noncompact type equipped with G-invariant Riemannian metrics. The derived distance then serves as a tool to design fully-connected (FC) layers and an attention mechanism for neural networks on the considered spaces. Our approach is validated on challenging benchmarks for image classification, electroencephalogram (EEG) signal classification, image generation, and natural language inference.
Cite
Nguyen et al. "Neural Networks on Symmetric Spaces of Noncompact Type." International Conference on Learning Representations, 2025.

BibTeX
@inproceedings{nguyen2025iclr-neural,
title = {{Neural Networks on Symmetric Spaces of Noncompact Type}},
author = {Nguyen, Xuan Son and Yang, Shuo and Histace, Aymeric},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/nguyen2025iclr-neural/}
}