Lorentz Direct Concatenation for Stable Training in Hyperbolic Neural Networks
Abstract
Hyperbolic neural networks have achieved considerable success in extracting representations from hierarchical or tree-like data. However, they are known to suffer from numerical instability, which makes it difficult to build networks with deep hyperbolic layers, whether the Poincaré or the Lorentz model is used. In this note, we study the crucial operation of concatenating hyperbolic representations. We propose the Lorentz direct concatenation and illustrate that it is much more stable than concatenating in the tangent space. We provide insights into why this is the case and demonstrate the superiority of direct concatenation on real tasks.
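As a sketch of the idea (under the assumption that points live on the hyperboloid model with curvature parameter `K`, i.e. `-x_t^2 + ||x_s||^2 = -K` with the time component stored first), direct concatenation stacks the space-like components of the inputs and recomputes a single time component so the result lands back on the hyperboloid, avoiding the log/exp maps through the tangent space. The helper `lift` and the exact signatures here are illustrative, not the paper's code:

```python
import numpy as np

def lorentz_direct_concat(points, K=1.0):
    """Concatenate Lorentz-model points directly on the hyperboloid.

    Each input x is assumed to satisfy -x[0]**2 + ||x[1:]||**2 == -K
    with x[0] > 0 (time component first). We stack the space-like parts
    and recompute the time component, so no tangent-space maps are needed.
    """
    space = np.concatenate([x[1:] for x in points])
    time = np.sqrt(np.sum(space ** 2) + K)
    return np.concatenate([[time], space])

def lift(v, K=1.0):
    """Illustrative helper: lift a Euclidean vector onto the hyperboloid."""
    return np.concatenate([[np.sqrt(np.sum(v ** 2) + K)], v])

# Example: concatenating two points keeps the result on the manifold.
x = lift(np.array([0.3, -0.5]))
y = lift(np.array([1.0]))
z = lorentz_direct_concat([x, y])
```

Because the time component is recomputed in closed form from the stacked space-like parts, the operation involves no logarithmic or exponential maps, which is where the numerical instability of tangent-space concatenation arises.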
Cite
Qu and Zou. "Lorentz Direct Concatenation for Stable Training in Hyperbolic Neural Networks." NeurIPS 2022 Workshops: NeurReps, 2022.
BibTeX
@inproceedings{qu2022neuripsw-lorentz,
  title     = {{Lorentz Direct Concatenation for Stable Training in Hyperbolic Neural Networks}},
  author    = {Qu, Eric and Zou, Dongmian},
  booktitle = {NeurIPS 2022 Workshops: NeurReps},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/qu2022neuripsw-lorentz/}
}