Ultrahyperbolic Representation Learning
Abstract
In machine learning, data is usually represented in a (flat) Euclidean space where distances between points are measured along straight lines. Researchers have recently considered more exotic (non-Euclidean) Riemannian manifolds such as hyperbolic space, which is well suited to tree-like data. In this paper, we propose a representation that lives on a pseudo-Riemannian manifold of constant nonzero curvature. It is a generalization of hyperbolic and spherical geometries in which the non-degenerate metric tensor need not be positive definite. We provide the necessary learning tools in this geometry and extend gradient-based optimization techniques. More specifically, we give closed-form expressions for distances via geodesics and define a descent direction for minimizing an objective function. Our novel framework is applied to graph representations.
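As a concrete illustration of the geometric ingredients mentioned in the abstract, the following NumPy sketch implements an indefinite scalar product, the resulting closed-form geodesic distance on a pseudo-hyperboloid of "radius" beta, and the projection of an ambient vector onto a tangent space (a basic building block of manifold gradient methods). The sign convention, the number q + 1 of negative components, and the handling of degenerate cases are assumptions chosen so that q = 0 recovers the hyperboloid model of hyperbolic space; this is a sketch of the standard construction, not necessarily the paper's exact formulation.

```python
import numpy as np


def scalar_product(x, y, q):
    """Indefinite scalar product with q + 1 negative ("time-like") components.

    Assumed convention: <x, y>_q = -sum_{i<=q} x_i y_i + sum_{i>q} x_i y_i,
    which reduces to the Lorentzian product of the hyperboloid model when q = 0.
    """
    signs = np.concatenate([-np.ones(q + 1), np.ones(x.shape[0] - q - 1)])
    return float(np.sum(signs * x * y))


def geodesic_distance(x, y, q, beta=1.0):
    """Closed-form geodesic length on the pseudo-hyperboloid {x : <x, x>_q = -beta**2}.

    The normalized product u determines the type of geodesic joining x and y:
    u >= 1  -> space-like geodesic (hyperbolic-like, arccosh),
    |u| < 1 -> time-like geodesic (spherical-like, arccos),
    u < -1  -> no geodesic of either type (returned as NaN in this sketch).
    """
    u = -scalar_product(x, y, q) / beta**2
    if u >= 1.0:
        return beta * np.arccosh(u)
    if u >= -1.0:
        return beta * np.arccos(u)
    return np.nan


def project_to_tangent(x, z, q, beta=1.0):
    """Project an ambient vector z onto the tangent space {v : <x, v>_q = 0} at x.

    This is a standard first step before taking a gradient-style update on the
    manifold; the paper's specific descent direction is not reproduced here.
    """
    return z + (scalar_product(x, z, q) / beta**2) * x


# Tiny usage example with q = 1 time-like and p = 2 space-like dimensions.
x = np.array([1.0, 0.0, 0.0, 0.0])   # satisfies <x, x>_1 = -1
y = np.array([0.0, 1.0, 0.0, 0.0])   # satisfies <y, y>_1 = -1
print(geodesic_distance(x, y, q=1))  # time-like case: arccos(0) = pi / 2
```

The case split in `geodesic_distance` mirrors the fact that geodesics on a pseudo-hyperboloid can be space-like, time-like, or null, unlike in purely hyperbolic or spherical geometry where a single closed form suffices.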
Cite
Text
Law and Stam. "Ultrahyperbolic Representation Learning." Neural Information Processing Systems, 2020.
Markdown
[Law and Stam. "Ultrahyperbolic Representation Learning." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/law2020neurips-ultrahyperbolic/)
BibTeX
@inproceedings{law2020neurips-ultrahyperbolic,
  title     = {{Ultrahyperbolic Representation Learning}},
  author    = {Law, Marc and Stam, Jos},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/law2020neurips-ultrahyperbolic/}
}