Hyperbolic Embeddings of Supervised Models

Abstract

Models of hyperbolic geometry have been successfully used in ML for two main tasks: embedding models in unsupervised learning (e.g., hierarchies) and embedding data. To our knowledge, however, no approach provides embeddings for supervised models, even though hyperbolic geometry offers convenient properties for expressing popular hypothesis classes such as decision trees (and their ensembles). In this paper, we propose a full-fledged solution to the problem in three independent contributions: the first links the theory of losses for class probability estimation to hyperbolic embeddings in the Poincaré disk model; the second resolves an obstacle to a clean, unambiguous embedding of (ensembles of) decision trees in this model; the third shows how to smoothly tweak the Poincaré hyperbolic distance to improve its encoding and visualization properties near the border of the disk, a region crucial for our application, while preserving hyperbolicity. This last step is of substantial independent interest, as it is grounded in a generalization of the Leibniz–Newton fundamental theorem of calculus.
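
The abstract refers to the Poincaré hyperbolic distance on the open unit disk and to a smooth tweak of it near the border. As a concrete anchor, below is a minimal Python sketch of the standard Poincaré disk distance, plus one hypothetical smooth monotone rescaling; the rescaling function and its parameter alpha are illustrative assumptions, not the paper's actual construction.

import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Standard hyperbolic distance between u and v in the open unit disk."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    # arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    return np.arccosh(1.0 + 2.0 * sq / max(denom, eps))

def tweaked_distance(u, v, alpha=0.5):
    """Hypothetical smooth, increasing rescaling that dampens the blow-up
    near the disk border; NOT the paper's tweak, just an illustration."""
    d = poincare_distance(u, v)
    return np.log1p(alpha * d) / alpha  # smooth, monotone, equals 0 at d = 0

print(poincare_distance([0.0, 0.0], [0.9, 0.0]))  # grows quickly near the border
print(tweaked_distance([0.0, 0.0], [0.9, 0.0]))   # same ordering, compressed scale

Any smooth, strictly increasing rescaling of this kind preserves the ordering of distances; whether it preserves hyperbolicity, as the paper's construction does, depends on the specific function chosen.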

Cite

Text

Nock et al. "Hyperbolic Embeddings of Supervised Models." Neural Information Processing Systems, 2024. doi:10.52202/079017-4472

Markdown

[Nock et al. "Hyperbolic Embeddings of Supervised Models." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/nock2024neurips-hyperbolic/) doi:10.52202/079017-4472

BibTeX

@inproceedings{nock2024neurips-hyperbolic,
  title     = {{Hyperbolic Embeddings of Supervised Models}},
  author    = {Nock, Richard and Amid, Ehsan and Nielsen, Frank and Soen, Alexander and Warmuth, Manfred K.},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-4472},
  url       = {https://mlanthology.org/neurips/2024/nock2024neurips-hyperbolic/}
}