Representation Tradeoffs for Hyperbolic Embeddings
Abstract
Hyperbolic embeddings offer excellent quality with few dimensions when embedding hierarchical data structures. We give a combinatorial construction that embeds trees into hyperbolic space with arbitrarily low distortion without optimization. On WordNet, this algorithm obtains a mean-average-precision of 0.989 with only two dimensions, outperforming existing work by 0.11 points. We provide bounds characterizing the precision-dimensionality tradeoff inherent in any hyperbolic embedding. To embed general metric spaces, we propose a hyperbolic generalization of multidimensional scaling (h-MDS). We show how to perform exact recovery of hyperbolic points from distances, provide a perturbation analysis, and give a recovery result that enables us to reduce dimensionality. Finally, we extract lessons from the algorithms and theory above to design a scalable PyTorch-based implementation that can handle incomplete information.
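As a concrete illustration of the objects the abstract discusses (a sketch, not the paper's actual code), the Poincaré-disk distance and a Sarkar-style placement for a depth-1 tree can be written in a few lines; `embed_star` is a hypothetical helper name showing the key fact that a hyperbolic radius `r` corresponds to Euclidean radius `tanh(r/2)` in the disk:

```python
import numpy as np

def poincare_distance(x, y):
    """Hyperbolic distance between two points in the open unit ball (Poincare disk model)."""
    sq_dist = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x * x)) * (1.0 - np.sum(y * y))
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)

def embed_star(num_children, r):
    """Place a depth-1 tree: root at the origin, children equally spaced
    on a circle of hyperbolic radius r (Euclidean radius tanh(r / 2))."""
    angles = 2.0 * np.pi * np.arange(num_children) / num_children
    radius = np.tanh(r / 2.0)
    children = radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    return np.zeros(2), children
```

Each root-to-child hyperbolic distance equals `r` exactly, while sibling-to-sibling distances approach the tree distance `2r` as `r` grows; scaling `r` up is the mechanism that lets combinatorial constructions of this kind drive distortion arbitrarily low without any optimization.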
Cite

Text

Sala et al. "Representation Tradeoffs for Hyperbolic Embeddings." International Conference on Machine Learning, 2018.

Markdown

[Sala et al. "Representation Tradeoffs for Hyperbolic Embeddings." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/sala2018icml-representation/)

BibTeX
@inproceedings{sala2018icml-representation,
title = {{Representation Tradeoffs for Hyperbolic Embeddings}},
author = {Sala, Frederic and De Sa, Chris and Gu, Albert and Re, Christopher},
booktitle = {International Conference on Machine Learning},
year = {2018},
pages = {4460--4469},
volume = {80},
url = {https://mlanthology.org/icml/2018/sala2018icml-representation/}
}