Convex Relaxation for Solving Large-Margin Classifiers in Hyperbolic Space

Abstract

Hyperbolic spaces have increasingly been recognized for outperforming their Euclidean counterparts on data with inherent hierarchical structure. However, learning in hyperbolic spaces poses significant challenges. In particular, extending support vector machines to hyperbolic spaces is in general a constrained non-convex optimization problem. Previous popular attempts to solve hyperbolic SVMs, primarily via projected gradient descent, are generally sensitive to hyperparameters and initializations, often yielding suboptimal solutions. In this work, by first rewriting the problem as a polynomial optimization, we apply semidefinite relaxation and sparse moment-sum-of-squares relaxation to effectively approximate the optima. Extensive empirical experiments show that these methods achieve better classification accuracy than the projected gradient descent approach on most of the synthetic and real two-dimensional hyperbolic embedding datasets under the one-vs-rest multiclass-classification scheme.
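
To make the relaxation step concrete, below is a minimal sketch of a Shor-style semidefinite relaxation applied to a toy non-convex quadratically constrained problem involving a Minkowski inner product, assuming cvxpy and NumPy are available. The objective, constraints, and data are illustrative placeholders and not the paper's exact hyperbolic-SVM formulation; the sketch only shows the generic lifting idea (replace w w^T by a PSD matrix W and drop the rank-one constraint).

# Sketch of a Shor-style semidefinite relaxation of a toy non-convex
# quadratically constrained problem; placeholder data, not the paper's
# actual hyperbolic-SVM formulation.
import numpy as np
import cvxpy as cp

d = 3                                   # ambient dimension (2D hyperboloid sits in R^3)
G = np.diag([-1.0, 1.0, 1.0])           # Minkowski metric: <u, v>_L = u^T G v
rng = np.random.default_rng(0)
X = rng.normal(size=(5, d))             # placeholder "data" points
y = np.array([1, -1, 1, -1, 1])         # placeholder labels (unused beyond illustration)

# Lift w w^T -> W (PSD) and drop the rank-1 requirement: this is the convex relaxation.
W = cp.Variable((d, d), PSD=True)

# A non-convex quadratic margin surrogate <w, x_i>_L^2 >= 1 becomes linear in W:
# trace(A_i W) >= 1 with A_i = (G x_i)(G x_i)^T.
constraints = []
for i in range(len(y)):
    a_i = G @ X[i]
    A_i = np.outer(a_i, a_i)
    constraints.append(cp.trace(A_i @ W) >= 1.0)

# A quadratic objective w^T Q w becomes trace(Q W); Q here is a placeholder regularizer.
Q = np.eye(d)
prob = cp.Problem(cp.Minimize(cp.trace(Q @ W)), constraints)
prob.solve(solver=cp.SCS)

# Recover an approximate w from the leading eigenvector of the relaxed solution.
vals, vecs = np.linalg.eigh(W.value)
w_hat = np.sqrt(max(vals[-1], 0.0)) * vecs[:, -1]
print("relaxed objective:", prob.value)
print("recovered w:", w_hat)

If the relaxed solution W happens to be (numerically) rank one, the recovered w_hat solves the original non-convex problem exactly; otherwise the eigenvector rounding gives an approximate solution, which is the general behavior such relaxations trade on.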

Cite

Text

Yang et al. "Convex Relaxation for Solving Large-Margin Classifiers in Hyperbolic Space." Transactions on Machine Learning Research, 2025.

Markdown

[Yang et al. "Convex Relaxation for Solving Large-Margin Classifiers in Hyperbolic Space." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/yang2025tmlr-convex/)

BibTeX

@article{yang2025tmlr-convex,
  title     = {{Convex Relaxation for Solving Large-Margin Classifiers in Hyperbolic Space}},
  author    = {Yang, Sheng and Liu, Peihan and Pehlevan, Cengiz},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/yang2025tmlr-convex/}
}