Hyperbolic Kernel Convolution: A Generic Framework
Abstract
The past several years have witnessed rapid advancements in hyperbolic neural networks. However, it is challenging to learn good hyperbolic representations since common Euclidean neural operations, such as convolution, do not extend to the hyperbolic space. Most hyperbolic neural networks omit the convolution operation and cannot effectively extract local patterns. Others either use only non-hyperbolic convolution or lack essential properties such as permutation equivariance. We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space, then aggregates the output features within a local neighborhood. HKConv is a generic framework in which any coordinate model of the hyperbolic space can be flexibly used. We show that neural networks with HKConv layers advance the state of the art in various tasks. The code of our implementation is available at https://github.com/BruceZhangReve/Hyperbolic-Kernel-Convolution
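The core idea described above (correlating neighbor features with fixed kernel points in hyperbolic space, then aggregating within a neighborhood) can be illustrated with a minimal sketch. This is not the authors' implementation: the distance-based kernel weighting, the per-kernel linear maps, the Euclidean mean aggregation, and all function names here are simplifying assumptions; the paper's method operates with proper hyperbolic operations throughout.

```python
import numpy as np

def poincare_dist(x, y, eps=1e-9):
    # Geodesic distance between two points in the Poincare ball model.
    sq = np.sum((x - y) ** 2)
    denom = (1 - np.sum(x ** 2)) * (1 - np.sum(y ** 2))
    return np.arccosh(1 + 2 * sq / (denom + eps))

def hkconv_sketch(node_feats, neighbors, kernel_pts, weights):
    """Toy hyperbolic kernel convolution (hypothetical simplification).

    node_feats: (N, d) array of points inside the Poincare ball
    neighbors:  dict mapping node index -> list of neighbor indices
    kernel_pts: (K, d) fixed kernel points inside the ball
    weights:    (K, d, d) trainable matrices, one per kernel point
    """
    N, d = node_feats.shape
    K = kernel_pts.shape[0]
    out = np.zeros((N, d))
    for i in range(N):
        agg = np.zeros(d)
        for j in neighbors[i]:
            # Correlate the neighbor feature with each kernel point:
            # closer kernel points get larger (softmax-like) weights.
            w = np.array([np.exp(-poincare_dist(node_feats[j], kernel_pts[k]))
                          for k in range(K)])
            w /= w.sum()
            # Weighted combination of the per-kernel linear transforms.
            agg += sum(w[k] * weights[k] @ node_feats[j] for k in range(K))
        # Aggregate over the local neighborhood (plain mean here for simplicity).
        out[i] = agg / max(len(neighbors[i]), 1)
    return out
```

In the actual framework, the feature transforms and aggregation would themselves be hyperbolic operations (e.g. via exponential/logarithmic maps in the chosen coordinate model), which this Euclidean sketch deliberately elides.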
Cite
Text
Qu et al. "Hyperbolic Kernel Convolution: A Generic Framework." Proceedings of the Third Learning on Graphs Conference, 2025.
Markdown
[Qu et al. "Hyperbolic Kernel Convolution: A Generic Framework." Proceedings of the Third Learning on Graphs Conference, 2025.](https://mlanthology.org/log/2025/qu2025log-hyperbolic/)
BibTeX
@inproceedings{qu2025log-hyperbolic,
title = {{Hyperbolic Kernel Convolution: A Generic Framework}},
author = {Qu, Eric and Zhang, Lige and Debaya, Habib and Wu, Yue and Zou, Dongmian},
booktitle = {Proceedings of the Third Learning on Graphs Conference},
year = {2025},
pages = {25:1-25:25},
volume = {269},
url = {https://mlanthology.org/log/2025/qu2025log-hyperbolic/}
}