Differentiating Through the Fréchet Mean
Abstract
Recent advances in deep representation learning on Riemannian manifolds extend classical deep learning operations to better capture the geometry of the manifold. One possible extension is the Fréchet mean, the generalization of the Euclidean mean; however, it has been difficult to apply because it lacks a closed form with an easily computable derivative. In this paper, we show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds. Then, focusing on hyperbolic space, we derive explicit gradient expressions and a fast, accurate, and hyperparameter-free Fréchet mean solver. This fully integrates the Fréchet mean into the hyperbolic neural network pipeline. To demonstrate this integration, we present two case studies. First, we apply our Fréchet mean to the existing Hyperbolic Graph Convolutional Network, replacing its projected aggregation to obtain state-of-the-art results on datasets with high hyperbolicity. Second, to demonstrate the Fréchet mean's capacity to generalize Euclidean neural network operations, we develop a hyperbolic batch normalization method that gives an improvement parallel to the one observed in the Euclidean setting.
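The Fréchet mean of points on a manifold is the minimizer of the sum of squared geodesic distances, which on the Poincaré ball model of hyperbolic space can be found iteratively. The sketch below is not the paper's fast, hyperparameter-free solver; it is a minimal illustration using naive gradient descent with finite-difference gradients, with function names (`poincare_dist`, `frechet_mean`) chosen here for illustration.

```python
import numpy as np

def poincare_dist(x, y):
    """Geodesic distance between points x, y inside the unit (Poincaré) ball."""
    num = 2.0 * np.sum((x - y) ** 2)
    den = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + num / den)

def frechet_objective(mu, points):
    """Sum of squared geodesic distances from mu to the given points."""
    return sum(poincare_dist(mu, p) ** 2 for p in points)

def frechet_mean(points, lr=0.05, steps=500, eps=1e-6):
    """Naive gradient-descent sketch of the Fréchet mean (not the paper's solver)."""
    mu = np.mean(points, axis=0)  # Euclidean mean as initialization (inside the ball)
    for _ in range(steps):
        # Central finite-difference gradient of the objective at mu.
        grad = np.zeros_like(mu)
        for i in range(len(mu)):
            d = np.zeros_like(mu)
            d[i] = eps
            grad[i] = (frechet_objective(mu + d, points)
                       - frechet_objective(mu - d, points)) / (2.0 * eps)
        mu = mu - lr * grad
        # Keep the iterate strictly inside the unit ball.
        n = np.linalg.norm(mu)
        if n >= 1.0:
            mu = mu / n * (1.0 - 1e-5)
    return mu
```

In contrast to this fixed-step, finite-difference sketch, the paper's contribution is an exact gradient expression (enabling backpropagation through the mean) together with a solver that needs no learning rate at all.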
Cite
Text
Lou et al. "Differentiating Through the Fréchet Mean." International Conference on Machine Learning, 2020.

Markdown

[Lou et al. "Differentiating Through the Fréchet Mean." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/lou2020icml-differentiating/)

BibTeX
@inproceedings{lou2020icml-differentiating,
title = {{Differentiating Through the Fréchet Mean}},
author = {Lou, Aaron and Katsman, Isay and Jiang, Qingxuan and Belongie, Serge and Lim, Ser-Nam and De Sa, Christopher},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {6393--6403},
volume = {119},
url = {https://mlanthology.org/icml/2020/lou2020icml-differentiating/}
}