Convergence of Manifold Filter-Combine Networks
Abstract
In order to better understand manifold neural networks (MNNs), we introduce Manifold Filter-Combine Networks (MFCNs). The filter-combine framework parallels the popular aggregate-combine paradigm for graph neural networks (GNNs) and naturally suggests many interesting families of MNNs which can be interpreted as the manifold analog of various popular GNNs. We then propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating the manifold by a sparse graph. We prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity.
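To make the abstract's pipeline concrete, here is a minimal sketch of the graph-based approximation it describes: build a sparse k-NN graph from a point cloud, form its Laplacian, and apply a filter-then-combine layer. The Gaussian edge weights, polynomial filter bank, and random combine matrix are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def knn_graph_laplacian(X, k=8):
    """Approximate the manifold underlying point cloud X (n x d)
    by a symmetrized k-NN graph; return its unnormalized Laplacian."""
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:k + 1]        # k nearest neighbors, skip self
        W[i, idx] = np.exp(-D[i, idx])         # Gaussian edge weights (assumption)
    W = np.maximum(W, W.T)                     # symmetrize
    return np.diag(W.sum(axis=1)) - W

def mfcn_layer(F, L, filter_coeffs, Theta):
    """One filter-combine step: apply each polynomial filter p_j(L)
    to the node features F, stack the results, then combine with Theta
    and a pointwise ReLU. Polynomial filters stand in for generic
    spectral filters here."""
    outs = []
    for coeffs in filter_coeffs:
        H = np.zeros_like(L)
        P = np.eye(L.shape[0])
        for c in coeffs:                       # Horner-free evaluation of p_j(L)
            H += c * P
            P = P @ L
        outs.append(H @ F)
    Z = np.concatenate(outs, axis=1)           # filtered channels, side by side
    return np.maximum(Z @ Theta, 0.0)          # combine across channels + ReLU

# Usage: a noisy point cloud with 2 feature channels, 2 filters, 3 outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
L = knn_graph_laplacian(X, k=5)
F = rng.normal(size=(50, 2))
filters = [[1.0], [0.0, 1.0]]                  # identity filter and L itself
Theta = rng.normal(size=(4, 3))
out = mfcn_layer(F, L, filters, Theta)
```

As the number of sample points grows, the graph Laplacian built this way converges (under suitable sampling assumptions) to the Laplace-Beltrami operator of the manifold, which is what underlies the consistency result stated in the abstract.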
Cite
Text
Johnson et al. "Convergence of Manifold Filter-Combine Networks." NeurIPS 2024 Workshops: NeurReps, 2024.
Markdown
[Johnson et al. "Convergence of Manifold Filter-Combine Networks." NeurIPS 2024 Workshops: NeurReps, 2024.](https://mlanthology.org/neuripsw/2024/johnson2024neuripsw-convergence/)
BibTeX
@inproceedings{johnson2024neuripsw-convergence,
title = {{Convergence of Manifold Filter-Combine Networks}},
author = {Johnson, David R and Chew, Joyce and Viswanath, Siddharth and De Brouwer, Edward and Needell, Deanna and Krishnaswamy, Smita and Perlmutter, Michael},
booktitle = {NeurIPS 2024 Workshops: NeurReps},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/johnson2024neuripsw-convergence/}
}