Neural Splines: Fitting 3D Surfaces with Infinitely-Wide Neural Networks
Abstract
We present Neural Splines, a technique for 3D surface reconstruction that is based on random feature kernels arising from infinitely-wide shallow ReLU networks. Our method achieves state-of-the-art results, outperforming recent neural network-based techniques and widely used Poisson Surface Reconstruction (which, as we demonstrate, can also be viewed as a type of kernel method). Because our approach is based on a simple kernel formulation, it is easy to analyze and can be accelerated by general techniques designed for kernel-based learning. We provide explicit analytical expressions for our kernel and argue that our formulation can be seen as a generalization of cubic spline interpolation to higher dimensions. In particular, the RKHS norm associated with Neural Splines biases toward smooth interpolants.
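To make the kernel formulation concrete, below is a minimal, hypothetical sketch of implicit surface fitting by kernel ridge regression. It uses the first-order arc-cosine kernel (Cho & Saul, 2009) as a stand-in for the infinite-width shallow ReLU kernel, and the common setup of constraining the function to zero on the input points and to ±eps at points offset along the normals; the function names, offsets, and regularization are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch: kernel ridge regression with the first-order arc-cosine
# kernel as a stand-in for the infinite-width shallow ReLU kernel.
import numpy as np

def arccos_kernel(X, Y):
    """First-order arc-cosine kernel between rows of X (n, d) and Y (m, d)."""
    nx = np.linalg.norm(X, axis=1, keepdims=True)                    # (n, 1)
    ny = np.linalg.norm(Y, axis=1, keepdims=True)                    # (m, 1)
    cos = np.clip((X @ Y.T) / (nx * ny.T + 1e-12), -1.0, 1.0)
    theta = np.arccos(cos)
    return (nx * ny.T) / np.pi * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def fit_implicit(points, normals, eps=0.01, reg=1e-6):
    """Fit an implicit function f with f = 0 on the surface and f = +/-eps at
    points offset along the oriented normals, via kernel ridge regression."""
    X = np.concatenate([points, points + eps * normals, points - eps * normals])
    y = np.concatenate([np.zeros(len(points)),
                        np.full(len(points), eps),
                        np.full(len(points), -eps)])
    K = arccos_kernel(X, X)
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), y)
    return lambda Q: arccos_kernel(Q, X) @ alpha  # evaluate f at query points Q

# The reconstructed surface is the zero level set of f, extracted e.g. with
# marching cubes on a regular grid of query points.
```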
Cite
Text
Williams et al. "Neural Splines: Fitting 3D Surfaces with Infinitely-Wide Neural Networks." Conference on Computer Vision and Pattern Recognition, 2021. doi:10.1109/CVPR46437.2021.00982
Markdown
[Williams et al. "Neural Splines: Fitting 3D Surfaces with Infinitely-Wide Neural Networks." Conference on Computer Vision and Pattern Recognition, 2021.](https://mlanthology.org/cvpr/2021/williams2021cvpr-neural/) doi:10.1109/CVPR46437.2021.00982
BibTeX
@inproceedings{williams2021cvpr-neural,
title = {{Neural Splines: Fitting 3D Surfaces with Infinitely-Wide Neural Networks}},
author = {Williams, Francis and Trager, Matthew and Bruna, Joan and Zorin, Denis},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2021},
pages = {9949-9958},
doi = {10.1109/CVPR46437.2021.00982},
url = {https://mlanthology.org/cvpr/2021/williams2021cvpr-neural/}
}