SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization

Abstract

Multilayer perceptrons (MLPs) are known to struggle to learn functions of high frequencies, and in particular instances with wide frequency bands. We present a progressive mapping scheme for input signals of MLP networks, enabling them to better fit a wide range of frequencies without sacrificing training stability or requiring any domain-specific preprocessing. We introduce Spatially Adaptive Progressive Encoding (SAPE) layers, which gradually unmask signal components of increasing frequencies as a function of time and space. The progressive exposure of frequencies is monitored by a feedback loop throughout the neural optimization process, allowing changes to propagate at different rates among local spatial portions of the signal space. We demonstrate the advantage of our method on a variety of domains and applications: regression of low-dimensional signals and images, representation learning of occupancy networks, and a geometric task of mesh transfer between 3D shapes.
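The core idea described in the abstract — soft-masking higher-frequency Fourier features until a per-point progress signal unlocks them — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation; the `progress` argument stands in for the spatially adaptive feedback signal (e.g., one driven by local reconstruction error), and the linear ramp used to open each band is an assumption.

```python
import numpy as np

def progressive_encoding(x, num_freqs, progress):
    """Fourier-feature encoding of coordinates x, with higher-frequency
    bands soft-masked until `progress` unlocks them.

    x         : (n, d) array of input coordinates
    num_freqs : number of frequency octaves
    progress  : (n,) per-point progress in [0, num_freqs]; spatially
                adaptive when it varies per point (hypothetical feedback
                signal, e.g. driven by local fitting error)
    """
    feats = [x]  # the raw coordinates are always exposed
    for j in range(num_freqs):
        # linear ramp: band j is fully open once progress >= j + 1
        alpha = np.clip(progress - j, 0.0, 1.0)[:, None]  # (n, 1)
        phase = (2.0 ** j) * np.pi * x                    # (n, d)
        feats.append(alpha * np.sin(phase))
        feats.append(alpha * np.cos(phase))
    return np.concatenate(feats, axis=1)  # (n, d + 2 * num_freqs * d)
```

With `progress = 0` everywhere, only the raw coordinates pass through (all sinusoidal features are zeroed); as training proceeds, regions whose feedback signal advances faster expose their high-frequency bands earlier than others.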

Cite

Text

Hertz et al. "SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization." Neural Information Processing Systems, 2021.

Markdown

[Hertz et al. "SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/hertz2021neurips-sape/)

BibTeX

@inproceedings{hertz2021neurips-sape,
  title     = {{SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization}},
  author    = {Hertz, Amir and Perel, Or and Giryes, Raja and Sorkine-Hornung, Olga and Cohen-Or, Daniel},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/hertz2021neurips-sape/}
}