From Activation to Initialization: Scaling Insights for Optimizing Neural Fields
Abstract
In the realm of computer vision, Neural Fields have gained prominence as a contemporary tool that harnesses neural networks for signal representation. Despite the remarkable progress in adapting these networks to solve a variety of problems, the field still lacks a comprehensive theoretical framework. This article aims to address this gap by delving into the intricate interplay between initialization and activation, providing a foundational basis for the robust optimization of Neural Fields. Our theoretical insights reveal a deep-seated connection among network initialization, architectural choices, and the optimization process, emphasizing the need for a holistic approach when designing cutting-edge Neural Fields.
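To make the initialization/activation interplay concrete, here is a minimal sketch of a coordinate MLP ("neural field") whose weight initialization is tied to its activation, in the spirit of SIREN-style schemes. This is a generic illustration only, not the paper's proposed initialization; the layer widths and the frequency scale `omega0` are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
omega0 = 30.0            # first-layer frequency scale (assumed value, SIREN convention)
widths = [1, 64, 64, 1]  # coordinates -> hidden -> hidden -> signal

def init_layer(fan_in, first):
    # SIREN-style bound: uniform in [-b, b] with b depending on fan-in,
    # chosen so pre-activations stay in a range where sin(.) is
    # well-conditioned. Other activations call for different scalings,
    # which is precisely the coupling the paper analyzes.
    return 1.0 / fan_in if first else np.sqrt(6.0 / fan_in) / omega0

layers = []
for i in range(len(widths) - 1):
    b = init_layer(widths[i], first=(i == 0))
    layers.append(rng.uniform(-b, b, size=(widths[i], widths[i + 1])))

def field(x):
    # x: (N, 1) coordinates in [-1, 1]; returns (N, 1) signal values.
    h = x
    for i, W in enumerate(layers[:-1]):
        scale = omega0 if i == 0 else 1.0
        h = np.sin(scale * (h @ W))
    return h @ layers[-1]

coords = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
out = field(coords)
print(out.shape)  # (5, 1)
```

The point of the sketch is only that the initialization bounds are derived from the activation (here, `sin`) rather than chosen independently of it.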
Cite
Text
Saratchandran et al. "From Activation to Initialization: Scaling Insights for Optimizing Neural Fields." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.00047
Markdown
[Saratchandran et al. "From Activation to Initialization: Scaling Insights for Optimizing Neural Fields." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/saratchandran2024cvpr-activation/) doi:10.1109/CVPR52733.2024.00047
BibTeX
@inproceedings{saratchandran2024cvpr-activation,
title = {{From Activation to Initialization: Scaling Insights for Optimizing Neural Fields}},
author = {Saratchandran, Hemanth and Ramasinghe, Sameera and Lucey, Simon},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2024},
pages = {413--422},
doi = {10.1109/CVPR52733.2024.00047},
url = {https://mlanthology.org/cvpr/2024/saratchandran2024cvpr-activation/}
}