Maximal Update Parametrization and Zero-Shot Hyperparameter Transfer for Fourier Neural Operators

Abstract

Fourier Neural Operators (FNOs) offer a principled approach to solving complex partial differential equations (PDEs). However, scaling them to more complex PDEs requires increasing the number of Fourier modes, which significantly expands the number of model parameters and makes hyperparameter tuning computationally impractical. To address this, we introduce $\mu$Transfer-FNO, a zero-shot hyperparameter transfer technique that enables optimal configurations tuned on small FNOs to be directly applied to billion-parameter FNOs without additional tuning. Building on the Maximal Update Parametrization ($\mu$P) framework, we mathematically derive a parametrization scheme under which optimal hyperparameters transfer across FNOs with different numbers of Fourier modes, and we validate the scheme through extensive experiments on various PDEs. Our empirical study shows that $\mu$Transfer-FNO reduces the computational cost of hyperparameter tuning for large FNOs while maintaining or improving accuracy.
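The abstract does not spell out the derived parametrization itself, but the general idea of $\mu$P-style schemes is to scale weight initialization (and, in practice, per-layer learning rates) with fan-in so that activation and update magnitudes stay stable as the model grows. The sketch below shows a minimal FNO spectral convolution layer in PyTorch with such a fan-in-scaled initialization; the `scale` factor here is an illustrative assumption for exposition, not the paper's derived scheme, which also accounts for the number of Fourier modes.

```python
import torch
import torch.nn as nn


class SpectralConv1d(nn.Module):
    """1D Fourier layer with a muP-style fan-in-scaled initialization.

    Illustrative only: the exact parametrization enabling zero-shot
    hyperparameter transfer is derived in the paper.
    """

    def __init__(self, in_channels: int, out_channels: int, n_modes: int):
        super().__init__()
        self.n_modes = n_modes
        # Hypothetical muP-style scale: variance ~ 1/fan_in keeps the layer's
        # output magnitude O(1) as the channel width grows.
        scale = (1.0 / in_channels) ** 0.5
        self.weight = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, n_modes,
                                dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, n_points)
        x_ft = torch.fft.rfft(x)  # (batch, in_channels, n_points // 2 + 1)
        out_ft = torch.zeros(
            x.shape[0], self.weight.shape[1], x_ft.shape[-1],
            dtype=torch.cfloat, device=x.device,
        )
        # Mix channels on the lowest n_modes frequencies only
        # (assumes n_modes <= n_points // 2 + 1).
        out_ft[..., : self.n_modes] = torch.einsum(
            "bim,iom->bom", x_ft[..., : self.n_modes], self.weight
        )
        return torch.fft.irfft(out_ft, n=x.shape[-1])
```

Under this kind of parametrization, hyperparameters such as the learning rate tuned on a small proxy (few channels, few modes) are intended to remain near-optimal when the width and mode count are scaled up, which is what makes the zero-shot transfer possible.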

Cite

Text

Li et al. "Maximal Update Parametrization and Zero-Shot Hyperparameter Transfer for Fourier Neural Operators." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Li et al. "Maximal Update Parametrization and Zero-Shot Hyperparameter Transfer for Fourier Neural Operators." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/li2025icml-maximal/)

BibTeX

@inproceedings{li2025icml-maximal,
  title     = {{Maximal Update Parametrization and Zero-Shot Hyperparameter Transfer for Fourier Neural Operators}},
  author    = {Li, Shanda and Yoo, Shinjae and Yang, Yiming},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {36707--36721},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/li2025icml-maximal/}
}