Amortized Fourier Neural Operators

Abstract

Fourier Neural Operators (FNOs) have shown promise for solving partial differential equations (PDEs). Typically, FNOs employ separate parameters for different frequency modes to specify tunable kernel integrals in Fourier space, which results in an undesirably large number of parameters when solving high-dimensional PDEs. A workaround is to discard frequency modes beyond a predefined threshold, but this limits the FNOs' ability to represent high-frequency details and poses non-trivial challenges for hyper-parameter specification. To address these issues, we propose the AMortized Fourier Neural Operator (AM-FNO), which deploys an amortized neural parameterization of the kernel function to accommodate arbitrarily many frequency modes with a fixed number of parameters. We introduce two implementations of AM-FNO: one based on the recently proposed Kolmogorov–Arnold Network (KAN) and the other on Multi-Layer Perceptrons (MLPs) equipped with orthogonal embedding functions. We extensively evaluate our method on diverse datasets from various domains and observe up to 31% average improvement over competing neural operator baselines.
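The amortization idea in the abstract can be sketched as follows: instead of storing a separate complex weight tensor for every retained frequency mode, a small network maps an embedding of the frequency index to that mode's kernel weights, so the parameter count stays fixed no matter how many modes are kept. Below is a minimal NumPy sketch under that assumption; the function names, the sinusoidal embedding, and the 2-layer MLP are illustrative choices, not the paper's exact implementation (which uses a KAN or an MLP with orthogonal embedding functions).

```python
import numpy as np

def mlp_kernel(freq_embed, W1, b1, W2, b2):
    """Hypothetical 2-layer MLP mapping a frequency embedding to a
    flattened complex (channels x channels) kernel (real/imag halves)."""
    h = np.tanh(freq_embed @ W1 + b1)
    out = h @ W2 + b2
    half = out.shape[-1] // 2
    return out[..., :half] + 1j * out[..., half:]

def am_spectral_conv1d(u, n_modes, params, channels):
    """One amortized spectral convolution layer on u: (channels, n_points).

    The same fixed `params` serve any number of modes: the MLP is simply
    queried at more frequency indices, so parameters do not grow with n_modes.
    """
    n = u.shape[-1]
    u_hat = np.fft.rfft(u, axis=-1)            # (channels, n//2 + 1)
    k = np.arange(n_modes)
    # Simple sinusoidal embedding of the frequency index (an assumption).
    embed = np.stack([np.sin(0.1 * k), np.cos(0.1 * k)], axis=-1)
    K = mlp_kernel(embed, *params).reshape(n_modes, channels, channels)
    out_hat = np.zeros_like(u_hat)
    # Per-mode channel mixing: out[o, k] = sum_c K[k, o, c] * u_hat[c, k]
    out_hat[:, :n_modes] = np.einsum('koc,ck->ok', K, u_hat[:, :n_modes])
    return np.fft.irfft(out_hat, n=n, axis=-1)

# Demo: the same parameter set handles 12 or 30 modes unchanged.
rng = np.random.default_rng(0)
c = 2
params = (rng.standard_normal((2, 8)) * 0.1, np.zeros(8),
          rng.standard_normal((8, 2 * c * c)) * 0.1, np.zeros(2 * c * c))
u = rng.standard_normal((c, 64))
y_low = am_spectral_conv1d(u, 12, params, c)   # few modes
y_high = am_spectral_conv1d(u, 30, params, c)  # many modes, same params
```

Note that in a conventional FNO the weight tensor has shape `(n_modes, channels, channels)` and thus grows linearly with the mode cutoff; here only the MLP's weights are learned.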

Cite

Text

Xiao et al. "Amortized Fourier Neural Operators." Neural Information Processing Systems, 2024. doi:10.52202/079017-3651

Markdown

[Xiao et al. "Amortized Fourier Neural Operators." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/xiao2024neurips-amortized/) doi:10.52202/079017-3651

BibTeX

@inproceedings{xiao2024neurips-amortized,
  title     = {{Amortized Fourier Neural Operators}},
  author    = {Xiao, Zipeng and Kou, Siqi and Hao, Zhongkai and Lin, Bokai and Deng, Zhijie},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-3651},
  url       = {https://mlanthology.org/neurips/2024/xiao2024neurips-amortized/}
}