Efficient Fourier Neural Operators by Group Convolution and Channel Shuffling
Abstract
Fourier neural operators (FNOs) have emerged as data-driven alternatives to conventional numerical simulators for solving partial differential equations (PDEs). However, these models typically require a substantial number of learnable parameters. In this study, we explore parameter-efficient FNO architectures through modifications of their width and depth and through the application of group convolution and channel shuffling. We benchmark on problems of learning the solution operators of Maxwell's equations and the Darcy flow equation. Our approach yields significant improvements in prediction accuracy for both small and large FNO models. The proposed methods are broadly applicable across problem types and neural operator architectures.
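The two ingredients named in the title can be illustrated in isolation. The sketch below is a minimal NumPy illustration, not the authors' implementation: `channel_shuffle` permutes channels so information can flow between groups, and `grouped_spectral_mix` replaces the dense per-mode channel-mixing weight of an FNO spectral layer with a block-diagonal (grouped) one, cutting the parameter count by the group factor. Function names and shapes are assumptions for illustration.

```python
import numpy as np

def channel_shuffle(x, groups):
    """Interleave channels across groups (as in ShuffleNet-style shuffling).

    x: array of shape (batch, channels, n); channels must be divisible by groups.
    """
    b, c, n = x.shape
    assert c % groups == 0
    # Reshape to (batch, groups, channels_per_group, n), swap the two channel
    # axes, then flatten back: channel i*groups + g comes from group g, slot i.
    return x.reshape(b, groups, c // groups, n).transpose(0, 2, 1, 3).reshape(b, c, n)

def grouped_spectral_mix(x_hat, weights):
    """Block-diagonal channel mixing of truncated Fourier coefficients.

    x_hat:   (batch, channels, modes) complex coefficients.
    weights: (groups, c_per_group, c_per_group, modes); each group of channels
             is mixed independently, using channels**2 * modes / groups
             parameters instead of the dense channels**2 * modes of a
             standard FNO spectral weight.
    """
    b, c, m = x_hat.shape
    g, cg, _, _ = weights.shape
    assert c == g * cg
    xg = x_hat.reshape(b, g, cg, m)
    out = np.einsum("bgim,goim->bgom", xg, weights)
    return out.reshape(b, c, m)
```

Stacking a grouped spectral layer followed by a channel shuffle mimics the ShuffleNet recipe inside the operator: the shuffle is parameter-free, so the only cost of the parameter savings is the permutation.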
Cite
Text
Kim et al. "Efficient Fourier Neural Operators by Group Convolution and Channel Shuffling." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.
Markdown
[Kim et al. "Efficient Fourier Neural Operators by Group Convolution and Channel Shuffling." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.](https://mlanthology.org/iclrw/2024/kim2024iclrw-efficient/)
BibTeX
@inproceedings{kim2024iclrw-efficient,
title = {{Efficient Fourier Neural Operators by Group Convolution and Channel Shuffling}},
author = {Kim, Myungjoon and Park, Junhyung and Shin, Jonghwa},
booktitle = {ICLR 2024 Workshops: AI4DiffEqtnsInSci},
year = {2024},
url = {https://mlanthology.org/iclrw/2024/kim2024iclrw-efficient/}
}