SepONet: Efficient Large-Scale Physics-Informed Operator Learning
Abstract
We introduce Separable Operator Networks (SepONet), a novel framework that significantly enhances the efficiency of physics-informed operator learning. SepONet uses independent trunk networks to learn basis functions separately for different coordinate axes, enabling faster and more memory-efficient training via forward-mode automatic differentiation. We provide a universal approximation theorem for SepONet, proving that it generalizes to arbitrary operator learning problems, and then validate its performance through comprehensive benchmarking against physics-informed DeepONet. Our results demonstrate SepONet's superior performance across a variety of nonlinear and inseparable PDEs, with its advantages growing with problem complexity, dimension, and scale. Open source code is available at https://github.com/HewlettPackard/separable-operator-networks.
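To make the separable-trunk idea concrete, here is a minimal JAX sketch of how per-axis trunk networks, an outer-product combination, and forward-mode differentiation could fit together. This is an illustration under assumed details, not the authors' implementation; names such as `init_trunk`, `trunk_apply`, and the rank `r` are hypothetical.

```python
# Minimal sketch of a separable trunk, assuming a JAX setting.
# Each coordinate axis gets its own small MLP that maps a batch of
# 1D coordinates to r basis-function values.
import jax
import jax.numpy as jnp

r = 16          # number of basis functions (rank); illustrative choice
width = 64      # hidden width of each per-axis trunk MLP

def init_trunk(key, width, r):
    """Parameters for one per-axis trunk MLP: R -> R^r."""
    k1, k2 = jax.random.split(key)
    return {
        "W1": jax.random.normal(k1, (1, width)),
        "b1": jnp.zeros(width),
        "W2": jax.random.normal(k2, (width, r)) / jnp.sqrt(width),
        "b2": jnp.zeros(r),
    }

def trunk_apply(params, x):
    """Evaluate one trunk on a batch of 1D coordinates: (n,) -> (n, r)."""
    h = jnp.tanh(x[:, None] @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

key = jax.random.PRNGKey(0)
kx, kt = jax.random.split(key)
trunk_x = init_trunk(kx, width, r)   # basis functions along x
trunk_t = init_trunk(kt, width, r)   # basis functions along t

def u(coeffs, xs, ts):
    """Predicted field on the full n_x x n_t grid from 1D evaluations only.

    coeffs: (r,) branch-network output for one input function.
    """
    phi_x = trunk_apply(trunk_x, xs)          # (n_x, r)
    phi_t = trunk_apply(trunk_t, ts)          # (n_t, r)
    return jnp.einsum("r,ir,jr->ij", coeffs, phi_x, phi_t)

# Forward-mode AD along one axis: since u[i, j] depends on ts only
# through ts[j], a single jvp with an all-ones tangent yields du/dt
# at every grid point in one forward pass.
coeffs = jax.random.normal(key, (r,))
xs = jnp.linspace(0.0, 1.0, 128)
ts = jnp.linspace(0.0, 1.0, 128)
u_grid, du_dt = jax.jvp(lambda t: u(coeffs, xs, t), (ts,), (jnp.ones_like(ts),))
```

Note that each trunk only ever sees 1D inputs, so an n_x-by-n_t grid of collocation points costs n_x + n_t trunk evaluations rather than n_x * n_t, and the `jvp` produces the time derivative over the whole grid in a single forward pass; under these assumptions, this is the rough source of the speed and memory savings the abstract describes.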
Cite
Text
Yu et al. "SepONet: Efficient Large-Scale Physics-Informed Operator Learning." NeurIPS 2024 Workshops: D3S3, 2024.
Markdown
[Yu et al. "SepONet: Efficient Large-Scale Physics-Informed Operator Learning." NeurIPS 2024 Workshops: D3S3, 2024.](https://mlanthology.org/neuripsw/2024/yu2024neuripsw-seponet/)
BibTeX
@inproceedings{yu2024neuripsw-seponet,
  title     = {{SepONet: Efficient Large-Scale Physics-Informed Operator Learning}},
  author    = {Yu, Xinling and Hooten, Sean and Liu, Ziyue and Zhao, Yequan and Fiorentino, Marco and Van Vaerenbergh, Thomas and Zhang, Zheng},
  booktitle = {NeurIPS 2024 Workshops: D3S3},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/yu2024neuripsw-seponet/}
}