The Conjugate Kernel for Efficient Training of Physics-Informed Deep Operator Networks
Abstract
Recent work has shown that the empirical Neural Tangent Kernel (NTK) can significantly improve the training of physics-informed Deep Operator Networks (DeepONets). The NTK, however, is costly to compute, greatly increasing the cost of training such systems. In this paper, we study the performance of the empirical Conjugate Kernel (CK) for physics-informed DeepONets, an efficient approximation to the NTK that has been observed to yield similar results. We show that for physics-informed DeepONets the CK performs comparably to the NTK while significantly reducing the time complexity of training.
Cite
Text
Howard et al. "The Conjugate Kernel for Efficient Training of Physics-Informed Deep Operator Networks." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.
BibTeX
@inproceedings{howard2024iclrw-conjugate,
title = {{The Conjugate Kernel for Efficient Training of Physics-Informed Deep Operator Networks}},
author = {Howard, Amanda A and Qadeer, Saad and Engel, Andrew William and Tsou, Adam and Vargas, Max and Chiang, Tony and Stinis, Panos},
booktitle = {ICLR 2024 Workshops: AI4DiffEqtnsInSci},
year = {2024},
url = {https://mlanthology.org/iclrw/2024/howard2024iclrw-conjugate/}
}