On the Impact of Larger Batch Size in the Training of Physics Informed Neural Networks

Abstract

Physics Informed Neural Networks (PINNs) have demonstrated remarkable success in learning complex physical processes such as shocks and turbulence, but their applicability has been limited by long training times. In this work, we explore the potential of large batch size training to reduce training time and improve final accuracy in PINNs. We show that conclusions about the generalization gap associated with large batch size training on image classification tasks may not carry over to PINNs. We conclude that larger batch sizes are consistently beneficial when training PINNs.
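In PINN training, the "batch size" is typically the number of collocation points sampled per step to estimate the physics (PDE residual) loss. One standard reason larger batches can help, consistent with the abstract's claim, is that the Monte Carlo estimate of the residual loss becomes less noisy as more collocation points are used. The sketch below illustrates this with a toy 1D Poisson problem and an imperfect trial solution whose residual is known in closed form; it is an illustrative example, not code from the paper, and the trial solution and problem setup are assumptions chosen for clarity.

```python
import math
import random

def residual(x):
    # Analytic PDE residual of the (deliberately imperfect) trial solution
    # u(x) = sin(pi x) + 0.1 x for the 1D problem u''(x) + pi^2 u(x) = 0.
    # Since u''(x) = -pi^2 sin(pi x), the residual is r(x) = 0.1 * pi^2 * x.
    u = math.sin(math.pi * x) + 0.1 * x
    u_xx = -math.pi ** 2 * math.sin(math.pi * x)
    return u_xx + math.pi ** 2 * u

def batch_loss(batch_size, rng):
    # Monte Carlo estimate of the physics loss: mean squared residual
    # over a random batch of collocation points in (0, 1).
    pts = [rng.random() for _ in range(batch_size)]
    return sum(residual(x) ** 2 for x in pts) / batch_size

def loss_estimate_variance(batch_size, trials=2000, seed=0):
    # Variance of the batch loss estimate across many independent batches.
    rng = random.Random(seed)
    estimates = [batch_loss(batch_size, rng) for _ in range(trials)]
    mean = sum(estimates) / trials
    return sum((e - mean) ** 2 for e in estimates) / trials

# Larger collocation batches give a much lower-variance loss estimate,
# and hence lower-noise gradients for the optimizer.
print(loss_estimate_variance(8))
print(loss_estimate_variance(512))
```

Because the estimates are averages of i.i.d. samples, their variance shrinks roughly as 1/batch_size; the interesting empirical question the paper studies is whether this reduced noise helps or hurts final accuracy in PINNs, in contrast to the generalization-gap findings from image classification.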

Cite

Text

Sankaran et al. "On the Impact of Larger Batch Size in the Training of Physics Informed Neural Networks." NeurIPS 2022 Workshops: DLDE, 2022.

Markdown

[Sankaran et al. "On the Impact of Larger Batch Size in the Training of Physics Informed Neural Networks." NeurIPS 2022 Workshops: DLDE, 2022.](https://mlanthology.org/neuripsw/2022/sankaran2022neuripsw-impact/)

BibTeX

@inproceedings{sankaran2022neuripsw-impact,
  title     = {{On the Impact of Larger Batch Size in the Training of Physics Informed Neural Networks}},
  author    = {Sankaran, Shyam and Wang, Hanwen and Guilhoto, Leonardo Ferreira and Perdikaris, Paris},
  booktitle = {NeurIPS 2022 Workshops: DLDE},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/sankaran2022neuripsw-impact/}
}