Good Lattice Accelerates Physics-Informed Neural Networks

Abstract

Physics-informed neural networks (PINNs) can solve partial differential equations (PDEs) by minimizing the physics-informed loss, which ensures the neural network satisfies the PDE at given points. However, the solutions to a PDE are infinite-dimensional, and the physics-informed loss is a finite approximation to a certain integral over the domain. This indicates that selecting appropriate collocation points is essential. This paper proposes "good lattice training" (GLT), a technique inspired by number-theoretic methods. GLT provides an optimal set of collocation points and can train PINNs to competitive performance at a smaller computational cost.
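The abstract does not spell out the construction, but the "good lattice" point sets from number theory that GLT draws on are typically rank-1 lattices: the i-th point is the fractional part of i·z/n for a well-chosen generating vector z. A minimal sketch of such a point set (the vector (1, 89) with n = 144 is the classic Fibonacci lattice in 2D; the paper's actual generating vectors are assumptions here):

```python
import numpy as np

def rank1_lattice(n, z):
    """Rank-1 lattice point set: x_i = frac(i * z / n), i = 0..n-1.

    n : number of points
    z : generating vector, one integer per dimension
    """
    i = np.arange(n)[:, None]          # shape (n, 1)
    z = np.asarray(z)[None, :]         # shape (1, d)
    return (i * z / n) % 1.0           # points in [0, 1)^d

# 2D Fibonacci lattice: n = F_12 = 144, z = (1, F_11) = (1, 89)
pts = rank1_lattice(144, (1, 89))
print(pts.shape)  # (144, 2)
```

Such points could then be rescaled to the PDE domain and used as the collocation points at which the physics-informed residual loss is evaluated, in place of uniform random or grid sampling.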

Cite

Text

Matsubara and Yaguchi. "Good Lattice Accelerates Physics-Informed Neural Networks." ICML 2023 Workshops: SynS_and_ML, 2023.

Markdown

[Matsubara and Yaguchi. "Good Lattice Accelerates Physics-Informed Neural Networks." ICML 2023 Workshops: SynS_and_ML, 2023.](https://mlanthology.org/icmlw/2023/matsubara2023icmlw-good/)

BibTeX

@inproceedings{matsubara2023icmlw-good,
  title     = {{Good Lattice Accelerates Physics-Informed Neural Networks}},
  author    = {Matsubara, Takashi and Yaguchi, Takaharu},
  booktitle = {ICML 2023 Workshops: SynS_and_ML},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/matsubara2023icmlw-good/}
}