Approximation Error of Sobolev Regular Functions with Tanh Neural Networks: Theoretical Impact on PINNs

Abstract

Given the key role played by derivatives in Partial Differential Equations (PDEs), using the tanh activation function in Physics-Informed Neural Networks (PINNs) yields smoothness properties that are useful for deriving theoretical guarantees in Sobolev norm. In this paper, we conduct an extensive functional analysis, unveiling tighter approximation bounds than prior works, especially for higher-order PDEs. These tighter guarantees translate into smaller PINN architectures and an improved generalization error with arbitrarily small Sobolev norms of the PDE residuals.

Cite

Text

Girault et al. "Approximation Error of Sobolev Regular Functions with Tanh Neural Networks: Theoretical Impact on PINNs." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024. doi:10.1007/978-3-031-70359-1_16

Markdown

[Girault et al. "Approximation Error of Sobolev Regular Functions with Tanh Neural Networks: Theoretical Impact on PINNs." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024.](https://mlanthology.org/ecmlpkdd/2024/girault2024ecmlpkdd-approximation/) doi:10.1007/978-3-031-70359-1_16

BibTeX

@inproceedings{girault2024ecmlpkdd-approximation,
  title     = {{Approximation Error of Sobolev Regular Functions with Tanh Neural Networks: Theoretical Impact on PINNs}},
  author    = {Girault, Benjamin and Emonet, Rémi and Habrard, Amaury and Patracone, Jordan and Sebban, Marc},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2024},
  pages     = {266--282},
  doi       = {10.1007/978-3-031-70359-1_16},
  url       = {https://mlanthology.org/ecmlpkdd/2024/girault2024ecmlpkdd-approximation/}
}