Deep Operator Learning Lessens the Curse of Dimensionality for PDEs

Abstract

Deep neural networks (DNNs) have achieved remarkable success in numerous domains, and their application to PDE-related problems has been rapidly advancing. This paper provides an estimate of the generalization error of learning Lipschitz operators over Banach spaces using DNNs, with applications to various PDE solution operators. The goal is to specify the DNN width, depth, and number of training samples needed to guarantee a certain testing error. Under mild assumptions on data distributions or operator structures, our analysis shows that deep operator learning can have a relaxed dependence on the discretization resolution of PDEs and, hence, lessen the curse of dimensionality in many PDE-related problems, including elliptic equations, parabolic equations, and Burgers equations. Our results also give insights into discretization invariance in operator learning.
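
The abstract concerns error bounds rather than a particular network design, but for readers new to deep operator learning, the sketch below shows one common instantiation: a DeepONet-style network (Lu et al., 2021) that maps a discretized input function (sampled at sensor points) together with a query location to the value of the output function. The class name, layer widths, and sensor count are illustrative assumptions and are not taken from the paper.

```python
# A minimal DeepONet-style sketch of deep operator learning: an operator G
# mapping an input function u to an output function G(u) is approximated by a
# branch net (acting on u sampled at m sensor points) and a trunk net (acting
# on a query location y). All sizes below are illustrative placeholders.
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, num_sensors: int, query_dim: int, width: int = 64, p: int = 32):
        super().__init__()
        # Branch net: encodes the discretized input function u(x_1), ..., u(x_m).
        self.branch = nn.Sequential(
            nn.Linear(num_sensors, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        # Trunk net: encodes the query location y where G(u)(y) is evaluated.
        self.trunk = nn.Sequential(
            nn.Linear(query_dim, width), nn.Tanh(),
            nn.Linear(width, p),
        )

    def forward(self, u_sensors: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # Inner product of branch and trunk features approximates G(u)(y).
        return (self.branch(u_sensors) * self.trunk(y)).sum(dim=-1, keepdim=True)

# Usage on synthetic data: 16 input functions sampled at 100 sensors,
# each queried at a 1D location.
model = DeepONet(num_sensors=100, query_dim=1)
u = torch.randn(16, 100)   # discretized input functions
y = torch.rand(16, 1)      # query locations
out = model(u, y)          # approximate G(u)(y), shape (16, 1)
```

The number of sensors plays the role of the discretization resolution discussed in the abstract; the paper's bounds characterize how the required width, depth, and sample size scale with such quantities.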

Cite

Text

Chen et al. "Deep Operator Learning Lessens the Curse of Dimensionality for PDEs." Transactions on Machine Learning Research, 2023.

Markdown

[Chen et al. "Deep Operator Learning Lessens the Curse of Dimensionality for PDEs." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/chen2023tmlr-deep/)

BibTeX

@article{chen2023tmlr-deep,
  title     = {{Deep Operator Learning Lessens the Curse of Dimensionality for PDEs}},
  author    = {Chen, Ke and Wang, Chunmei and Yang, Haizhao},
  journal   = {Transactions on Machine Learning Research},
  year      = {2023},
  url       = {https://mlanthology.org/tmlr/2023/chen2023tmlr-deep/}
}