Operator Learning with PCA-Net: Upper and Lower Complexity Bounds

Abstract

PCA-Net is a recently proposed neural operator architecture that combines principal component analysis (PCA) with neural networks to approximate operators between infinite-dimensional function spaces. The present work develops approximation theory for this approach, improving and significantly extending previous work in this direction. First, a novel universal approximation result is derived under minimal assumptions on the underlying operator and the data-generating distribution. Then, two potential obstacles to efficient operator learning with PCA-Net are identified and made precise through lower complexity bounds. The first relates to the complexity of the output distribution, measured by a slow decay of the PCA eigenvalues; the second relates to the inherent complexity of the space of operators between infinite-dimensional input and output spaces, resulting in a rigorous and quantifiable statement of a “curse of parametric complexity”, an infinite-dimensional analogue of the well-known curse of dimensionality encountered in high-dimensional approximation problems. In addition to these lower bounds, upper complexity bounds are derived: a suitable smoothness criterion is shown to ensure an algebraic decay of the PCA eigenvalues, and PCA-Net is shown to overcome the general curse for specific operators of interest arising from the Darcy flow and the Navier-Stokes equations.
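The architecture described in the abstract (PCA encoding of the input function, a finite-dimensional neural network between coefficient spaces, and PCA decoding of the output) can be sketched in a few lines of NumPy. This is a minimal illustration only, not the paper's implementation: the operator, grid, component counts, and the untrained random-weight MLP standing in for the learned network are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "functions": each row is one function sampled on a 1-D grid.
n_samples, n_grid = 200, 64
x = np.linspace(0.0, 1.0, n_grid)
# Input functions: random superpositions of low-frequency sine modes.
coeffs = rng.normal(size=(n_samples, 5))
inputs = sum(coeffs[:, [k]] * np.sin((k + 1) * np.pi * x) for k in range(5))
# An illustrative nonlinear operator acting pointwise (stand-in only).
outputs = np.tanh(inputs)

def pca_basis(data, d):
    """Mean and leading d principal components (rows) of the data."""
    mean = data.mean(axis=0)
    _, _, vt = np.linalg.svd(data - mean, full_matrices=False)
    return mean, vt[:d]

d_in, d_out = 8, 8
mu_in, V_in = pca_basis(inputs, d_in)      # encoder basis (input space)
mu_out, V_out = pca_basis(outputs, d_out)  # decoder basis (output space)

# Encode: project each input function onto d_in PCA coefficients.
z = (inputs - mu_in) @ V_in.T              # shape (n_samples, d_in)

# Finite-dimensional map between coefficient spaces. Here an untrained
# two-layer ReLU MLP with random weights, purely to show the structure;
# in PCA-Net this network is trained on (input, output) samples.
W1 = rng.normal(size=(d_in, 32)) / np.sqrt(d_in)
W2 = rng.normal(size=(32, d_out)) / np.sqrt(32)
z_out = np.maximum(z @ W1, 0.0) @ W2       # MLP forward pass

# Decode: map predicted coefficients back to functions on the grid.
reconstruction = mu_out + z_out @ V_out
print(reconstruction.shape)
```

The eigenvalue decay discussed in the lower bounds corresponds to the singular values computed inside `pca_basis`: the faster they decay, the fewer components `d_in`, `d_out` are needed for an accurate encoding and decoding.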

Cite

Text

Samuel Lanthaler. "Operator Learning with PCA-Net: Upper and Lower Complexity Bounds." Journal of Machine Learning Research, 24:1–67, 2023.

Markdown

[Samuel Lanthaler. "Operator Learning with PCA-Net: Upper and Lower Complexity Bounds." Journal of Machine Learning Research, 24:1–67, 2023.](https://mlanthology.org/jmlr/2023/lanthaler2023jmlr-operator/)

BibTeX

@article{lanthaler2023jmlr-operator,
  title     = {{Operator Learning with PCA-Net: Upper and Lower Complexity Bounds}},
  author    = {Lanthaler, Samuel},
  journal   = {Journal of Machine Learning Research},
  year      = {2023},
  volume    = {24},
  pages     = {1--67},
  url       = {https://mlanthology.org/jmlr/2023/lanthaler2023jmlr-operator/}
}