Embedding Surfaces by Optimizing Neural Networks with Prescribed Riemannian Metric and Beyond
Abstract
From a machine learning perspective, the problem of solving partial differential equations (PDEs) can be formulated as a least-squares minimization problem in which neural networks parametrize the PDE solutions. Ideally, a global minimizer of the least-squares loss corresponds to a solution of the PDE. In this paper we start with a special type of nonlinear PDE arising from differential geometry, the isometric embedding equation, which relates to many long-standing open questions in geometry and analysis. We show that, under an over-parametrization assumption, gradient descent on the least-squares loss with two-layer neural networks identifies a global minimizer. As a consequence, this solves the surface embedding problem locally with a prescribed Riemannian metric. We also extend the convergence analysis of gradient descent to higher-order linear PDEs under the same over-parametrization assumption.
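As a rough illustration of the formulation the abstract describes (and not the paper's own construction), the sketch below trains a two-layer tanh network on a 1-D Poisson equation u'' = f with zero Dirichlet boundary conditions, used here as a stand-in for the embedding system. For simplicity only the outer weights are trained over fixed random features, so the least-squares loss is quadratic and gradient descent provably reaches a global minimizer; all names and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 41                      # hidden width (over-parametrized) and collocation points
xs = np.linspace(0.0, 1.0, n)
f = -np.pi**2 * np.sin(np.pi * xs)  # u(x) = sin(pi*x) solves u'' = f with u(0) = u(1) = 0

# Random first layer, kept fixed; only the outer weights a are trained.
w = 3.0 * rng.normal(size=m)
b = 3.0 * rng.normal(size=m)
a = np.zeros(m)
scale = 1.0 / np.sqrt(m)            # standard two-layer scaling u(x) = (1/sqrt(m)) sum_k a_k tanh(w_k x + b_k)

T = np.tanh(np.outer(xs, w) + b)                    # (n, m) hidden activations
Phi2 = (-2.0 * T * (1.0 - T**2)) * w**2             # d^2/dx^2 of tanh(w*x + b), feature-wise
B = np.tanh(np.outer(np.array([0.0, 1.0]), w) + b)  # features at the two boundary points
lam = 10.0                          # weight on the Dirichlet boundary penalty

def loss(a):
    """Mean squared PDE residual plus boundary penalty."""
    res = scale * Phi2 @ a - f
    bres = scale * B @ a
    return res @ res / n + lam * bres @ bres

# The loss is quadratic in a; use step size 1/L from its exact Hessian.
H = (2.0 * scale**2 / n) * Phi2.T @ Phi2 + 2.0 * lam * scale**2 * B.T @ B
lr = 1.0 / np.linalg.eigvalsh(H)[-1]

for _ in range(5000):
    res = scale * Phi2 @ a - f
    bres = scale * B @ a
    grad = (2.0 * scale / n) * Phi2.T @ res + 2.0 * lam * scale * B.T @ bres
    a -= lr * grad

u = scale * T @ a                   # trained approximate solution on the grid xs
```

Training only the outer layer mirrors the lazy/NTK-style regime used in over-parametrization analyses: the optimization landscape seen by gradient descent is (nearly) convex, which is what makes convergence to a global minimizer tractable.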
Cite
Text
Feng et al. "Embedding Surfaces by Optimizing Neural Networks with Prescribed Riemannian Metric and Beyond." ICML 2023 Workshops: Frontiers4LCD, 2023.

Markdown

[Feng et al. "Embedding Surfaces by Optimizing Neural Networks with Prescribed Riemannian Metric and Beyond." ICML 2023 Workshops: Frontiers4LCD, 2023.](https://mlanthology.org/icmlw/2023/feng2023icmlw-embedding/)

BibTeX
@inproceedings{feng2023icmlw-embedding,
  title     = {{Embedding Surfaces by Optimizing Neural Networks with Prescribed Riemannian Metric and Beyond}},
  author    = {Feng, Yi and Li, Sizhe and Panageas, Ioannis and Wang, Xiao},
  booktitle = {ICML 2023 Workshops: Frontiers4LCD},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/feng2023icmlw-embedding/}
}