Learning Globally Smooth Functions on Manifolds

Abstract

Smoothness and low-dimensional structures play central roles in improving generalization and stability in learning and statistics. This work combines techniques from semi-infinite constrained learning and manifold regularization to learn representations that are globally smooth on a manifold. To do so, it shows that under typical conditions the problem of learning a Lipschitz-continuous function on a manifold is equivalent to a dynamically weighted manifold regularization problem. This observation leads to a practical algorithm based on a weighted Laplacian penalty whose weights are adapted using stochastic gradient techniques. It is shown that, under mild conditions, this method estimates the Lipschitz constant of the solution, learning a globally smooth solution as a byproduct. Experiments on real-world data illustrate the advantages of the proposed method relative to existing alternatives. Our code is available at https://github.com/JuanCervino/smoothbench.
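
To make the idea concrete, below is a minimal PyTorch sketch of one possible training step that uses a weighted Laplacian penalty whose per-pair weights are updated by gradient ascent on a smoothness-constraint violation. This is an illustration of the general technique described in the abstract, not the authors' reference implementation; the function and parameter names (gaussian_graph, eps, dual_lr, sigma) and the Gaussian affinity choice are assumptions made for this example. See the repository linked above for the actual code.

import torch

def gaussian_graph(x, sigma=1.0):
    # Dense Gaussian affinity graph over a mini-batch of points
    # (a common stand-in for the manifold's neighborhood structure).
    d2 = torch.cdist(x, x).pow(2)                 # pairwise squared distances
    w = torch.exp(-d2 / (2 * sigma ** 2))
    return w - torch.diag(torch.diag(w))          # zero out self-loops

def weighted_laplacian_penalty(fx, w, lam):
    # sum_ij lam_ij * w_ij * (f(x_i) - f(x_j))^2: a per-pair-weighted
    # Dirichlet (graph Laplacian) energy of the model outputs.
    diff2 = torch.cdist(fx, fx).pow(2)
    return (lam * w * diff2).sum() / w.numel()

def train_step(model, opt, x, y, lam, eps=0.1, dual_lr=1e-2, sigma=1.0):
    w = gaussian_graph(x, sigma)

    # Primal step: fit the data while penalizing local variation,
    # with each pair weighted by its current weight lam_ij.
    fx = model(x)
    loss = torch.nn.functional.mse_loss(fx, y) + weighted_laplacian_penalty(fx, w, lam)
    opt.zero_grad()
    loss.backward()
    opt.step()

    # Weight (dual) step: increase the weight of pairs whose local variation
    # exceeds the smoothness target eps, keeping all weights nonnegative.
    with torch.no_grad():
        fx = model(x)
        violation = torch.cdist(fx, fx).pow(2) - eps
        lam.add_(dual_lr * w * violation).clamp_(min=0.0)
    return loss.item()

# Usage sketch with random data, just to show the shapes involved:
# lam holds one nonnegative weight per pair of points in the batch.
batch = 32
model = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
lam = torch.ones(batch, batch)
x, y = torch.randn(batch, 2), torch.randn(batch, 1)
train_step(model, opt, x, y, lam)

In this sketch, the alternating updates are the sense in which the penalty weights are "dynamically adapted": pairs of nearby points whose outputs vary more than the local smoothness target accumulate larger weights, steering subsequent gradient steps toward a globally smooth solution.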

Cite

Text

Cervino et al. "Learning Globally Smooth Functions on Manifolds." International Conference on Machine Learning, 2023.

Markdown

[Cervino et al. "Learning Globally Smooth Functions on Manifolds." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/cervino2023icml-learning/)

BibTeX

@inproceedings{cervino2023icml-learning,
  title     = {{Learning Globally Smooth Functions on Manifolds}},
  author    = {Cervino, Juan and Chamon, Luiz F. O. and Haeffele, Benjamin David and Vidal, Rene and Ribeiro, Alejandro},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {3815--3854},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/cervino2023icml-learning/}
}