Semi-Supervised Learning Based on Joint Diffusion of Graph Functions and Laplacians

Abstract

We observe the distances between estimated function outputs on data points to create an anisotropic graph Laplacian which, through an iterative process, can itself be regularized. Our algorithm is instantiated as a discrete regularizer on the graph's diffusivity operator. This idea is grounded in the theory that regularizing the diffusivity operator corresponds to regularizing the metric on Riemannian manifolds, which in turn corresponds to regularizing the anisotropic Laplace-Beltrami operator. We show that our discrete regularization framework is consistent in the sense that it converges to (continuous) regularization on the underlying data-generating manifolds. In semi-supervised learning experiments across ten standard datasets, our diffusion-of-Laplacians approach achieves the lowest average error rate among eight established and state-of-the-art approaches, demonstrating the promise of our method.
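To make the idea above concrete, here is a minimal, hypothetical sketch of the alternating scheme the abstract describes: diffuse labels over a graph, then re-weight the graph's edges using distances between the estimated function outputs, so the Laplacian adapts (becomes anisotropic) to the current estimate. This is an illustrative reading under assumed details (Gaussian weights, harmonic label propagation, dense matrices), not the paper's exact algorithm; all names and parameters (`sigma_x`, `sigma_f`, `n_iters`) are this sketch's own.

```python
import numpy as np

def gaussian_weights(dists_sq, sigma):
    """Gaussian affinity from squared distances."""
    return np.exp(-dists_sq / (2.0 * sigma ** 2))

def ssl_joint_diffusion(X, y, labeled_mask, sigma_x=1.0, sigma_f=1.0,
                        n_iters=3, reg=1e-6):
    """Hypothetical sketch: alternate (1) label diffusion with the current
    graph Laplacian and (2) edge re-weighting based on distances between
    estimated outputs, mimicking an anisotropic Laplacian."""
    # Pairwise squared distances in input space (fixed across iterations).
    d2_x = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = gaussian_weights(d2_x, sigma_x)
    np.fill_diagonal(W, 0.0)
    u = ~labeled_mask
    f = np.where(labeled_mask, y.astype(float), 0.0)
    for _ in range(n_iters):
        # Graph Laplacian L = D - W.
        L = np.diag(W.sum(axis=1)) - W
        # Harmonic solution on unlabeled nodes: L_uu f_u = -L_ul f_l.
        f_u = np.linalg.solve(L[np.ix_(u, u)] + reg * np.eye(u.sum()),
                              -L[np.ix_(u, labeled_mask)] @ y[labeled_mask])
        f = np.where(labeled_mask, y.astype(float), 0.0)
        f[u] = f_u
        # Re-weight edges using distances between estimated outputs,
        # so the graph (and its Laplacian) adapts to the estimate.
        d2_f = (f[:, None] - f[None, :]) ** 2
        W = gaussian_weights(d2_x, sigma_x) * gaussian_weights(d2_f, sigma_f)
        np.fill_diagonal(W, 0.0)
    return f
```

On two well-separated point clouds with one labeled point each, the diffused estimate takes the sign of the nearest cluster's label, and the output-based re-weighting sharpens edge weights across the decision boundary.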

Cite

Text

Kwang In Kim. "Semi-Supervised Learning Based on Joint Diffusion of Graph Functions and Laplacians." European Conference on Computer Vision, 2016. doi:10.1007/978-3-319-46454-1_43

Markdown

[Kwang In Kim. "Semi-Supervised Learning Based on Joint Diffusion of Graph Functions and Laplacians." European Conference on Computer Vision, 2016.](https://mlanthology.org/eccv/2016/kim2016eccv-semi/) doi:10.1007/978-3-319-46454-1_43

BibTeX

@inproceedings{kim2016eccv-semi,
  title     = {{Semi-Supervised Learning Based on Joint Diffusion of Graph Functions and Laplacians}},
  author    = {Kim, Kwang In},
  booktitle = {European Conference on Computer Vision},
  year      = {2016},
  pages     = {713--729},
  doi       = {10.1007/978-3-319-46454-1_43},
  url       = {https://mlanthology.org/eccv/2016/kim2016eccv-semi/}
}