Semi-Supervised Eigenvectors for Locally-Biased Learning

Abstract

In many applications, one has information, e.g., labels that are provided in a semi-supervised manner, about a specific target region of a large data set, and one wants to perform machine learning and data analysis tasks near that pre-specified target region. Locally-biased problems of this sort are particularly challenging for popular eigenvector-based machine learning and data analysis tools. At root, the reason is that eigenvectors are inherently global quantities. In this paper, we address this issue by providing a methodology to construct semi-supervised eigenvectors of a graph Laplacian, and we illustrate how these locally-biased eigenvectors can be used to perform locally-biased machine learning. These semi-supervised eigenvectors capture successively-orthogonalized directions of maximum variance, conditioned on being well-correlated with an input seed set of nodes that is assumed to be provided in a semi-supervised manner. We also provide several empirical examples demonstrating how these semi-supervised eigenvectors can be used to perform locally-biased learning.
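As a rough illustration of the construction summarized above, the sketch below (plain Python with NumPy) computes a few locally-biased vectors of a graph Laplacian: each vector is biased toward a seed set and is orthogonalized, in the degree-weighted inner product, against the vectors found before it. The adjacency matrix A, the seed indices, the number of vectors k, and the fixed shift gamma are all illustrative assumptions; the paper instead tunes the shift for each vector so that a prescribed correlation with the seed set is met exactly, so this is a sketch of the general idea rather than the authors' exact procedure.

import numpy as np

def semi_supervised_eigenvectors(A, seed_nodes, k=2, gamma=-0.1):
    # A          : (n, n) symmetric adjacency matrix of the data graph.
    # seed_nodes : indices of the seed set, provided in a semi-supervised manner.
    # k          : number of locally-biased vectors to return.
    # gamma      : assumed fixed shift; the paper instead tunes it per vector so
    #              that a correlation constraint with the seed set holds exactly.
    n = A.shape[0]
    d = A.sum(axis=1)                      # node degrees
    D = np.diag(d)
    L = D - A                              # combinatorial graph Laplacian

    # Seed vector: indicator of the seed set, made D-orthogonal to the
    # all-ones (trivial) eigenvector and D-normalized.
    s = np.zeros(n)
    s[seed_nodes] = 1.0
    s -= (d @ s) / d.sum()
    s /= np.sqrt(s @ (d * s))

    vectors = []
    P = np.eye(n)                          # projector onto the remaining subspace
    for _ in range(k):
        # Solve (L - gamma * D) x = D s restricted to the D-orthogonal
        # complement of the vectors already found (consistent least-squares solve).
        M = P.T @ (L - gamma * D) @ P
        x = P @ np.linalg.lstsq(M, P.T @ (D @ s), rcond=None)[0]
        x /= np.sqrt(x @ (D @ x))          # normalize so that x^T D x = 1
        vectors.append(x)
        P -= np.outer(x, D @ x)            # deflate: remove the new direction
    return np.column_stack(vectors)

For a negative shift gamma the matrix L - gamma * D is positive definite (on a graph with no isolated nodes), so each projected solve is well posed; successive vectors are kept D-orthonormal by the deflation step, mirroring the "successively-orthogonalized directions" described in the abstract.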

Cite

Text

Hansen and Mahoney. "Semi-Supervised Eigenvectors for Locally-Biased Learning." Neural Information Processing Systems, 2012.

Markdown

[Hansen and Mahoney. "Semi-Supervised Eigenvectors for Locally-Biased Learning." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/hansen2012neurips-semisupervised/)

BibTeX

@inproceedings{hansen2012neurips-semisupervised,
  title     = {{Semi-Supervised Eigenvectors for Locally-Biased Learning}},
  author    = {Hansen, Toke and Mahoney, Michael W.},
  booktitle = {Neural Information Processing Systems},
  year      = {2012},
  pages     = {2528--2536},
  url       = {https://mlanthology.org/neurips/2012/hansen2012neurips-semisupervised/}
}