Prediction on a Graph with a Perceptron

Abstract

We study the problem of online prediction of a noisy labeling of a graph with the perceptron. We address both label noise and concept noise. Graph learning is framed as an instance of prediction on a finite set. To treat label noise, we show that the hinge loss bounds derived by Gentile [1] for online perceptron learning can be transformed into relative mistake bounds with an optimal leading constant when applied to prediction on a finite set. These bounds depend crucially on the norm of the learned concept. The norm of a concept can often vary dramatically under only small perturbations of a labeling. We analyze a simple transformation that stabilizes the norm under perturbations. We derive an upper bound that depends only on natural properties of the graph: the graph diameter and the cut size of a partitioning of the graph, which are only indirectly dependent on the size of the graph. We also demonstrate that such bounds are impossible for the graph geodesic nearest neighbors algorithm.
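To make the setting concrete, below is a minimal sketch (not the authors' implementation) of an online kernel perceptron predicting vertex labels of a graph, assuming the pseudoinverse of the graph Laplacian as the kernel, a common choice in this line of work; the toy graph, labels, and prediction order are illustrative only.

```python
import numpy as np


def laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    return np.diag(adj.sum(axis=1)) - adj


def graph_kernel(adj):
    """Kernel matrix K = L^+ (Moore-Penrose pseudoinverse of the Laplacian)."""
    return np.linalg.pinv(laplacian(adj))


def kernel_perceptron(K, labels, order):
    """Predict the labels of vertices in `order` online; update only on mistakes."""
    alpha = np.zeros(len(labels))      # dual coefficients, one per vertex
    mistakes = 0
    for t in order:
        margin = alpha @ K[:, t]       # perceptron score for vertex t
        y_hat = 1 if margin >= 0 else -1
        if y_hat != labels[t]:         # mistake-driven update
            alpha[t] += labels[t]
            mistakes += 1
    return mistakes


if __name__ == "__main__":
    # Toy graph: two triangles joined by a single edge (cut size 1).
    adj = np.zeros((6, 6))
    for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
        adj[i, j] = adj[j, i] = 1.0
    labels = np.array([1, 1, 1, -1, -1, -1])   # labeling aligned with the small cut
    K = graph_kernel(adj)
    print("mistakes:", kernel_perceptron(K, labels, order=range(6)))
```

The point of the sketch is the interplay the abstract describes: a labeling that cuts few edges has a small Laplacian norm, so the perceptron's mistake bound depends on the cut size and diameter rather than directly on the number of vertices.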

Cite

Text

Herbster and Pontil. "Prediction on a Graph with a Perceptron." Neural Information Processing Systems, 2006.

Markdown

[Herbster and Pontil. "Prediction on a Graph with a Perceptron." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/herbster2006neurips-prediction/)

BibTeX

@inproceedings{herbster2006neurips-prediction,
  title     = {{Prediction on a Graph with a Perceptron}},
  author    = {Herbster, Mark and Pontil, Massimiliano},
  booktitle = {Neural Information Processing Systems},
  year      = {2006},
  pages     = {577--584},
  url       = {https://mlanthology.org/neurips/2006/herbster2006neurips-prediction/}
}