Probabilistic Watershed: Sampling All Spanning Forests for Seeded Segmentation and Semi-Supervised Learning
Abstract
The seeded Watershed algorithm / minimax semi-supervised learning on a graph computes a minimum spanning forest which connects every pixel / unlabeled node to a seed / labeled node. We propose instead to consider all possible spanning forests and calculate, for every node, the probability of sampling a forest connecting a certain seed with that node. We dub this approach "Probabilistic Watershed". Leo Grady (2006) already noted its equivalence to the Random Walker / Harmonic energy minimization. We here give a simpler proof of this equivalence and establish the computational feasibility of the Probabilistic Watershed with Kirchhoff's matrix tree theorem. Furthermore, we show a new connection between the Random Walker probabilities and the triangle inequality of the effective resistance. Finally, we derive a new and intuitive interpretation of the Power Watershed.
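The abstract notes that computational feasibility rests on Kirchhoff's matrix tree theorem, which states that the number of spanning trees of a connected graph equals any cofactor of its graph Laplacian. The following is a minimal sketch of the theorem itself on the triangle graph (not the paper's full forest-sampling algorithm):

```python
import numpy as np

# Kirchhoff's matrix tree theorem: the number of spanning trees of a
# connected graph equals any cofactor of its Laplacian L = D - A.
# Illustration on the triangle graph K3, which has 3 spanning trees.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)   # adjacency matrix of K3
L = np.diag(A.sum(axis=1)) - A           # graph Laplacian
n_trees = round(np.linalg.det(L[1:, 1:]))  # delete row/col 0, take det
print(n_trees)  # -> 3
```

With edge weights, the same determinant counts spanning trees weighted by the product of their edge weights, which is what makes the probabilities over all spanning forests tractable.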
Cite
Text
Sanmartin et al. "Probabilistic Watershed: Sampling All Spanning Forests for Seeded Segmentation and Semi-Supervised Learning." Neural Information Processing Systems, 2019.
Markdown
[Sanmartin et al. "Probabilistic Watershed: Sampling All Spanning Forests for Seeded Segmentation and Semi-Supervised Learning." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/sanmartin2019neurips-probabilistic/)
BibTeX
@inproceedings{sanmartin2019neurips-probabilistic,
title = {{Probabilistic Watershed: Sampling All Spanning Forests for Seeded Segmentation and Semi-Supervised Learning}},
author = {Sanmartin, Enrique Fita and Damrich, Sebastian and Hamprecht, Fred A.},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {2780--2791},
url = {https://mlanthology.org/neurips/2019/sanmartin2019neurips-probabilistic/}
}