Clustering Under Prior Knowledge with Application to Image Segmentation
Abstract
This paper proposes a new approach to model-based clustering under prior knowledge. The proposed formulation can be interpreted from two different angles: as penalized logistic regression, where the class labels are only indirectly observed (via the probability density of each class); or as finite mixture learning under a grouping prior. To estimate the parameters of the proposed model, we derive a (generalized) EM algorithm with a closed-form E-step, in contrast with other recent approaches to semi-supervised probabilistic clustering, which require Gibbs sampling or suboptimal shortcuts. We show that our approach is ideally suited for image segmentation: it avoids the combinatorial nature of Markov random field priors, and opens the door to more sophisticated spatial priors (e.g., wavelet-based) in a simple and computationally efficient way. Finally, we extend our formulation to work in unsupervised, semi-supervised, or discriminative modes.
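For context only, the sketch below illustrates what a "closed-form E-step" means in the simplest setting of a plain Gaussian mixture: the posterior responsibilities are available analytically, whereas MRF-style grouping priors typically destroy this property and force Gibbs sampling or approximate shortcuts. This is not the paper's spatially regularized model; the function names and the use of NumPy/SciPy are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def e_step(X, weights, means, covs):
    """Closed-form E-step for a generic Gaussian mixture:
    responsibilities r[n, k] = P(z_n = k | x_n, current parameters)."""
    N, K = X.shape[0], len(weights)
    log_r = np.zeros((N, K))
    for k in range(K):
        log_r[:, k] = np.log(weights[k]) + multivariate_normal.logpdf(X, means[k], covs[k])
    log_r -= log_r.max(axis=1, keepdims=True)   # stabilize before exponentiating
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)

def m_step(X, r):
    """Standard M-step: re-estimate mixing weights, means, and covariances
    from the responsibilities computed in the E-step."""
    Nk = r.sum(axis=0)                           # effective counts per component
    weights = Nk / X.shape[0]
    means = (r.T @ X) / Nk[:, None]
    covs = []
    for k in range(len(Nk)):
        d = X - means[k]
        covs.append((r[:, k, None] * d).T @ d / Nk[k] + 1e-6 * np.eye(X.shape[1]))
    return weights, means, covs
```

In the paper's setting, the mixing proportions are themselves governed by a prior (logistic/spatial), so the contribution is that the E-step remains closed-form while only the M-step becomes a generalized (improvement-only) update.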
Cite
Text
Cheng et al. "Clustering Under Prior Knowledge with Application to Image Segmentation." Neural Information Processing Systems, 2006.
Markdown
[Cheng et al. "Clustering Under Prior Knowledge with Application to Image Segmentation." Neural Information Processing Systems, 2006.](https://mlanthology.org/neurips/2006/cheng2006neurips-clustering/)
BibTeX
@inproceedings{cheng2006neurips-clustering,
title = {{Clustering Under Prior Knowledge with Application to Image Segmentation}},
author = {Cheng, Dong S. and Murino, Vittorio and Figueiredo, Mário},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {401--408},
url = {https://mlanthology.org/neurips/2006/cheng2006neurips-clustering/}
}