Unsupervised and Supervised Clustering: The Mutual Information Between Parameters and Observations
Abstract
Recent works in parameter estimation and neural coding have demonstrated that optimal performance is related to the mutual information between parameters and data. We consider the mutual information in the case where the conditional p.d.f. of each observation (a vector ξ) depends on the parameter (a vector θ) only through the scalar product θ·ξ. We derive bounds and asymptotic behaviour for the mutual information and compare with results obtained on the same model with the "replica technique".
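As a toy illustration of the quantity studied in the abstract (not the paper's own derivation), the sketch below estimates the mutual information I(θ; ξ) by Monte Carlo for one simple instance of the model class: a Gaussian conditional density whose dependence on θ enters only through the scalar product θ·ξ. The dimension `N`, separation `gamma`, and sample sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy instance: theta uniform on the unit sphere in R^N, and
# xi | theta ~ N(gamma * theta, I). Since |theta| = 1,
#   log p(xi|theta) = const(|xi|) + gamma * theta.xi - gamma^2 / 2,
# so the conditional density depends on theta only through theta.xi.
# gamma, N and the sample sizes below are assumed, illustrative values.

rng = np.random.default_rng(0)
N, gamma = 5, 1.0
n_outer, n_inner = 1000, 1000

def sample_theta(size):
    """Draw parameters uniformly on the unit sphere in R^N."""
    t = rng.normal(size=(size, N))
    return t / np.linalg.norm(t, axis=1, keepdims=True)

# Joint samples (theta, xi)
theta = sample_theta(n_outer)                          # (n_outer, N)
xi = gamma * theta + rng.normal(size=(n_outer, N))     # (n_outer, N)

# log p(xi|theta) up to a theta-independent constant; the constant
# cancels in the ratio p(xi|theta) / p(xi) below.
sq = np.sum(xi ** 2, axis=1)
log_cond = -0.5 * (sq - 2 * gamma * np.sum(xi * theta, axis=1) + gamma ** 2)

# Marginal p(xi) = E_theta[p(xi|theta)], estimated with fresh theta samples
# via a numerically stable log-mean-exp.
inner = sample_theta(n_inner)                          # (n_inner, N)
lp = -0.5 * (sq[:, None] - 2 * gamma * (xi @ inner.T) + gamma ** 2)
m = lp.max(axis=1)
log_marg = m + np.log(np.mean(np.exp(lp - m[:, None]), axis=1))

# I(theta; xi) = E[ log p(xi|theta) - log p(xi) ], in nats
mi_nats = float(np.mean(log_cond - log_marg))
print(f"Monte Carlo estimate of I(theta; xi): {mi_nats:.3f} nats")
```

The same scaffold works for any conditional density of the form p(ξ|θ) = f(θ·ξ, |ξ|): only the two log-density expressions change.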
Cite
Herschkowitz and Nadal. "Unsupervised and Supervised Clustering: The Mutual Information Between Parameters and Observations." Neural Information Processing Systems, 1998.
BibTeX
@inproceedings{herschkowitz1998neurips-unsupervised,
title = {{Unsupervised and Supervised Clustering: The Mutual Information Between Parameters and Observations}},
author = {Herschkowitz, Didier and Nadal, Jean-Pierre},
booktitle = {Neural Information Processing Systems},
year = {1998},
pages = {232-238},
url = {https://mlanthology.org/neurips/1998/herschkowitz1998neurips-unsupervised/}
}