Entropy and Inference, Revisited

Abstract

We study properties of popular near–uniform (Dirichlet) priors for learning undersampled probability distributions on discrete nonmetric spaces and show that they lead to disastrous results. However, an Occam–style phase space argument expands the priors into their infinite mixture and resolves most of the observed problems. This leads to a surprisingly good estimator of entropies of discrete distributions.
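To make the object of study concrete: the simplest Bayesian entropy estimate under a symmetric Dirichlet(β) prior replaces the empirical frequencies with pseudocount-smoothed ones, p̂ᵢ = (nᵢ + β)/(N + Kβ), and reports the entropy of p̂. The sketch below (a hedged illustration, not the paper's mixture construction) shows this baseline estimator, whose strong undersampling bias is exactly what the abstract refers to as "disastrous":

```python
import math

def dirichlet_entropy_estimate(counts, beta=1.0):
    """Entropy (in nats) of the posterior-mean distribution under a
    symmetric Dirichlet(beta) prior over a K-bin discrete space.

    p_i = (n_i + beta) / (N + K * beta)

    Note: this pseudocount estimator is the kind of fixed-beta prior
    the paper critiques; when N << K its value is dominated by the
    prior rather than the data."""
    K = len(counts)
    N = sum(counts)
    total = N + K * beta
    h = 0.0
    for n in counts:
        p = (n + beta) / total
        h -= p * math.log(p)
    return h

# With ample, uniform data the estimate is sensible:
# dirichlet_entropy_estimate([50, 50, 50, 50]) is close to log(4).
# But with 10 observations spread over 1000 bins, the beta = 1
# pseudocounts swamp the data and the estimate sits near log(1000),
# whatever the true entropy was.
```

The paper's remedy is not a better single β but an infinite mixture of such Dirichlet priors, weighted so that the induced prior over the entropy itself is nearly flat.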

Cite

Text

Nemenman et al. "Entropy and Inference, Revisited." Neural Information Processing Systems, 2001.

Markdown

[Nemenman et al. "Entropy and Inference, Revisited." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/nemenman2001neurips-entropy/)

BibTeX

@inproceedings{nemenman2001neurips-entropy,
  title     = {{Entropy and Inference, Revisited}},
  author    = {Nemenman, Ilya and Shafee, F. and Bialek, William},
  booktitle = {Neural Information Processing Systems},
  year      = {2001},
  pages     = {471--478},
  url       = {https://mlanthology.org/neurips/2001/nemenman2001neurips-entropy/}
}