Entropy Estimation via Normalizing Flow

Abstract

Entropy estimation is an important problem in information theory and statistical science. Many popular entropy estimators suffer from fast-growing estimation bias with respect to dimensionality, rendering them unsuitable for high-dimensional problems. In this work we propose a transform-based method for high-dimensional entropy estimation, which consists of two main ingredients. First, by modifying the k-NN based entropy estimator, we propose a new estimator that enjoys small estimation bias for samples close to a uniform distribution. Second, we design a normalizing-flow-based mapping that pushes samples toward a uniform distribution, and we derive the relation between the entropy of the original samples and that of the transformed ones. As a result, the entropy of a given set of samples is estimated by first transforming the samples toward a uniform distribution and then applying the proposed estimator to the transformed samples. Numerical experiments demonstrate the effectiveness of the method on high-dimensional entropy estimation problems.
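For context on the first ingredient, here is a minimal sketch of the classical Kozachenko–Leonenko k-NN entropy estimator, the baseline that the paper's estimator modifies (the paper's modified estimator itself is not reproduced here). The function name `kl_entropy` and the 2-D Gaussian check are illustrative choices, not from the paper. The second ingredient rests on the change-of-variables identity for a bijection `T`: `H(X) = H(T(X)) - E[log|det J_T(X)|]`, so the entropy of the original samples can be recovered from the entropy of the uniformized ones plus the flow's log-Jacobian.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate in nats.

    H_hat = psi(n) - psi(k) + log(V_d) + (d/n) * sum_i log(eps_i),
    where eps_i is the distance from sample i to its k-th nearest
    neighbor and V_d is the volume of the unit ball in R^d.
    """
    n, d = samples.shape
    tree = cKDTree(samples)
    # k+1 because the nearest neighbor of each point is the point itself
    eps = tree.query(samples, k=k + 1)[0][:, -1]
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log volume of unit d-ball
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

# Sanity check on a 2-D standard normal, whose entropy is log(2*pi*e) nats.
rng = np.random.default_rng(0)
x = rng.standard_normal((5000, 2))
h_hat = kl_entropy(x)
h_true = np.log(2 * np.pi * np.e)
```

In low dimension the estimate lands close to the true value; the point of the paper is that this bias grows quickly with `d`, which motivates first mapping the samples toward a uniform distribution, where the modified estimator is accurate.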

Cite

Text

Ao and Li. "Entropy Estimation via Normalizing Flow." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I9.21237

Markdown

[Ao and Li. "Entropy Estimation via Normalizing Flow." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/ao2022aaai-entropy/) doi:10.1609/AAAI.V36I9.21237

BibTeX

@inproceedings{ao2022aaai-entropy,
  title     = {{Entropy Estimation via Normalizing Flow}},
  author    = {Ao, Ziqiao and Li, Jinglai},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {9990-9998},
  doi       = {10.1609/AAAI.V36I9.21237},
  url       = {https://mlanthology.org/aaai/2022/ao2022aaai-entropy/}
}