Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy
Abstract
We provide a method that approximates the Bayes error rate and the Shannon entropy with high probability. The Bayes error rate approximation makes it possible to build a classifier that polynomially approaches the Bayes error rate. The Shannon entropy approximation provides provable performance guarantees for learning trees and Bayesian networks from continuous variables. Our results rely on reasonable regularity conditions on the unknown probability distributions, and apply to bounded as well as unbounded variables.
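To make the two quantities concrete, the following is a minimal Python sketch, not the paper's estimators or bounds: it computes the Bayes error rate by Monte Carlo and a histogram plug-in estimate of differential Shannon entropy for a synthetic two-Gaussian problem where the true densities are known. All distribution parameters, priors, sample sizes, and bin counts here are illustrative assumptions.

# A minimal, self-contained sketch (NOT the paper's method): it computes
# the two quantities the abstract refers to, for a synthetic two-class
# Gaussian problem where the true densities are known. All parameters
# (means, priors, sample sizes, bin counts) are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100_000

# Known class-conditional densities and equal priors (assumed for the demo).
pi0 = pi1 = 0.5
p0 = norm(loc=-1.0, scale=1.0)
p1 = norm(loc=+1.0, scale=1.0)

# Bayes error rate: E_x[min(P(y=0|x), P(y=1|x))] under the mixture,
# estimated by Monte Carlo since the densities are known here.
labels = rng.random(n) < pi1                      # True -> draw from class 1
x = np.where(labels, p1.rvs(n, random_state=rng), p0.rvs(n, random_state=rng))
post1 = pi1 * p1.pdf(x) / (pi0 * p0.pdf(x) + pi1 * p1.pdf(x))
bayes_error = np.mean(np.minimum(post1, 1.0 - post1))

# Shannon (differential) entropy of class 0 via a simple histogram plug-in
# estimator; the paper's contribution is a two-sided exponential
# concentration bound for such approximations under regularity conditions,
# not this particular estimator.
samples = p0.rvs(n, random_state=rng)
counts, edges = np.histogram(samples, bins=100)
width = edges[1] - edges[0]
probs = counts / counts.sum()
probs = probs[probs > 0]
entropy = -np.sum(probs * np.log(probs / width))

# For two unit-variance Gaussians at -1 and +1 with equal priors, the exact
# Bayes error is Phi(-1) and the exact entropy is 0.5 * log(2 * pi * e).
print(f"Monte Carlo Bayes error ~ {bayes_error:.4f} (exact: {norm.cdf(-1.0):.4f})")
print(f"Plug-in diff. entropy   ~ {entropy:.4f} (exact: {float(p0.entropy()):.4f})")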
Cite
Text
Honorio and Jaakkola. "Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy." International Conference on Machine Learning, 2013.
Markdown
[Honorio and Jaakkola. "Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/honorio2013icml-twosided/)
BibTeX
@inproceedings{honorio2013icml-twosided,
title = {{Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy}},
author = {Honorio, Jean and Jaakkola, Tommi},
booktitle = {International Conference on Machine Learning},
year = {2013},
pages = {459-467},
volume = {28},
url = {https://mlanthology.org/icml/2013/honorio2013icml-twosided/}
}