Conditional Probability Tree Estimation Analysis and Algorithms
Abstract
We consider the problem of estimating the conditional probability of a label in time O(log n), where n is the number of possible labels. We analyze a natural reduction of this problem to a set of binary regression problems organized in a tree structure, proving a regret bound that scales with the depth of the tree. Motivated by this analysis, we propose the first online algorithm which provably constructs a logarithmic depth tree on the set of labels to solve this problem. We test the algorithm empirically, showing that it works successfully on a dataset with roughly 10^6 labels.
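To make the reduction concrete, the following is a minimal sketch (not the authors' code) of how such a tree answers a query: each internal node holds a binary regressor estimating the probability of descending to its right subtree, and P(y | x) is the product of these binary probabilities along the root-to-leaf path for y, so a balanced tree needs only O(log n) regressor evaluations per query. The names build_balanced_tree, regressor_factory, and the constant toy regressor below are illustrative placeholders; the paper's algorithm additionally constructs the tree online rather than fixing it in advance as done here.

from dataclasses import dataclass
from typing import Callable, Optional, Sequence

@dataclass
class Node:
    labels: frozenset                                            # labels covered by this subtree
    regressor: Optional[Callable[[Sequence[float]], float]] = None  # estimates P(go right | x)
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def build_balanced_tree(labels, regressor_factory):
    """Split the label list in half recursively, giving depth O(log n)."""
    if len(labels) == 1:
        return Node(labels=frozenset(labels))
    mid = len(labels) // 2
    left = build_balanced_tree(labels[:mid], regressor_factory)
    right = build_balanced_tree(labels[mid:], regressor_factory)
    return Node(labels=left.labels | right.labels,
                regressor=regressor_factory(),
                left=left, right=right)

def conditional_probability(root, x, y):
    """Estimate P(y | x) as the product of binary conditional probabilities
    on the root-to-leaf path for label y: one regressor call per level."""
    node, prob = root, 1.0
    if y not in node.labels:
        return 0.0
    while node.left is not None:
        p_right = node.regressor(x)
        if y in node.right.labels:
            prob, node = prob * p_right, node.right
        else:
            prob, node = prob * (1.0 - p_right), node.left
    return prob

# Toy usage with a constant "regressor" that always predicts 0.5:
tree = build_balanced_tree(list(range(8)), lambda: (lambda x: 0.5))
print(conditional_probability(tree, x=[0.0], y=3))   # 0.125 = 0.5 ** 3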
Cite
Text
Beygelzimer et al. "Conditional Probability Tree Estimation Analysis and Algorithms." Conference on Uncertainty in Artificial Intelligence, 2009.
Markdown
[Beygelzimer et al. "Conditional Probability Tree Estimation Analysis and Algorithms." Conference on Uncertainty in Artificial Intelligence, 2009.](https://mlanthology.org/uai/2009/beygelzimer2009uai-conditional/)
BibTeX
@inproceedings{beygelzimer2009uai-conditional,
  title = {{Conditional Probability Tree Estimation Analysis and Algorithms}},
  author = {Beygelzimer, Alina and Langford, John and Lifshits, Yury and Sorkin, Gregory B. and Strehl, Alexander L.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year = {2009},
  pages = {51--58},
  url = {https://mlanthology.org/uai/2009/beygelzimer2009uai-conditional/}
}