Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering
Abstract
This paper considers the problem of finding a tighter upper bound on the minimax regret of patterns, a class used to study large-alphabet distributions that avoids infinite asymptotic regret and redundancy. Our method for upper-bounding the minimax regret uses covering numbers under Kullback-Leibler (KL) divergence as the distance. Compared to existing results by Acharya et al. (2013), we improve the exponent on the logarithmic term, giving a minimax regret bound that matches the best known minimax redundancy bound on patterns.
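The page does not reproduce the formal setup, but the covering argument the abstract invokes rests on standard definitions. The following LaTeX sketch states them under assumed notation; the class \(\mathcal{P}\), sequences \(x^n\), and cover size \(M(\epsilon)\) are illustrative and not taken from this page.

% Minimal sketch, assuming standard notation; not taken from the paper itself.
% Minimax (worst-case) regret of a class P over length-n sequences:
\[
  R^*(\mathcal{P}, n) \;=\; \inf_{q}\, \max_{x^n}\,
    \log \frac{\sup_{p \in \mathcal{P}} p(x^n)}{q(x^n)} .
\]
% Yang--Barron-style covering bound: if q_1, ..., q_M is an epsilon-cover of P
% under KL divergence, the uniform mixture q = (1/M) \sum_i q_i satisfies
% D(p || q) <= log M + epsilon for every p, so optimizing over epsilon gives
\[
  \inf_{q}\, \sup_{p \in \mathcal{P}} D\bigl(p \,\big\|\, q\bigr)
    \;\le\; \inf_{\epsilon > 0} \bigl( \log M(\epsilon) + \epsilon \bigr).
\]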
Cite
Text
Tang. "Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering." Conference on Learning Theory, 2022.Markdown
[Tang. "Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering." Conference on Learning Theory, 2022.](https://mlanthology.org/colt/2022/tang2022colt-minimax/)BibTeX
@inproceedings{tang2022colt-minimax,
  title = {{Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering}},
  author = {Tang, Jennifer},
  booktitle = {Conference on Learning Theory},
  year = {2022},
  pages = {3095--3112},
  volume = {178},
  url = {https://mlanthology.org/colt/2022/tang2022colt-minimax/}
}