Recursive Quantile Estimation: Non-Asymptotic Confidence Bounds
Abstract
This paper considers the recursive estimation of quantiles using the stochastic gradient descent (SGD) algorithm with Polyak-Ruppert averaging. The algorithm offers a computationally and memory-efficient alternative to the usual empirical estimator. Our focus is on studying the non-asymptotic behavior by providing exponentially decreasing tail probability bounds under mild assumptions on the smoothness of the density functions. This novel non-asymptotic result is based on a bound of the moment generating function of the SGD estimate. We apply our result to the problem of best arm identification in a multi-armed stochastic bandit setting under quantile preferences.
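The recursive scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it uses the standard SGD quantile update q_{n+1} = q_n - gamma_n (1{x_{n+1} <= q_n} - tau) with a polynomially decaying step size gamma_n = c / n^alpha, and returns the Polyak-Ruppert average of the iterates. The constants c and alpha below are illustrative choices, not values from the paper.

```python
import random

def recursive_quantile(samples, tau, c=1.0, alpha=0.75):
    """Recursive SGD estimate of the tau-quantile with
    Polyak-Ruppert averaging (illustrative sketch).

    Update rule: q <- q - gamma_n * (1{x <= q} - tau),
    with step size gamma_n = c / n**alpha, alpha in (1/2, 1).
    Returns the running average of the iterates, which only
    needs O(1) memory, unlike the empirical quantile.
    """
    q = 0.0      # current SGD iterate
    avg = 0.0    # Polyak-Ruppert average of the iterates
    for n, x in enumerate(samples, start=1):
        gamma = c / n ** alpha
        grad = (1.0 if x <= q else 0.0) - tau  # subgradient of pinball loss
        q -= gamma * grad
        avg += (q - avg) / n  # online update of the running average
    return avg

if __name__ == "__main__":
    random.seed(0)
    data = (random.gauss(0.0, 1.0) for _ in range(200_000))
    est = recursive_quantile(data, tau=0.9)
    # The true 0.9-quantile of N(0, 1) is approximately 1.2816.
    print(est)
```

Because each sample is processed once and discarded, the estimator runs in a single pass with constant memory; the averaging step is what enables the favorable asymptotic variance that the paper's non-asymptotic bounds refine.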
Cite
Text

Chen et al. "Recursive Quantile Estimation: Non-Asymptotic Confidence Bounds." Journal of Machine Learning Research, 2023.

Markdown

[Chen et al. "Recursive Quantile Estimation: Non-Asymptotic Confidence Bounds." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/chen2023jmlr-recursive/)

BibTeX
@article{chen2023jmlr-recursive,
title = {{Recursive Quantile Estimation: Non-Asymptotic Confidence Bounds}},
author = {Chen, Likai and Keilbar, Georg and Wu, Wei Biao},
journal = {Journal of Machine Learning Research},
year = {2023},
pages = {1--25},
volume = {24},
url = {https://mlanthology.org/jmlr/2023/chen2023jmlr-recursive/}
}