Non-Asymptotic Performance Guarantees for Neural Estimation of F-Divergences
Abstract
Statistical distances (SDs), which quantify the dissimilarity between probability distributions, are central to machine learning and statistics. A modern method for estimating such distances from data relies on parametrizing a variational form by a neural network (NN) and optimizing it. These estimators are abundantly used in practice, but corresponding performance guarantees are partial and call for further exploration. In particular, there seems to be a fundamental tradeoff between the two sources of error involved: approximation and estimation. While the former needs the NN class to be rich and expressive, the latter relies on controlling complexity. This paper explores this tradeoff by means of non-asymptotic error bounds, focusing on three popular choices of SDs—Kullback-Leibler divergence, chi-squared divergence, and squared Hellinger distance. Our analysis relies on non-asymptotic function approximation theorems and tools from empirical process theory. Numerical results validating the theory are also provided.
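To make the estimation recipe described in the abstract concrete, here is a minimal sketch of a neural estimator for one of the three divergences studied, the Kullback-Leibler divergence, using its Donsker-Varadhan variational form with the supremum restricted to a small ReLU network. This is an illustrative assumption-laden sketch, not the paper's exact construction: the toy Gaussian data, network width, learning rate, and number of optimization steps are all hypothetical choices.

import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data (illustrative assumption): P = N(1, 1), Q = N(0, 1) in one dimension,
# so the ground-truth KL divergence is exactly 0.5.
n = 5000
x_p = torch.randn(n, 1) + 1.0   # i.i.d. samples from P
x_q = torch.randn(n, 1)         # i.i.d. samples from Q

# Shallow ReLU network standing in for the variational function f in
#   D_KL(P || Q) = sup_f  E_P[f] - log E_Q[exp(f)]  (Donsker-Varadhan form).
f = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(f.parameters(), lr=1e-3)

for _ in range(2000):
    opt.zero_grad()
    # Empirical version of log E_Q[exp(f)] via logsumexp for numerical stability.
    log_mean_exp_q = torch.logsumexp(f(x_q), dim=0).squeeze() - math.log(n)
    dv_objective = f(x_p).mean() - log_mean_exp_q   # empirical Donsker-Varadhan objective
    (-dv_objective).backward()                      # maximize by minimizing the negative
    opt.step()

with torch.no_grad():
    log_mean_exp_q = torch.logsumexp(f(x_q), dim=0).squeeze() - math.log(n)
    estimate = (f(x_p).mean() - log_mean_exp_q).item()
print(f"Neural estimate of D_KL(P || Q): {estimate:.3f} (ground truth 0.5)")

The sketch also illustrates the tradeoff the abstract highlights: enlarging the network shrinks the approximation error of the variational problem, while for a fixed sample size it inflates the estimation error incurred by replacing expectations with empirical means.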
Cite
Text
Sreekumar et al. "Non-Asymptotic Performance Guarantees for Neural Estimation of F-Divergences." Artificial Intelligence and Statistics, 2021.
Markdown
[Sreekumar et al. "Non-Asymptotic Performance Guarantees for Neural Estimation of F-Divergences." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/sreekumar2021aistats-nonasymptotic/)
BibTeX
@inproceedings{sreekumar2021aistats-nonasymptotic,
title = {{Non-Asymptotic Performance Guarantees for Neural Estimation of F-Divergences}},
author = {Sreekumar, Sreejith and Zhang, Zhengxin and Goldfeld, Ziv},
booktitle = {Artificial Intelligence and Statistics},
year = {2021},
pages = {3322--3330},
volume = {130},
url = {https://mlanthology.org/aistats/2021/sreekumar2021aistats-nonasymptotic/}
}