Goodness-of-Fit Testing for Discrete Distributions via Stein Discrepancy
Abstract
Recent work has combined Stein’s method with reproducing kernel Hilbert space theory to develop nonparametric goodness-of-fit tests for un-normalized probability distributions. However, the currently available tests apply exclusively to distributions with smooth density functions. In this work, we introduce a kernelized Stein discrepancy measure for discrete spaces, and develop a nonparametric goodness-of-fit test for discrete distributions with intractable normalization constants. Furthermore, we propose a general characterization of Stein operators that encompasses both discrete and continuous distributions, providing a recipe for constructing new Stein operators. We apply the proposed goodness-of-fit test to three statistical models involving discrete distributions, and our experiments show that the proposed test typically outperforms a two-sample test based on the maximum mean discrepancy.
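To make the idea concrete, the discrete kernelized Stein discrepancy can be sketched for binary data, where the paper's cyclic permutation operator reduces to a bit flip. The following is a minimal illustrative sketch, not the authors' reference implementation: the exponentiated Hamming kernel, the product-form test model in the usage check, and all function names are assumptions made here for exposition. The key property it relies on is that the difference score uses only a ratio of unnormalized probabilities, so the intractable normalization constant cancels.

```python
import itertools

import numpy as np


def exp_hamming_kernel(x, y):
    """Exponentiated Hamming kernel k(x, y) = exp(-Hamming(x, y) / d)."""
    return np.exp(-np.mean(x != y))


def make_score(log_p_tilde, d):
    """Difference score s_p(x)_i = 1 - p(flip_i x) / p(x).

    Only the unnormalized log-density log_p_tilde is needed:
    the normalization constant cancels in the ratio.
    """
    def score(x):
        s = np.empty(d)
        for i in range(d):
            y = x.copy()
            y[i] = 1 - y[i]  # flip_i: the cyclic shift on {0, 1}
            s[i] = 1.0 - np.exp(log_p_tilde(y) - log_p_tilde(x))
        return s
    return score


def stein_kernel(x, y, score, kernel):
    """Stein-modified kernel kappa_p(x, y); E[kappa_p(x, x')] = 0 when x, x' ~ p."""
    d = len(x)
    sx, sy = score(x), score(y)
    k = kernel(x, y)
    dx = np.empty(d)    # differences of k in the first argument
    dy = np.empty(d)    # differences of k in the second argument
    dxdy = np.empty(d)  # mixed differences
    for i in range(d):
        xf = x.copy(); xf[i] = 1 - xf[i]
        yf = y.copy(); yf[i] = 1 - yf[i]
        dx[i] = k - kernel(xf, y)
        dy[i] = k - kernel(x, yf)
        dxdy[i] = dy[i] - (kernel(xf, y) - kernel(xf, yf))
    return sx @ sy * k - sx @ dy - dx @ sy + dxdy.sum()


def ksd_u_stat(samples, score, kernel):
    """U-statistic estimate of the squared discrete kernelized Stein discrepancy."""
    n = len(samples)
    pairs = itertools.combinations(range(n), 2)
    return 2.0 / (n * (n - 1)) * sum(
        stein_kernel(samples[i], samples[j], score, kernel) for i, j in pairs)
```

As a sanity check, the expectation of `stein_kernel` under two independent draws from the model is exactly zero, which can be verified by enumerating all states of a small binary model; a goodness-of-fit test then compares the U-statistic against a bootstrap threshold.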
Cite
Text
Yang et al. "Goodness-of-Fit Testing for Discrete Distributions via Stein Discrepancy." International Conference on Machine Learning, 2018.

Markdown

[Yang et al. "Goodness-of-Fit Testing for Discrete Distributions via Stein Discrepancy." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/yang2018icml-goodnessoffit/)

BibTeX
@inproceedings{yang2018icml-goodnessoffit,
title = {{Goodness-of-Fit Testing for Discrete Distributions via Stein Discrepancy}},
author = {Yang, Jiasen and Liu, Qiang and Rao, Vinayak and Neville, Jennifer},
booktitle = {International Conference on Machine Learning},
year = {2018},
pages = {5561--5570},
volume = {80},
url = {https://mlanthology.org/icml/2018/yang2018icml-goodnessoffit/}
}