Approximating the Permanent with Deep Rejection Sampling
Abstract
We present a randomized approximation scheme for the permanent of a matrix with nonnegative entries. Our scheme extends a recursive rejection sampling method of Huber and Law (SODA 2008) by replacing the permanent upper bound with a linear combination of the upper bounds of the subproblems at a moderately large depth of the recursion tree. This method, which we call deep rejection sampling, is shown empirically to outperform the basic, depth-zero variant, as well as a related method by Kuck et al. (NeurIPS 2019). We analyze the expected running time of the scheme on random $(0, 1)$-matrices in which each entry is $1$ independently with probability $p$. Our bound is superior to a previous one for $p < 1/5$, and it matches another bound that was previously known to hold only when every row and column has density exactly $p$.
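The abstract describes the sampling scheme only in words. As a rough illustration of the depth-zero variant in the style of Huber and Law, consider the Python sketch below; it is our own toy example, not the paper's code. It uses the simple product-of-row-sums upper bound, whereas Huber and Law (and the paper) use tighter bounds, and the paper further replaces the root bound with a linear combination of subproblem bounds at depth $d$. The names row_sum_bound, sample_once, and estimate_permanent are hypothetical.

import numpy as np

def row_sum_bound(A):
    # Upper bound on per(A) for nonnegative A: the product of row sums.
    # It is "self-reducible": sum_j A[0, j] * U(minor_j) <= U(A), which is
    # the property the rejection sampler needs. (Huber and Law, and the
    # paper, use tighter bounds; this one keeps the sketch short.)
    return float(np.prod(A.sum(axis=1)))

def sample_once(A, rng):
    # One trial: assign a column to each row in turn. The probability of
    # surviving all n rows is exactly per(A) / U(A).
    sub = A
    while sub.shape[0] > 0:
        u = row_sum_bound(sub)
        if u == 0.0:
            return False
        # Weight of assigning column k to the current first row.
        weights = np.array([
            sub[0, k] * row_sum_bound(np.delete(sub[1:], k, axis=1))
            for k in range(sub.shape[1])
        ])
        cum = np.cumsum(weights / u)  # total mass <= 1 by self-reducibility
        k = int(np.searchsorted(cum, rng.random(), side="right"))
        if k >= sub.shape[1]:
            return False  # landed in the leftover mass: reject this trial
        sub = np.delete(sub[1:], k, axis=1)
    return True

def estimate_permanent(A, trials=100000, seed=0):
    # Unbiased estimator: U(A) times the empirical acceptance rate.
    A = np.asarray(A, dtype=float)
    rng = np.random.default_rng(seed)
    hits = sum(sample_once(A, rng) for _ in range(trials))
    return row_sum_bound(A) * hits / trials

# Example: the permanent of the all-ones 3x3 matrix is 3! = 6.
print(estimate_permanent(np.ones((3, 3))))

Since the expected number of trials per accepted sample scales as $U(A)/\operatorname{per}(A)$, the paper's replacement of the root bound by a tighter linear combination of depth-$d$ subproblem bounds raises the acceptance rate, at the cost of extra work at depth $d$.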
Cite
Text
Harviainen et al. "Approximating the Permanent with Deep Rejection Sampling." Neural Information Processing Systems, 2021.Markdown
[Harviainen et al. "Approximating the Permanent with Deep Rejection Sampling." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/harviainen2021neurips-approximating/)BibTeX
@inproceedings{harviainen2021neurips-approximating,
  title = {{Approximating the Permanent with Deep Rejection Sampling}},
  author = {Harviainen, Juha and Röyskö, Antti and Koivisto, Mikko},
  booktitle = {Neural Information Processing Systems},
  year = {2021},
  url = {https://mlanthology.org/neurips/2021/harviainen2021neurips-approximating/}
}