Rapidly Mixing Multiple-Try Metropolis Algorithms for Model Selection Problems
Abstract
The multiple-try Metropolis (MTM) algorithm extends the Metropolis-Hastings (MH) algorithm by selecting the proposed state from multiple trials according to some weight function. Although MTM has gained great popularity owing to its faster empirical convergence and mixing than the standard MH algorithm, its theoretical mixing property is rarely studied in the literature due to its complex proposal scheme. We prove that MTM can achieve a mixing time bound smaller than that of MH by a factor of the number of trials under a general setting applicable to high-dimensional model selection problems with discrete state spaces. Our theoretical results motivate a new class of weight functions called locally balanced weight functions and guide the choice of the number of trials, which leads to improved performance over standard MTM algorithms. We support our theoretical results by extensive simulation studies and real data applications with several Bayesian model selection problems.
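The proposal scheme described above can be sketched in a few lines: draw several trial states, select one with probability proportional to a weight function, then accept or reject via the generalized MH ratio of forward and backward weight sums. Below is a minimal illustration on a toy discrete target (a discretized Gaussian), using the locally balanced weight w(y|x) = sqrt(pi(y)/pi(x)) with a symmetric uniform proposal; the target, state space, and all parameter choices are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized discrete target on {0, ..., n-1}: a discretized
# Gaussian centered at n/2 with standard deviation 5 (illustrative only).
n = 50
log_pi = -0.5 * ((np.arange(n) - n / 2) ** 2) / 25.0

def log_weight(y, x):
    # Locally balanced weight w(y|x) = sqrt(pi(y)/pi(x)), in log space.
    return 0.5 * (log_pi[y] - log_pi[x])

def mtm_step(x, K):
    # 1. Draw K trials from a symmetric proposal (here: uniform on the space).
    trials = rng.integers(0, n, size=K)
    lw = np.array([log_weight(t, x) for t in trials])
    w = np.exp(lw - lw.max())
    # 2. Select one trial y with probability proportional to its weight.
    y = trials[rng.choice(K, p=w / w.sum())]
    # 3. Draw K-1 reference points from the proposal at y, and include x.
    refs = np.append(rng.integers(0, n, size=K - 1), x)
    # 4. Generalized MH acceptance: ratio of forward to backward weight sums.
    log_num = np.logaddexp.reduce(np.array([log_weight(t, x) for t in trials]))
    log_den = np.logaddexp.reduce(np.array([log_weight(r, y) for r in refs]))
    if np.log(rng.random()) < log_num - log_den:
        return y
    return x

# Run a short chain; it should concentrate near the mode at n/2 = 25.
x = 0
samples = []
for _ in range(5000):
    x = mtm_step(x, K=10)
    samples.append(x)
```

With K = 1 this reduces to ordinary Metropolis-Hastings; larger K lets each iteration screen more candidates through the weight function, which is the mechanism behind the mixing-time improvement the paper analyzes.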
Cite
Text
Chang et al. "Rapidly Mixing Multiple-Try Metropolis Algorithms for Model Selection Problems." Neural Information Processing Systems, 2022.
Markdown
[Chang et al. "Rapidly Mixing Multiple-Try Metropolis Algorithms for Model Selection Problems." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/chang2022neurips-rapidly/)
BibTeX
@inproceedings{chang2022neurips-rapidly,
title = {{Rapidly Mixing Multiple-Try Metropolis Algorithms for Model Selection Problems}},
author = {Chang, Hyunwoong and Lee, Changwoo and Luo, Zhao Tang and Sang, Huiyan and Zhou, Quan},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/chang2022neurips-rapidly/}
}