Client-Only Distributed Markov Chain Monte Carlo Sampling over a Network
Abstract
We aim to sample from a target $\exp\left(-\sum_{i=1}^n f_i(x\mid\mathcal{D}_i)\right)$, where each client $i$ only has access to its local data $\mathcal{D}_i$. We present a fully distributed Markov Chain Monte Carlo (MCMC) sampler that operates through client-to-client communication, eliminating the need for additional centralized servers. Unlike MCMC algorithms that rely on server-client structures, our proposed sampler is entirely distributed, enhancing security and robustness through decentralized communication. In contrast to existing decentralized algorithms based on Langevin dynamics, our sampler utilizes blocked Gibbs sampling on an augmented distribution. Furthermore, we establish a non-asymptotic analysis of our sampler, employing innovative techniques. This study contributes one of the first analyses of the non-asymptotic behavior of a fully distributed sampler arising from Gibbs sampling.
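To make the target concrete, consider a toy sketch (not the paper's blocked Gibbs algorithm): with hypothetical quadratic local potentials $f_i(x) = (x-\mu_i)^2/2$, the target $\exp(-\sum_i f_i(x))$ is a Gaussian whose mean is the average of the $\mu_i$. A plain unadjusted Langevin sampler with access to all potentials recovers this mean; the paper's contribution is achieving the same goal when each $f_i$ lives on a separate client and only peer-to-peer communication is allowed.

```python
import numpy as np

# Toy illustration of the target exp(-sum_i f_i(x)), NOT the
# paper's distributed sampler. We use hypothetical quadratic
# potentials f_i(x) = (x - mu_i)^2 / 2, so the target is a
# Gaussian with mean = average of mu_i and variance = 1/n.
rng = np.random.default_rng(0)
mus = np.array([0.0, 1.0, 2.0, 3.0])  # hypothetical local data summaries

def grad_log_target(x):
    # d/dx log pi(x) = -sum_i f_i'(x) = -sum_i (x - mu_i)
    return -np.sum(x - mus)

# Unadjusted Langevin dynamics on the full (summed) potential,
# which assumes centralized access to every f_i.
step = 0.05
x = 0.0
samples = []
for t in range(20000):
    x = x + step * grad_log_target(x) + np.sqrt(2 * step) * rng.standard_normal()
    if t > 2000:  # discard burn-in
        samples.append(x)

print(np.mean(samples))  # close to mean(mus) = 1.5
```

In the distributed setting of the paper, no single node can evaluate `grad_log_target`, since each term $f_i$ is held by a different client; that constraint is what the client-to-client Gibbs construction addresses.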
Cite
Text
Yuan et al. "Client-Only Distributed Markov Chain Monte Carlo Sampling over a Network." Transactions on Machine Learning Research, 2025.
Markdown
[Yuan et al. "Client-Only Distributed Markov Chain Monte Carlo Sampling over a Network." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/yuan2025tmlr-clientonly/)
BibTeX
@article{yuan2025tmlr-clientonly,
title = {{Client-Only Distributed Markov Chain Monte Carlo Sampling over a Network}},
author = {Yuan, Bo and Fan, Jiaojiao and Liang, Jiaming and Chen, Yongxin},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/yuan2025tmlr-clientonly/}
}