A Generic Approach to Accelerating Belief Propagation Based Incomplete Algorithms for DCOPs via a Branch-and-Bound Technique
Abstract
Belief propagation approaches, such as Max-Sum and its variants, are important methods for solving large-scale Distributed Constraint Optimization Problems (DCOPs). However, for problems with n-ary constraints, these algorithms face a major challenge since their computational complexity scales exponentially with the number of variables a function involves. In this paper, we present a generic and easy-to-use method based on a branch-and-bound technique, called Function Decomposing and State Pruning (FDSP), to address this issue. We theoretically prove that FDSP provides monotonically non-increasing upper bounds and speeds up belief propagation based incomplete DCOP algorithms without affecting solution quality. Our empirical evaluation also indicates that FDSP reduces the search space by at least 97% and effectively accelerates Max-Sum compared with the state of the art.
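To make the branch-and-bound idea concrete, here is a minimal Python sketch of bounded search over the assignment space of an n-ary utility function. It is an illustration under our own assumptions, not the paper's FDSP implementation: the names `bnb_maximize` and `upper_bound` and the toy ternary utility are hypothetical, and FDSP additionally derives its monotonically non-increasing bounds by decomposing the constraint function itself.

```python
def bnb_maximize(variables, domains, utility, bound):
    """Maximize `utility` over all joint assignments, pruning subtrees
    whose optimistic `bound` cannot beat the incumbent best value."""
    best_value = float("-inf")
    best_assignment = None

    def search(partial):
        nonlocal best_value, best_assignment
        if len(partial) == len(variables):      # complete assignment: score exactly
            value = utility(partial)
            if value > best_value:
                best_value, best_assignment = value, dict(partial)
            return
        var = variables[len(partial)]           # next unassigned variable
        for val in domains[var]:
            partial[var] = val
            if bound(partial) > best_value:     # prune hopeless subtrees
                search(partial)
            del partial[var]

    search({})
    return best_assignment, best_value


# Hypothetical toy instance: one ternary constraint over three variables.
variables = ["x1", "x2", "x3"]
domains = {v: [0, 1, 2] for v in variables}

def utility(a):
    # reward large values, penalize spread among the assigned values
    vals = [a[v] for v in variables]
    return sum(vals) - (max(vals) - min(vals)) ** 2

def upper_bound(partial):
    # assigned part scored by its raw sum; each unassigned variable is
    # credited with the best value in its domain; the non-negative spread
    # penalty is dropped, so this never underestimates the true maximum
    fixed = sum(partial.values())
    free = sum(max(domains[v]) for v in variables if v not in partial)
    return fixed + free

print(bnb_maximize(variables, domains, utility, upper_bound))
# -> ({'x1': 2, 'x2': 2, 'x3': 2}, 6)
```

Because the bound never underestimates the best completion of a partial assignment, pruning cannot discard the optimum; this mirrors the paper's claim that FDSP accelerates the search without affecting solution quality.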
Cite
Text
Chen et al. "A Generic Approach to Accelerating Belief Propagation Based Incomplete Algorithms for DCOPs via a Branch-and-Bound Technique." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33016038
Markdown
[Chen et al. "A Generic Approach to Accelerating Belief Propagation Based Incomplete Algorithms for DCOPs via a Branch-and-Bound Technique." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/chen2019aaai-generic/) doi:10.1609/AAAI.V33I01.33016038
BibTeX
@inproceedings{chen2019aaai-generic,
title = {{A Generic Approach to Accelerating Belief Propagation Based Incomplete Algorithms for DCOPs via a Branch-and-Bound Technique}},
author = {Chen, Ziyu and Jiang, Xingqiong and Deng, Yanchen and Chen, Dingding and He, Zhongshi},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {6038--6045},
doi = {10.1609/AAAI.V33I01.33016038},
url = {https://mlanthology.org/aaai/2019/chen2019aaai-generic/}
}