Preprocessing for Propositional Model Counting
Abstract
This paper is concerned with preprocessing techniques for propositional model counting. We have implemented a preprocessor which includes several elementary preprocessing techniques: occurrence reduction, vivification, backbone identification, as well as equivalence, AND and XOR gate identification and replacement. We performed intensive experiments on a large number of benchmarks drawn from many families. Two approaches to model counting have been considered downstream: "direct" model counting using Cachet, and compilation-based model counting using the C2D compiler. The experimental results we have obtained show that our preprocessor is both efficient and robust.
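Among the techniques listed in the abstract, backbone identification can be illustrated compactly: a literal is a backbone literal of a satisfiable CNF formula F exactly when F together with the opposite literal is unsatisfiable. The sketch below shows the standard assumption-based way of computing backbones with the PySAT library; it is an illustrative assumption-laden example (function name `backbone`, choice of solver, and pruning heuristic are ours), not the authors' preprocessor.

```python
# Illustrative sketch of backbone identification (not the paper's code).
# A literal l is a backbone of a satisfiable CNF F iff F /\ -l is unsatisfiable.
from pysat.formula import CNF
from pysat.solvers import Glucose3

def backbone(cnf):
    """Return the set of backbone literals of a satisfiable CNF formula."""
    with Glucose3(bootstrap_with=cnf.clauses) as solver:
        assert solver.solve(), "formula must be satisfiable"
        # Backbone literals must agree with every model, so the first model
        # gives the initial candidate set.
        candidates = set(solver.get_model())
        backbones = set()
        for lit in list(candidates):
            # If assuming the opposite literal makes F unsatisfiable,
            # lit takes the same value in every model: it is a backbone.
            if not solver.solve(assumptions=[-lit]):
                backbones.add(lit)
                solver.add_clause([lit])  # freeze it as a unit clause
            else:
                # Otherwise the new model rules out some candidates.
                candidates &= set(solver.get_model())
        return backbones

if __name__ == "__main__":
    f = CNF(from_clauses=[[1], [-1, 2], [2, 3]])
    print(sorted(backbone(f)))  # 1 and 2 are backbone literals here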
Cite
Text
Lagniez and Marquis. "Preprocessing for Propositional Model Counting." AAAI Conference on Artificial Intelligence, 2014. doi:10.1609/AAAI.V28I1.9116
Markdown
[Lagniez and Marquis. "Preprocessing for Propositional Model Counting." AAAI Conference on Artificial Intelligence, 2014.](https://mlanthology.org/aaai/2014/lagniez2014aaai-preprocessing/) doi:10.1609/AAAI.V28I1.9116
BibTeX
@inproceedings{lagniez2014aaai-preprocessing,
title = {{Preprocessing for Propositional Model Counting}},
author = {Lagniez, Jean-Marie and Marquis, Pierre},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2014},
pages = {2688--2694},
doi = {10.1609/AAAI.V28I1.9116},
url = {https://mlanthology.org/aaai/2014/lagniez2014aaai-preprocessing/}
}