Abstract
Weighted model counting has recently been applied with great success to the problem of probabilistic inference in Bayesian networks. The probability distribution is encoded into a Boolean normal form and compiled to a target language, yielding a more efficient representation of the local structure expressed among conditional probabilities. We show that further improvements are possible by exploiting knowledge that is lost during the encoding phase and by incorporating it into a compiler inspired by Satisfiability Modulo Theories. Constraints among variables are used as a background theory, which allows us to optimize the Shannon decomposition. We propose a new language, called Weighted Positive Binary Decision Diagrams, that reduces the cost of probabilistic inference by using this decomposition variant to induce an arithmetic circuit of reduced size.
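The core operation behind this line of work is weighted model counting driven by the Shannon decomposition: WMC(f) = w(x) * WMC(f | x=1) + w(-x) * WMC(f | x=0). The sketch below is a minimal, illustrative Python implementation of that recursion over a CNF encoding, not the paper's compiler; it assumes literal weights that sum to one per variable (as in probabilistic encodings), and the names `wmc` and `condition` are hypothetical. It shows the plain Shannon decomposition only, not the optimized positive variant or the constraint-aware compilation proposed in the paper.

```python
# Illustrative sketch (not the paper's compiler): weighted model counting by
# Shannon decomposition over a CNF. A formula is a frozenset of clauses and
# each clause is a frozenset of integer literals (negative = negation).

def condition(cnf, lit):
    """Condition the CNF on a literal: drop satisfied clauses, shrink the rest."""
    out = set()
    for clause in cnf:
        if lit in clause:
            continue                      # clause satisfied by lit
        reduced = clause - {-lit}
        if not reduced:
            return None                   # empty clause: this branch is unsatisfiable
        out.add(reduced)
    return frozenset(out)

def wmc(cnf, weight):
    """WMC(f) = w(v) * WMC(f | v) + w(-v) * WMC(f | -v).

    Assumes weight[v] + weight[-v] == 1 for every variable (probabilistic
    encoding), so a fully satisfied residual formula counts as 1.
    """
    if cnf is None:
        return 0.0                        # contradiction
    if not cnf:
        return 1.0                        # all clauses satisfied
    v = abs(next(iter(next(iter(cnf)))))  # pick any variable still occurring
    return (weight[v]  * wmc(condition(cnf, v),  weight) +
            weight[-v] * wmc(condition(cnf, -v), weight))

# Example: f = (x1 or x2), with weights encoding P(x1)=0.3 and P(x2)=0.6.
cnf = frozenset({frozenset({1, 2})})
weights = {1: 0.3, -1: 0.7, 2: 0.6, -2: 0.4}
print(wmc(cnf, weights))                  # 0.72 = 1 - 0.7 * 0.4, up to float rounding
```

A real compiler would memoize shared subformulas, which is what compiling to a decision diagram achieves, and evaluating that diagram with the weights yields the arithmetic circuit; the WPBDD language described in the abstract additionally exploits constraints from the encoding to specialize this decomposition, which the sketch above does not attempt.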
| Original language | English |
| --- | --- |
| Pages (from-to) | 411-432 |
| Number of pages | 22 |
| Journal | International Journal of Approximate Reasoning |
| Volume | 90 |
| DOIs | |
| Publication status | Published - Nov 2017 |
| Externally published | Yes |
Keywords
- Bayesian networks
- Binary decision diagrams
- Knowledge compilation
- Probabilistic inference
- Weighted model counting