Parallel probabilistic inference by weighted model counting

Giso H. Dal, Alfons W. Laarman, Peter J.F. Lucas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

Knowledge compilation as part of the Weighted Model Counting approach has proven to be an efficient tool for exact inference in probabilistic graphical models, by exploiting structures that more traditional methods cannot. The availability of affordable high-performance commodity hardware has inspired other inference approaches to exploit parallelism, to great success. In this paper, we explore the possibilities for Weighted Model Counting. We have empirically confirmed that exploiting parallelism yields substantial speedups on a set of real-world Bayesian networks.
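
For readers unfamiliar with the underlying idea, the sketch below is a minimal, brute-force illustration of weighted model counting: summing, over all satisfying assignments of a propositional formula, the product of the weights of the literals in each assignment. It is a toy Python example written for this summary, not the knowledge-compilation-based or parallel implementation described in the paper; all names in it are hypothetical.

    from itertools import product

    def weighted_model_count(variables, weights, is_model):
        """Brute-force WMC: enumerate all assignments, keep the models,
        and sum the product of per-literal weights for each model."""
        total = 0.0
        for values in product([False, True], repeat=len(variables)):
            assignment = dict(zip(variables, values))
            if is_model(assignment):
                w = 1.0
                for var, val in assignment.items():
                    w *= weights[(var, val)]
                total += w
        return total

    # Toy formula (x OR y) with literal weights; the weighted model count
    # equals 1 - w(~x) * w(~y) = 1 - 0.7 * 0.6 = 0.58.
    weights = {("x", True): 0.3, ("x", False): 0.7,
               ("y", True): 0.4, ("y", False): 0.6}
    print(weighted_model_count(["x", "y"], weights,
                               lambda a: a["x"] or a["y"]))  # 0.58

Exact inference in a Bayesian network can be reduced to such counts on a weighted CNF encoding of the network; the enumeration above is exponential, which is why knowledge compilation (and, in this paper, parallelism) is used instead.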
Original language: English
Title of host publication: Proceedings of the Ninth International Conference on Probabilistic Graphical Models
Subtitle of host publication: 11-14 September 2018, Prague, Czech Republic
Editors: Václav Kratochvíl, Milan Studený
Publisher: MLResearchPress
Pages: 97-108
Number of pages: 12
Publication status: Published - 2018
Externally published: Yes
Event: 9th International Conference on Probabilistic Graphical Models, PGM 2018 - Prague, Czech Republic
Duration: 11 Sept 2018 - 14 Sept 2018
Conference number: 9

Publication series

Name: Proceedings of Machine Learning Research (PMLR)
Publisher: JMLR
Volume: 72
ISSN (Electronic): 2640-3498

Conference

Conference: 9th International Conference on Probabilistic Graphical Models, PGM 2018
Abbreviated title: PGM 2018
Country/Territory: Czech Republic
City: Prague
Period: 11/09/18 - 14/09/18
