model counting is the problem of determining the sum of
the weights of the worlds that satisfy a KB:
Weighted model counting (WMC)
Input: CNF C and set of literal weights W
Output: Sum of weights of worlds that satisfy C
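For concreteness, the following brute-force Python sketch implements this specification directly by enumerating all worlds; it is our own illustration, not the counting algorithm developed below. Clauses are lists of (atom, sign) literals and unspecified literal weights default to 1.

from itertools import product

def brute_force_wmc(cnf, atoms, weights):
    """cnf: list of clauses, each a list of (atom, sign) literals;
    weights: dict mapping (atom, sign) -> weight."""
    total = 0.0
    for values in product([True, False], repeat=len(atoms)):
        world = dict(zip(atoms, values))
        # count only the worlds that satisfy every clause of C
        if all(any(world[a] == s for a, s in clause) for clause in cnf):
            weight = 1.0
            for a in atoms:  # weight of a world: product of its true literals' weights
                weight *= weights.get((a, world[a]), 1.0)
            total += weight
    return total

For instance, brute_force_wmc([[("A", True)]], ["A"], {("A", False): 0.5}) returns 1.0, since only the world in which A is true satisfies the single clause.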
Figure 1 depicts graphically the set of inference problems addressed by this paper. Generality increases in the
direction of the arrows. We first propose an algorithm for
propositional weighted model counting and then lift it to
the first-order level. The resulting algorithm is applicable to
all the problems in the diagram. (Weighted SAT/MPE inference requires replacing sums with maxes and adding a traceback or backtracking step; we do not pursue it here and leave it for future work.)
4. PROPOSITIONAL CASE
This section generalizes the Bayesian network inference
techniques in Darwiche [4] and Sang et al. [22] to arbitrary propositional PKBs, evidence, and query formulas. Although this
is of value in its own right, its main purpose is to lay the
groundwork for the first-order case.
The probability of a formula is the sum of the probabilities
of the worlds that satisfy it. Thus to compute the probability of
a formula Q given a PKB K it suffices to compute the partition
function of K with and without Q added as a hard formula:
P(Q | K) = [Σx 1Q(x) ∏i Φi(x)] / [Σx ∏i Φi(x)] = Z(K ∪ {(Q, 0)}) / Z(K)    (2)

where 1Q(x) is the indicator function (1 if Q is true in x and 0 otherwise).
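In code, Equation 2 is simply a ratio of two partition function computations. The Python sketch below assumes a partition_function routine (hypothetical here; it can be supplied by the reduction to weighted model counting described next) and represents a PKB as a list of (formula, φ) pairs.

def query_probability(pkb, query, partition_function):
    # P(Q | K) = Z(K ∪ {(Q, 0)}) / Z(K): adding (Q, 0) makes Q a hard formula
    hardened = pkb + [(query, 0.0)]
    return partition_function(hardened) / partition_function(pkb)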
In turn, the computation of partition functions can be reduced
to weighted model counting using the procedure in Algorithm 2.
This replaces each soft formula Fi in K by a corresponding
hard formula Fi ⇔ Ai, where Ai is a new atom, and assigns to
every ¬Ai literal a weight of φi (the value of the potential Φi
when Fi is false).
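As an illustration (not the paper's code), the Python sketch below carries out this conversion for the special case in which every formula in the PKB is already a clause, i.e., a disjunction of literals; the general case additionally requires converting the Fi ⇔ Ai equivalences to CNF. The auxiliary atom names are our own.

def wcnf(pkb):
    """Sketch of WCNF for PKBs whose formulas are clauses.
    pkb: list of (clause, phi) pairs, where a clause is a list of
    (atom, sign) literals and phi == 0 marks a hard formula.
    Returns (clauses, weights); literals absent from weights have weight 1."""
    clauses, weights = [], {}
    for i, (clause, phi) in enumerate(pkb):
        if phi > 0:                                    # soft formula F_i
            a = "_A%d" % i                             # fresh auxiliary atom (our naming)
            # A_i => F_i becomes the clause  ~A_i v l_1 v ... v l_k
            clauses.append(frozenset(clause) | {(a, False)})
            # F_i => A_i becomes  A_i v ~l_j  for every literal l_j of F_i
            for atom, sign in clause:
                clauses.append(frozenset({(a, True), (atom, not sign)}))
            weights[(a, False)] = phi                  # weight of the literal ~A_i
        else:
            clauses.append(frozenset(clause))          # hard formulas are kept as is
    return clauses, weights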
Theorem 1. Z(K) = WMC( WCNF(K) ).
Proof. If a world violates any of the hard clauses in K or
any of the Fi ⇔ Ai equivalences, it does not satisfy C and is
therefore not counted. The weight of each remaining world
x is the product of the weights of the literals that are true in
x. By the Fi ⇔ Ai equivalences and the weights assigned by
WCNF(K), this is ∏i Φi(x). (Recall that Φi(x) = 1 if Fi is true in
x and Φi(x) = φi otherwise.) Thus x’s weight is the unnormalized probability of x under K. Summing these yields the partition function Z(K).
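As a small check of Theorem 1, consider the PKB K = {(A, 0.5)} over the single atom A, so that Z(K) = 1 + 0.5 = 1.5. WCNF(K) introduces a new atom A1 and the hard clauses ¬A ∨ A1 and A ∨ ¬A1, with W¬A1 = 0.5 and all other literal weights 1. The satisfying worlds are {A, A1}, with weight 1, and {¬A, ¬A1}, with weight 0.5, so WMC(WCNF(K)) = 1.5 as well.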
Equation 2 and Theorem 1 lead to Algorithm 3 for PTP.
(Compare with Algorithm 1.) WMC(C, W) can be any weighted
model counting algorithm [3]. Most model counters are variations of Relsat, itself an extension of DPLL [2]. Relsat splits on
atoms until the CNF is decomposed into sub-CNFs sharing
no atoms, and recurses on each sub-CNF. This decomposition is crucial to the efficiency of the algorithm. In this paper
we use a weighted version of Relsat, shown in Algorithm 4.
A(C) is the set of atoms that appear in C. C|A denotes the CNF obtained by removing the literal A and its negation ¬A from all clauses in which they appear, and setting to satisfied all clauses in which A appears. Notice that, unlike in DPLL,
satisfied clauses cannot simply be deleted, because we need
to keep track of which atoms they are over for model counting purposes. However, they can be ignored in the decomposition step, since they introduce no dependencies. The atom
to split on in the splitting step can be chosen using various
heuristics [23].
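As one possible concrete rendering of Algorithm 4 (a sketch under our own representation choices, not the paper's implementation), the Python below encodes clauses as frozensets of (atom, sign) literals, carries the set of unassigned atoms explicitly instead of retaining satisfied clauses, treats missing literal weights as 1, and splits on an arbitrary atom rather than using a real heuristic.

from math import prod

def condition(clauses, atom, value):
    """Build C|A (value=True) or C|¬A (value=False): clauses containing the
    chosen literal are satisfied and dropped here (the caller tracks their
    atoms separately); the opposite literal is removed from the rest."""
    result = []
    for c in clauses:
        if (atom, value) in c:
            continue
        result.append(frozenset(l for l in c if l != (atom, not value)))
    return result

def components(clauses):
    """Group clauses into components sharing no atoms (decomposition step)."""
    comps = []                                   # each entry: [clause list, atom set]
    for c in clauses:
        group = [[c], {a for a, _ in c}]
        for other in comps[:]:
            if other[1] & group[1]:
                comps.remove(other)
                group[0].extend(other[0])
                group[1] |= other[1]
        comps.append(group)
    return comps

def wmc(clauses, atoms, weights):
    """clauses: list of frozensets of (atom, sign) literals; atoms: atoms of the
    current sub-CNF; weights: dict (atom, sign) -> weight, defaulting to 1."""
    w = lambda a, s: weights.get((a, s), 1.0)
    if any(len(c) == 0 for c in clauses):        # an empty unsatisfied clause
        return 0.0
    if not clauses:                              # all clauses satisfied
        return prod(w(a, True) + w(a, False) for a in atoms)
    comps = components(clauses)
    constrained = set().union(*(g[1] for g in comps))
    result = prod(w(a, True) + w(a, False) for a in atoms - constrained)
    for comp_clauses, comp_atoms in comps:       # decomposition step
        a = next(iter(comp_atoms))               # splitting step (naive atom choice)
        result *= (w(a, True) * wmc(condition(comp_clauses, a, True), comp_atoms - {a}, weights)
                   + w(a, False) * wmc(condition(comp_clauses, a, False), comp_atoms - {a}, weights))
    return result

Running it on the output of the wcnf sketch above, e.g. wmc(clauses, {a for c in clauses for a, _ in c}, weights), computes Z(K) as in Theorem 1.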
Figure 1. Inference problems addressed in this paper. TP0 and TP1 are propositional and first-order theorem proving, respectively; PI is probabilistic inference (computing marginals); MPE is computing the most probable explanation; SAT is satisfiability; MC is model counting; W is weighted; and L is lifted. A = B means A can be reduced to B.
[Diagram: nodes TP0 = SAT, MPE = WSAT, PI = WMC, MC, TP1, LMC, LWSAT, and PTP = LWMC, arranged along the counting, weighted, and lifted dimensions.]
Algorithm 3 PTP(PKB K, query Q)
KQ ← K ∪ {(Q, 0)}
return WMC(WCNF(KQ)) / WMC(WCNF(K))
Algorithm 4 WMC(CNF C, weights W)
// Base case
if all clauses in C are satisfied then
return ∏A∈A(C) (WA + W¬A)
if C has an empty unsatisfied clause then return 0
// Decomposition step
if C can be partitioned into CNFs C1, …, Ck sharing no atoms then
return ∏i=1..k WMC(Ci, W)
// Splitting step
Choose an atom A
return WA WMC(C|A, W) + W¬A WMC(C|¬A, W)
Algorithm 2 WCNF(PKB K)
for all (Fi, φi) ∈ K s.t. φi > 0 do
K ← K ∪ {(Fi ⇔ Ai, 0)} \ {(Fi, φi)}
C ← CNF(K)
for all ¬Ai literals do W¬Ai ← φi
for all other literals L do WL ← 1
return (C, W)