
Understanding Modern SAT Solvers. VTSA12 Summer School 2012, Verification Technology, Systems & Applications, September 2012, Max Planck Institut Informatik, Saarbrücken, Germany. Armin Biere, Institute for Formal Models and Verification.


1. Conjunctive Normal Form (Encoding 21)

Definition: a formula in Conjunctive Normal Form (CNF) is a conjunction of clauses C1 ∧ C2 ∧ ... ∧ Cn, where each clause C is a disjunction of literals C = L1 ∨ ... ∨ Lm, and each literal is either a plain variable x or a negated variable ¬x.

Example: (a ∨ b ∨ c) ∧ (a ∨ b) ∧ (a ∨ c)

Note 1: there are two notations for negation: an overline as in x̄ and ¬ as in ¬x.
Note 2: the original SAT problem is actually formulated for CNF.
Note 3: solvers (mostly) expect CNF as input.

2. DIMACS Format, Example 1 (Encoding 22)

- common ASCII file format of SAT solvers, used by the SAT competitions
- variables are represented as natural numbers, literals as integers
- header "p cnf <vars> <clauses>"; comment lines start with "c"

In order to show the validity of b ∨ (a ∧ c) ⇐ (a ∨ b) ∧ (b ∨ c), negate it, giving ¬(b ∨ (a ∧ c)) ∧ (a ∨ b) ∧ (b ∨ c), simplify, and show unsatisfiability of ¬b ∧ (¬a ∨ ¬c) ∧ (a ∨ b) ∧ (b ∨ c):

c the first two lines are comments
c ex1.cnf: a=1, b=2, c=3
p cnf 3 4
-2 0
-1 -3 0
1 2 0
2 3 0

3. PicoSAT API for Constructing CNFs, Example 1 (PicoSAT API 23)

// compile with: gcc -o ex1 ex1.c picosat.o
#include "picosat.h"
#include <stdio.h>

int main () {
  int res;
  picosat_init ();
  picosat_add (-2); picosat_add (0);
  picosat_add (-1); picosat_add (-3); picosat_add (0);
  picosat_add (1); picosat_add (2); picosat_add (0);
  picosat_add (2); picosat_add (3); picosat_add (0);
  res = picosat_sat (-1);
  if (res == 10) printf ("s SATISFIABLE\n");
  else if (res == 20) printf ("s UNSATISFIABLE\n");
  else printf ("s UNKNOWN\n");
  picosat_reset ();
  return res;
}

4. Satisfying Assignments, Example 2 (Encoding 24)

Assume the invalid equivalence resp. implication (a ∨ b) ⇒ (a xor b). Its negation (a ∨ b) ∧ (a = b) as CNF: (a ∨ b) ∧ (¬a ∨ b) ∧ (¬b ∨ a).

c ex2.cnf: a=1, b=2
p cnf 2 3
1 2 0
-1 2 0
-2 1 0

The SAT solver then allows to extract one satisfying assignment:

$ picosat ex2.cnf
s SATISFIABLE
v 1 2 0

This is the only one, since "assuming" the opposite values individually is UNSAT:

$ picosat ex2.cnf -a -1; picosat ex2.cnf -a -2
s UNSATISFIABLE
s UNSATISFIABLE

5. Example of Tseitin Transformation: Circuit to CNF (Encoding 25) [Tseitin'68]

(figure: circuit with inputs a, b, c, internal signals x, y, u, v, w and output o)

CNF:
o ∧ (x ↔ a ∧ c)
  ∧ (y ↔ b ∨ x)
  ∧ (u ↔ a ∨ b)
  ∧ (v ↔ b ∨ c)
  ∧ (w ↔ u ∧ v)
  ∧ (o ↔ y ⊕ w)

expanding, for instance, the first gate constraint:
o ∧ (x → a) ∧ (x → c) ∧ (x ← a ∧ c) ∧ ...
o ∧ (¬x ∨ a) ∧ (¬x ∨ c) ∧ (x ∨ ¬a ∨ ¬c) ∧ ...

6. Algorithmic Description of Tseitin Transformation (Encoding 26) [Tseitin'68]

1. generate a new variable x_s for each non-input circuit signal s
2. for each gate produce complete input/output constraints as clauses
3. collect all constraints in a big conjunction

The transformation is satisfiability equivalent: the result is satisfiable if and only if the original formula is satisfiable. It is not equivalent to the original formula, since it has new variables; just project satisfying assignments onto the original variables.

7. Tseitin Transformation: Input/Output Constraints (Encoding 27)

Negation:
x ↔ ¬y ⇔ (x → ¬y) ∧ (¬y → x)
        ⇔ (¬x ∨ ¬y) ∧ (y ∨ x)

Disjunction:
x ↔ (y ∨ z) ⇔ (y → x) ∧ (z → x) ∧ (x → (y ∨ z))
            ⇔ (¬y ∨ x) ∧ (¬z ∨ x) ∧ (¬x ∨ y ∨ z)

Conjunction:
x ↔ (y ∧ z) ⇔ (x → y) ∧ (x → z) ∧ ((y ∧ z) → x)
            ⇔ (¬x ∨ y) ∧ (¬x ∨ z) ∧ (¬(y ∧ z) ∨ x)
            ⇔ (¬x ∨ y) ∧ (¬x ∨ z) ∧ (¬y ∨ ¬z ∨ x)

Equivalence:
x ↔ (y ↔ z) ⇔ (x → (y ↔ z)) ∧ ((y ↔ z) → x)
            ⇔ (x → ((y → z) ∧ (z → y))) ∧ ((y ↔ z) → x)
            ⇔ (x → (y → z)) ∧ (x → (z → y)) ∧ ((y ↔ z) → x)
            ⇔ (¬x ∨ ¬y ∨ z) ∧ (¬x ∨ ¬z ∨ y) ∧ ((y ↔ z) → x)
            ⇔ (¬x ∨ ¬y ∨ z) ∧ (¬x ∨ ¬z ∨ y) ∧ (((y ∧ z) ∨ (¬y ∧ ¬z)) → x)
            ⇔ (¬x ∨ ¬y ∨ z) ∧ (¬x ∨ ¬z ∨ y) ∧ ((y ∧ z) → x) ∧ ((¬y ∧ ¬z) → x)
            ⇔ (¬x ∨ ¬y ∨ z) ∧ (¬x ∨ ¬z ∨ y) ∧ (¬y ∨ ¬z ∨ x) ∧ (y ∨ z ∨ x)

8. Optimizations for the Tseitin Transformation (Encoding 28)

- goal is a smaller CNF: fewer variables, fewer clauses, so easier to solve (?!)
- extract multi-argument operands to remove variables for intermediate nodes
- half of the AND, OR node constraints/clauses can be removed for unnegated nodes [PlaistedGreenbaum'86]
  - a node occurs negated if it has an ancestor which is a negation
  - half of the constraints determine the parent assignment from the child assignments
  - those are unnecessary if the node is not used negated
  - this has to be applied carefully to DAG structure
- further structural circuit optimizations ...

9. CNF-Level Blocked Clause Elimination (Encoding 29) [JärvisaloBiereHeule-TACAS'10]

CNF-level Blocked Clause Elimination (BCE) simulates many encoding / circuit optimizations.

(figure: lattice relating CNF-level simplifications BCE, VE (variable elimination), PL, and their combinations such as BCE+VE and [BCE+VE](PG), to circuit-level simplifications COI, MIR, NSI and the Plaisted-Greenbaum (PG) and Tseitin (TST) encodings)

10. Intermediate Representations (Encoding 30)

- encoding directly into CNF is hard, so we use intermediate levels:
  1. application level
  2. bit-precise semantics, word-level operations: bit-vector theory
  3. bit-level representations such as AIGs or vectors of AIGs
  4. CNF
- encoding application-level formulas into word-level: as generating machine code
- word-level to bit-level: "bit-blasting", similar to hardware synthesis
- encoding "logical" constraints is another story

11. Bit-Blasting of 4-Bit Addition (Encoding 31)

Addition of 4-bit numbers x, y with result s also 4-bit: s = x + y

[s3, s2, s1, s0]_4 = [x3, x2, x1, x0]_4 + [y3, y2, y1, y0]_4

[s3,  ·]_2 = FullAdder(x3, y3, c2)
[s2, c2]_2 = FullAdder(x2, y2, c1)
[s1, c1]_2 = FullAdder(x1, y1, c0)
[s0, c0]_2 = FullAdder(x0, y0, false)

where [s, o]_2 = FullAdder(x, y, i) with

s = x xor y xor i
o = (x ∧ y) ∨ (x ∧ i) ∨ (y ∧ i) = ((x + y + i) ≥ 2)

12. And-Inverter Graphs (AIGs) (Encoding 32)

- widely adopted bit-level intermediate representation
  - see for instance our AIGER format http://fmv.jku.at/aiger
  - used in the Hardware Model Checking Competition (HWMCC)
  - also used in the structural track of the SAT competitions
  - many companies use similar techniques
- basic logical operators: conjunction and negation
- DAGs: nodes are conjunctions, negation/sign is an edge attribute
  - bit-stuffing: signs are compactly stored as the LSB of a pointer
- automatic sharing of isomorphic graphs, constant-time (peephole) simplifications
- or even SAT sweeping, full reduction, etc.; see the ABC system from Berkeley

13. XOR as AIG (Encoding 33)

(figure: AIG for x xor y; negation/sign is an edge attribute, not part of the node)

x xor y ≡ (x ∧ ¬y) ∨ (¬x ∧ y) ≡ ¬(¬(x ∧ ¬y) ∧ ¬(¬x ∧ y))

14. Bit-Stuffing Techniques for AIGs in C (Encoding 34)

typedef struct AIG AIG;

struct AIG {
  enum Tag tag;       /* AND, VAR */
  void *data[2];
  int mark, level;    /* traversal */
  AIG *next;          /* hash collision chain */
};

#define sign_aig(aig)  (1 & (unsigned) aig)
#define not_aig(aig)   ((AIG*)(1 ^ (unsigned) aig))
#define strip_aig(aig) ((AIG*)(~1 & (unsigned) aig))

#define false_aig ((AIG*) 0)
#define true_aig  ((AIG*) 1)

assumption for correctness: sizeof(unsigned) == sizeof(void*)

15. (figure: AIG of an 8-bit adder circuit with inputs 1[0..7], 2[0..7] and outputs O0..O7)

16. (figure: AIG for a bit-vector of length 16 shifted by a bit-vector of length 4)

17. (figure: another large AIG over 8-bit inputs 1[0..7], 2[0..7] and outputs O0..O7)

18. Encoding Logical Constraints (Encoding 38)

- Tseitin's construction is suitable for most kinds of "model constraints"
  - assuming simple operational semantics: encode an interpreter
  - small domains: one-hot encoding; large domains: binary encoding
- harder to encode are properties or additional constraints
  - temporal logic / fix-points
  - environment constraints
- example for fix-points / recursive equations: x = (a ∨ y), y = (b ∨ x)
  - has the unique least fix-point x = y = (a ∨ b)
  - and the unique largest fix-point x = y = true
  - but unfortunately only the largest fix-point can be (directly) encoded in SAT; otherwise one needs ASP

19. Example of Logical Constraints: Cardinality Constraints (Encoding 39)

- given a set of literals {l1, ..., ln}
  - constrain the number of literals assigned to true
  - |{l1, ..., ln}| ≥ k or |{l1, ..., ln}| ≤ k or |{l1, ..., ln}| = k
- multiple encodings of cardinality constraints
  - naïve encoding exponential: at-most-two quadratic, at-most-three cubic, etc.
  - quadratic O(k · n) encoding goes back to Shannon
  - linear O(n) parallel counter encoding [Sinz'05]
  - for an O(n · log n) encoding see Prestwich's chapter in our Handbook of SAT
- generalization: Pseudo-Boolean constraints (PB), e.g. 2·a + b + c + d + 2·e ≥ 3; actually used to handle MaxSAT in SAT4J for configuration in Eclipse

20. BDD-based Encoding of Cardinality Constraints (Encoding 40)

2 ≤ |{l1, ..., l9}| ≤ 3

(figure: BDD over l1, ..., l9 representing this constraint; "then" edges point downward, "else" edges to the right)

21. Example 2 with the PicoSAT API (PicoSAT API 41)

// compile with: gcc -o ex2 ex2.c picosat.o
#include "picosat.h"
#include <stdio.h>
#include <assert.h>

int main () {
  int a, b;
  picosat_init ();
  picosat_add (1); picosat_add (2); picosat_add (0);
  picosat_add (-1); picosat_add (2); picosat_add (0);
  picosat_add (-2); picosat_add (1); picosat_add (0);
  assert (picosat_sat (-1) == 10);    // SATISFIABLE
  a = picosat_deref (1);
  b = picosat_deref (2);
  printf ("v %d %d\n", a*1, b*2);
  picosat_assume (-a*1);
  assert (picosat_sat (-1) == 20);    // UNSAT
  picosat_assume (-b*2);
  assert (picosat_sat (-1) == 20);    // UNSAT
  picosat_reset ();
  return 0;
}

22. Adding a Blocking Clause to Block the Current Solution (PicoSAT API 42)

#include <stdlib.h>   /* malloc, free */
#include <string.h>   /* memset */

static void block_current_solution (void) {
  int max_idx = picosat_variables (), i;
  signed char *sol;

  // since 'picosat_add' resets solutions
  // we need to store the solution first:
  sol = malloc (max_idx + 1);
  memset (sol, 0, max_idx + 1);

  for (i = 1; i <= max_idx; i++)
    sol[i] = (picosat_deref (i) > 0) ? 1 : -1;

  for (i = 1; i <= max_idx; i++)
    picosat_add ((sol[i] < 0) ? i : -i);
  picosat_add (0);

  free (sol);
}

23. Simplified PicoSAT API (PicoSAT API 43)

(figure: state diagram over the states RESET, READY, SAT and UNSAT, with transitions labeled picosat_init, picosat_add, picosat_assume, picosat_sat, picosat_deref, picosat_failed_assumption, picosat_set..., picosat_inconsistent, picosat_deref_toplevel, and picosat_reset)

24. Failed Assumptions (PicoSAT API 44)

- two ways to implement incremental SAT solvers
  - push/pop as in SMT solvers: partial support in SATIRE, zChaff, PicoSAT
    * clauses are associated with a context and pushed/popped in a stack-like manner
    * pop discards the clauses of the current context
  - most common: assumptions [ClaessenSörensson'03] [EénSörensson'03]
    * allows to use a set of literals as assumptions
    * forces the SAT solver to pick the assumptions as its first decisions
    * more flexible, since assumptions can be reused
    * assumptions are only valid for the next SAT call
- failed assumptions: a subset of the assumptions inconsistent with the CNF

25. Example: Bit-Vector Under-Approximation (PicoSAT API 45)

- goal: reduce the size of bit-vector constants in satisfying assignments
- refinement approach: for each bit-vector variable use only an "effective width"
  - example: for a 4-bit vector [x3, x2, x1, x0] and effective width 2 use [x1, x1, x1, x0]
  - either (1) encode from scratch with x3 and x2 replaced by x1
  - or (2) add x3 = x1 and x2 = x1 after push
  - or (3) add a_x^2 → x3 = x1 and a_x^2 → x2 = x1 and assume the fresh literal a_x^2
- if satisfiable, then a solution with small constants has been found; otherwise increase the effective width of those bit-vectors where it was used to derive UNSAT ("used" = "failed assumption"); if no under-approximation was used, then the formula is UNSAT
- in (3) constraints are removed by forcing assumptions to the opposite value, by adding a unit clause, e.g. ¬a_x^2, in the next iteration

26. Boolector: Lemmas on Demand + Under-Approximation (PicoSAT API 46) [BrummayerBiere'09]

(flowchart: the array formula is over-approximated and encoded to CNF; under-approximation clauses C are added and the SAT solver is called; if UNSAT and C was used, the under-approximation is refined and the solver is called again; if UNSAT and C was not used, the formula is unsatisfiable; if SAT and the model is spurious, a lemma is added and the over-approximation refined; otherwise the formula is satisfiable)

27. Clausal Cores (PicoSAT API 47)

- clausal core (or unsatisfiable subset) of an unsatisfiable formula
  - clauses used to derive the empty clause
  - may include not only original but also learned clauses
  - similar application as in the previous under-approximation example
  - but also useful for diagnosis of inconsistencies
- variable core: the subset of variables occurring in clauses of a clausal core
- these cores are neither unique nor necessarily minimal
- minimal unsatisfiable subset (MUS) = clausal core in which no clause can be removed

28. PicoMUS (PicoSAT API 48)

- PicoMUS is a MUS extractor based on PicoSAT
  - uses several rounds of clausal core extraction for preprocessing
  - then switches to assumption-based core minimization using picosat_failed_assumptions
  - its source code serves as a good example of how to use cores / assumptions
- new MUS track in the SAT 2011 competition, with high- and low-level MUS sub-tracks

29. Examples for Core and MUS Extraction (PicoSAT API 49)

c ex3.cnf
p cnf 6 10
1 2 3 0
1 2 -3 0
1 -2 3 0
1 -2 -3 0
4 5 6 0
4 5 -6 0
4 -5 6 0
4 -5 -6 0
-1 -4 0
1 4 0

$ picosat ex3.cnf -c core
s UNSATISFIABLE
$ cat core
p cnf 6 9
2 3 1 0
2 -3 1 0
-2 3 1 0
-2 -3 1 0
6 5 4 0
5 -6 4 0
6 4 -5 0
4 -6 -5 0
-1 -4 0

c ex4.cnf
p cnf 6 11
1 2 3 0
1 2 -3 0
1 -2 3 0
1 -2 -3 0
4 5 6 0
4 5 -6 0
4 -5 6 0
4 -5 -6 0
-1 -4 0
-1 4 0
-1 -4 0

$ picomus ex4.cnf mus
s UNSATISFIABLE
$ cat mus
p cnf 6 6
1 2 3 0
1 2 -3 0
1 -2 3 0
1 -2 -3 0
-1 4 0
-1 -4 0

30. Proof Traces and TraceCheck (PicoSAT API 50)

- core extraction in PicoSAT is based on tracing proofs
  - enabled by picosat_enable_trace_generation
  - maintains a "dependency graph" of learned clauses
  - kept in memory, so core generation is fast
- traces can also be written to disk in various formats
  - RUP format by Allen Van Gelder (SAT competition)
  - or the format of the TraceCheck tool
- TraceCheck can check traces for correctness
  - orders clauses and antecedents to generate and check a resolution proof
  - (binary) resolution proofs can be dumped

31. QDIMACS Example (PicoSAT API 51)

Same as DIMACS, except that we have additional quantifier lines:

c SAT
p cnf 3 4
a 1 0
e 2 3 0
-1 -2 3 0
-1 2 -3 0
1 2 3 0
1 -2 -3 0

c UNSAT
p cnf 4 8
a 1 2 0
e 3 4 0
-1 -3 4 0
-1 3 -4 0
1 3 4 0
1 -3 -4 0
-2 -3 4 0
-2 3 -4 0
2 3 4 0
2 -3 -4 0

32. DepQBF API (PicoSAT API 52)

/* Create and initialize solver instance. */
QDPLL *qdpll_create (void);

/* Delete and release all memory of solver instance. */
void qdpll_delete (QDPLL * qdpll);

/* Ensure var table size to be at least 'num'. */
void qdpll_adjust_vars (QDPLL * qdpll, VarID num);

/* Open a new scope, where variables can be added by 'qdpll_add'.
   Returns nesting of new scope. Opened scope can be closed by adding
   '0' via 'qdpll_add'. NOTE: will fail if there is an opened scope
   already. */
unsigned int qdpll_new_scope (QDPLL * qdpll, QDPLLQuantifierType qtype);

/* Add variables or literals to clause or opened scope. If scope is
   opened, then 'id' is interpreted as a variable ID, otherwise 'id'
   is interpreted as a literal. NOTE: will fail if a scope is opened
   and 'id' is negative. */
void qdpll_add (QDPLL * qdpll, LitID id);

/* Decide formula. */
QDPLLResult qdpll_sat (QDPLL * qdpll);

33. DP / DPLL (search 53)

- dates back to the 1950s:
  - 1st version, DP, is resolution based ⇒ SatELite preprocessor [EénBiere'05]
  - 2nd version, D(P)LL, splits space for time ⇒ CDCL
- ideas:
  - 1st version: eliminate the two cases of assigning a variable in space
  - 2nd version: case analysis in time, e.g. try x = 0, 1 in turn and recurse
- most successful SAT solvers are based on a variant (CDCL) of the second version; works for very large instances
- recent (≤ 15 years) optimizations: backjumping, learning, UIPs, dynamic splitting heuristics, fast data structures (we will have a look at each of them)

34. DP Procedure (search 54) [DavisPutnam'61]

forever
  if F = ⊤ return satisfiable
  if ⊥ ∈ F return unsatisfiable
  pick remaining variable x
  add all resolvents on x
  remove all clauses with x and ¬x

⇒ SatELite preprocessor [EénBiere'05]

35. D(P)LL Procedure (search 55) [DavisLogemannLoveland'62]

DPLL(F)
  F := BCP(F)                          (boolean constraint propagation)
  if F = ⊤ return satisfiable
  if ⊥ ∈ F return unsatisfiable
  pick remaining variable x and literal l ∈ {x, ¬x}
  if DPLL(F ∧ {l}) returns satisfiable return satisfiable
  return DPLL(F ∧ {¬l})

⇒ CDCL

36. DPLL Example (search 56) [DavisLogemannLoveland'62]

(figure: DPLL search tree over clauses in a, b, c; first decision a = 1, then decision b = 1, then BCP forces c = 0; branches are closed by the clauses they falsify)

37. Simple Data Structures in a DPLL Implementation (search 57) [DavisLogemannLoveland'62]

(figure: each variable points to the clauses it occurs in; clauses are stored as literal lists)

38. BCP Example (search 58) [DavisLogemannLoveland'62]

(figure: initial state at decision level 0; trail and control stack empty, variables 1..5 unassigned, clauses -1 2, -2 3, -4 5)

39. Example cont. (search 59): Decide opens decision level 1; the control stack records the current trail height.

40. Example cont. (search 60): Assign puts variable 1 = 1 on the trail at decision level 1.

41. Example cont. (search 61): BCP uses clauses -1 2 and -2 3 to imply 2 = 1 and 3 = 1, which are appended to the trail.

42. Example cont. (search 62): a second Decide opens decision level 2.

43. Example cont. (search 63): Assign puts variable 4 = 1 on the trail at decision level 2.

44. Example cont. (search 64): BCP uses clause -4 5 to imply 5 = 1; all variables are assigned without conflict.

45. Conflict Driven Clause Learning (CDCL) (search 65), Grasp [MarquesSilvaSakallah'96]

(figure: as in the DPLL example, decisions a = 1 and b = 1 followed by BCP lead to a conflict; instead of backtracking chronologically, a clause over a and b is learned)

46. Conflict Driven Clause Learning (CDCL) (search 66), Grasp [MarquesSilvaSakallah'96]

(figure: after learning, BCP forces b = 0 under a = 1; a further conflict under c = 0 yields a learned unit clause over a)

47. Conflict Driven Clause Learning (CDCL) (search 67), Grasp [MarquesSilvaSakallah'96]

(figure: BCP on the learned unit clause now fixes a; the next decision and BCP lead to another conflict, from which a unit clause over c is learned)

48. Conflict Driven Clause Learning (CDCL) (search 68), Grasp [MarquesSilvaSakallah'96]

(figure: BCP alone now derives a conflict at decision level 0; the learned empty clause shows unsatisfiability)

49. Decision Heuristics (search 69)

- static heuristics:
  - one linear order determined before the solver is started
  - usually quite fast to compute, since only calculated once
  - and thus can also use more expensive algorithms
- dynamic heuristics:
  - typically calculated from the number of occurrences of literals (in unsatisfied clauses)
  - can be rather expensive, since it requires traversal of all clauses (or more expensive updates in BCP)
  - effective second-order dynamic heuristics exist (e.g. VSIDS in Chaff)

50. Other Popular Decision Heuristics (search 70)

- Dynamic Largest Individual Sum (DLIS)
  - fastest dynamic first-order heuristic (e.g. in the GRASP solver)
  - choose the literal (variable + phase) which occurs most often (ignoring satisfied clauses)
  - requires explicit traversal of the CNF (or more expensive BCP)
- look-ahead heuristics (e.g. SATZ or MARCH solvers): failed literals, probing
  - trial assignments and BCP for all/some unassigned variables (both phases)
  - if BCP leads to a conflict, enforce the toggled assignment of the current trial decision
  - optionally learn binary clauses and perform equivalent-literal substitution
  - decision: most balanced w.r.t. propagated assignments / satisfied clauses / reduced clauses
  - related to our recent Cube & Conquer paper [HeuleKullmannWieringaBiere'11]

51. Exponential VSIDS (EVSIDS) (search 71), Chaff [MoskewiczMadiganZhaoZhangMalik'01]

- increment the score of involved variables by 1
- decay the scores of all variables every 256th conflict by halving them
- sort the priority queue after the decay, not at every conflict

MiniSAT uses EVSIDS [EénSörensson'03/'06]:

- update the score of involved variables as LIS would also do
- dynamically adjust the increment: δ' = δ · 1/f, i.e. typically increase the increment by 5% to 11%
- use a floating point representation of the score
- "rescore" in regular intervals to avoid overflow
- EVSIDS is linearly related to NVSIDS

52. Normalized VSIDS (NVSIDS) (search 72)

- the VSIDS score can be normalized to the interval [0, 1] as follows:
  - pick a decay factor f per conflict, typically f = 0.9
  - every variable score is multiplied by this decay factor at every conflict
  - if a variable is involved in the conflict, additionally add 1 - f to its score
  - with s the score of one fixed variable before the conflict and s' the score after, s' = s · f + (1 - f) if the variable is involved, and s' = s · f otherwise; since s ≤ 1 and f ≤ 1, we get s · f + 1 - f ≤ f + 1 - f = 1, so scores stay in [0, 1]
- recomputing the scores of all variables at each conflict is costly
  - linear in the number of variables, e.g. millions
  - particularly because the number of involved variables is much smaller than the number of variables

53. Relating EVSIDS and NVSIDS (search 73)

Consider again only one variable, with score sequences s_n (NVSIDS) resp. S_n (EVSIDS):

δ_k = 1 if the variable is involved in the k-th conflict, 0 otherwise
i_k = (1 - f) · δ_k

s_n = (...((i_1 · f + i_2) · f + i_3) · f ...) · f + i_n
    = Σ_{k=1..n} i_k · f^(n-k)
    = (1 - f) · Σ_{k=1..n} δ_k · f^(n-k)          (NVSIDS)

S_n = f^(-n)/(1 - f) · s_n
    = f^(-n) · Σ_{k=1..n} δ_k · f^(n-k)
    = Σ_{k=1..n} δ_k · f^(-k)                     (EVSIDS)

54. BerkMin's Dynamic Second-Order Heuristics (search 74) [GoldbergNovikov-DATE'02]

- observation:
  - recently added conflict clauses contain all the good variables of VSIDS
  - the order of those clauses is not used in VSIDS
- basic idea:
  - simply try to satisfy recently learned clauses first
  - use VSIDS to choose the decision variable within one clause
  - if all learned clauses are satisfied, use other heuristics
- mixed results, as for other variants VMTF, CMTF (variable / clause move-to-front)

55. Restarts (search 75)

- for satisfiable instances the solver may get stuck in an unsatisfiable part of the search space
  - even if the search space contains a large satisfiable part
- often it is a good strategy to abandon the current search and restart
  - restart after the number of decisions reaches a restart limit
- avoid running into the same dead end
  - by randomization (either on the decision variable or its phase)
  - and/or by just keeping all the learned clauses
- for completeness, dynamically increase the restart limit

56. Luby's Restart Intervals (search 76)

(plot: restart interval against restart number; 70 restarts in 104448 conflicts, with intervals following the Luby pattern)

57. Luby Restart Scheduling (search 77)

unsigned luby (unsigned i) {
  unsigned k;
  for (k = 1; k < 32; k++)
    if (i == (1u << k) - 1)
      return 1u << (k - 1);
  for (k = 1;; k++)
    if ((1u << (k - 1)) <= i && i < (1u << k) - 1)
      return luby (i - (1u << (k - 1)) + 1);
}

limit = 512 * luby (++restarts);
... // run SAT core loop for 'limit' conflicts

58. Reluctant Doubling Sequence (search 78) [Knuth'12]

(u_1, v_1) := (1, 1)
(u_{n+1}, v_{n+1}) := (u_n & -u_n == v_n) ? (u_n + 1, 1) : (u_n, 2·v_n)

(1,1), (2,1), (2,2), (3,1), (4,1), (4,2), (4,4), (5,1), ...

59. Phase Saving and Rapid Restarts (search 79)

- phase assignment / direction heuristics:
  - assign the decision variable to 0 or 1?
  - the only thing that matters in satisfiable instances
- "phase saving" as in RSat:
  - pick the phase of the last assignment (if not forced to, do not toggle assignments)
  - initially use a statically computed phase (typically LIS)
  - so it can be seen as maintaining a global full assignment
- rapid restarts: varying restart interval with bursts of restarts
  - not only theoretically avoids local minima
  - empirically works nicely together with phase saving

60. Reusing the Trail (search 80) [VanDerTakHeuleRamos-POS'11]

- in general, restarting does not change much, since phases and scores are saved
- the assignment after the restart can only differ if, before restarting, there is a decision literal d assigned on the trail with a smaller score than the next decision n on the priority queue
- in this situation backtrack only to the decision level of d
  - simple to compute, particularly if decisions are saved separately
  - allows to skip many redundant backtracks
  - allows a much higher restart frequency, e.g. base interval 10 for the reluctant doubling sequence (Luby)

61. Reducing Learned Clauses (search 81)

- keeping all learned clauses slows down BCP kind of quadratically
  - so SATO and RelSAT just kept only "short" clauses
- better: periodically delete "useless" learned clauses
  - keep a certain number of learned clauses as a "search cache"
  - if this number is reached, MiniSAT reduces (deletes) half of the clauses
  - keep most active, then shortest, then youngest (FIFO) clauses
  - after each reduction, the maximum number of kept learned clauses is increased geometrically
- LBD ("Glue") based (a priori!) prediction of usefulness [AudemardSimon'09]
  - LBD (Glue) = number of decision levels in the learned clause
  - allows an arithmetic increase of the number of kept learned clauses
- freeze clauses with high PSM (distance to the phase assignment) [AudemardLagniezMazureSais'11]

62. General Implication Graph as Hyper-Graph (search 82), CDCL / Grasp [MarquesSilvaSakallah'96]

(figure: a hyper-edge connects the original assignments a and b through their reason clause to the implied assignment c)

63. Implication Graph Standard Notation (search 83), CDCL / Grasp [MarquesSilvaSakallah'96]

(figure: the hyper-edge is drawn as ordinary edges from the original assignments a and b to c, with the reason clause associated to the implied assignment c)

64. Conflict Clauses as Cuts in the Implication Graph (search 84), CDCL / Grasp [MarquesSilvaSakallah'96]

(figure: implication graph spanning decision levels n-2, n-1 and n, from the decisions to the conflict)

A simple cut always exists: the set of roots (decisions) contributing to the conflict.

65. Implication Graph (search 85), CDCL / Grasp [MarquesSilvaSakallah'96]

(implication graph:)
a = 1 @ 0, b = 1 @ 0 (top-level units)
c = 1 @ 1 (decision), d = 1 @ 1, e = 1 @ 1
f = 1 @ 2 (decision), g = 1 @ 2, h = 1 @ 2, i = 1 @ 2
k = 1 @ 3 (decision), l = 1 @ 3
r = 1 @ 4 (decision), s = 1 @ 4, t = 1 @ 4, x = 1 @ 4, y = 1 @ 4, z = 1 @ 4
conflict κ

66. Antecedents / Reasons (search 86), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

d ∧ g ∧ s → t ≡ (¬d ∨ ¬g ∨ ¬s ∨ t)

67. Conflicting Clauses (search 87), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

¬(y ∧ z) ≡ (¬y ∨ ¬z)

68. Resolving Antecedents the 1st Time (search 88), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

resolve the reason (¬h ∨ ¬i ∨ ¬t ∨ y) of y with the conflicting clause (¬y ∨ ¬z)

69. Resolving Antecedents the 1st Time (search 89), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

(¬h ∨ ¬i ∨ ¬t ∨ y) resolved with (¬y ∨ ¬z) yields (¬h ∨ ¬i ∨ ¬t ∨ ¬z)

70. Resolvents = Cuts = Potential Learned Clauses (search 90), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph, shown twice: once with the cut corresponding to the clauses (¬h ∨ ¬i ∨ ¬t ∨ y) and (¬y ∨ ¬z), once with the cut corresponding to their resolvent (¬h ∨ ¬i ∨ ¬t ∨ ¬z))

71. Potential Learned Clause After 1 Resolution (search 91), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

(¬h ∨ ¬i ∨ ¬t ∨ ¬z)

72. Resolving Antecedents the 2nd Time (search 92), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

(¬d ∨ ¬g ∨ ¬s ∨ t) resolved with (¬h ∨ ¬i ∨ ¬t ∨ ¬z) yields (¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i ∨ ¬z)

73. Resolving Antecedents the 3rd Time (search 93), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

(¬x ∨ z) resolved with (¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i ∨ ¬z) yields (¬x ∨ ¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i)

74. Resolving Antecedents the 4th Time (search 94), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph as before)

(¬s ∨ x) resolved with (¬x ∨ ¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i) yields (¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i): self-subsuming resolution

75. 1st UIP Clause after 4 Resolutions (search 95), CDCL / Grasp [MarquesSilvaSakallah'96]

(same implication graph; s is the 1st UIP on the last level, the backjump level is marked)

(¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i)

UIP = unique implication point: it dominates the conflict on the last level.

76. Simple Algorithm to Find the First UIP Clause (search 96), CDCL / Grasp [MarquesSilvaSakallah'96]

- the 1st UIP can be found by graph traversal in reverse order of the made assignments
  - the trail respects this order
  - mark the literals in the conflict
  - traverse the reasons of marked variables on the trail in reverse order
- count the number of unresolved marked variables on the current decision level
  - decrease the counter whenever a new reason / antecedent clause is resolved
  - if the counter is 1 (only one unresolved marked variable left), this node is a UIP
  - note: the decision of the current decision level is always a UIP and thus a sentinel

77. Modern CDCL Loop (search 97), actual Cleaneling code

Status Solver::search (long limit) {
  long conflicts = 0;
  Clause *conflict;
  Status res = UNKNOWN;
  while (!res)
    if (empty) res = UNSATISFIABLE;
    else if ((conflict = bcp ())) analyze (conflict), conflicts++;
    else if (conflicts >= limit) break;
    else if (reducing ()) reduce ();
    else if (restarting ()) restart ();
    else if (!decide ()) res = SATISFIABLE;
  return res;
}

Status Solver::solve () {
  long conflicts = 0, steps = 1e6;
  Status res;
  for (;;)
    if ((res = search (conflicts))) break;
    else if ((res = simplify (steps))) break;
    else conflicts += 1e4, steps += 1e6;
  return res;
}

78. Resolving Antecedents the 5th Time (search 98)

(same implication graph as before)

(¬l ∨ ¬r ∨ s) resolved with (¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i) yields (¬l ∨ ¬r ∨ ¬d ∨ ¬g ∨ ¬h ∨ ¬i)

79. Decision Learned Clause (search 99)

(same implication graph; the decision r is the last UIP, the backtrack level is marked)

(¬d ∨ ¬g ∨ ¬l ∨ ¬r ∨ ¬h ∨ ¬i)

80. 1st UIP Clause after 4 Resolutions (search 100)

(same implication graph as before)

(¬d ∨ ¬g ∨ ¬s ∨ ¬h ∨ ¬i)
