
Conflict-Driven Clause Learning
Current Trends in AI, Bart Bogaerts, March 13, 2020

THANKS: many slides provided by Joao Marques-Silva; material from a SAT/SMT summer school: http://satsmt2013.ics.aalto.fi/slides/Marques-Silva.pdf


CLAUSE LEARNING

Clauses: (ā ∨ b̄), (z̄ ∨ b), (x̄ ∨ z̄ ∨ a)

Level | Decision | Unit Prop.
------|----------|-----------
  0   |    ∅     |
  1   |    x     |
  2   |    y     |
  3   |    z     | a, b, ⊥

◮ Analyze conflict
◮ Reasons: x and z
  ◮ Decision variable & literals assigned at lower decision levels
◮ Create new clause: (x̄ ∨ z̄)
◮ Can relate clause learning with resolution
  ◮ Learned clauses result from (selected) resolution operations
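The resolution steps behind the learned clause can be sketched in a few lines of Python. This is a minimal illustration (not the slides' solver); the DIMACS-style integer encoding of literals is an assumption made here:

```python
def resolve(c1, c2, var):
    """Resolve clauses c1 and c2 on variable var (drops var and -var)."""
    assert (var in c1 and -var in c2) or (-var in c1 and var in c2)
    return sorted(set(l for l in c1 + c2 if abs(l) != var))

# Variables: x=1, y=2, z=3, a=4, b=5; a negative integer is a negated literal.
conflict = [-4, -5]      # (ā ∨ b̄), the falsified clause
reason_b = [-3, 5]       # (z̄ ∨ b), the clause that propagated b
reason_a = [-1, -3, 4]   # (x̄ ∨ z̄ ∨ a), the clause that propagated a

step1 = resolve(conflict, reason_b, 5)   # (ā ∨ z̄)
step2 = resolve(step1, reason_a, 4)      # (x̄ ∨ z̄), the learned clause
print(step2)
```

Resolving the conflict clause with the reasons of the propagated literals, newest first, reproduces exactly the learned clause (x̄ ∨ z̄) from the slide.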

CLAUSE LEARNING – AFTER BACKTRACKING

After backtracking to decision level 1, the learned clause (x̄ ∨ z̄) propagates z̄:

Level | Decision | Unit Prop.
------|----------|-----------
  0   |    ∅     |
  1   |    x     | z̄

◮ Clause (x̄ ∨ z̄) is asserting at decision level 1
◮ Learned clauses are always asserting [MSS96,MSS99]
◮ Backtracking differs from plain DPLL:
  ◮ Always backtrack after a conflict [MMZZM01]

UNIQUE IMPLICATION POINTS (UIPS)

Clauses: (b̄ ∨ c̄), (w̄ ∨ ā ∨ c), (x̄ ∨ ā ∨ b), (ȳ ∨ z̄ ∨ a)

Level | Decision | Unit Prop.
------|----------|-----------
  0   |    ∅     |
  1   |    w     |
  2   |    x     |
  3   |    y     |
  4   |    z     | a, c, b, ⊥

◮ Learn clause (w̄ ∨ x̄ ∨ ȳ ∨ z̄)
◮ But a is a UIP
◮ Learn clause (w̄ ∨ x̄ ∨ ā)

MULTIPLE UIPS

Level | Decision | Unit Prop.
------|----------|-----------
  0   |    ∅     |
  1   |    w     |
  2   |    x     |
  3   |    y     |
  4   |    z     | r, a, c, s, b, ⊥

◮ First UIP:
  ◮ Learn clause (w̄ ∨ ȳ ∨ ā)
◮ But there can be more than 1 UIP
◮ Second UIP:
  ◮ Learn clause (x̄ ∨ z̄ ∨ a)
◮ In practice smaller clauses are more effective
  ◮ Compare with (w̄ ∨ x̄ ∨ ȳ ∨ z̄)
◮ Multiple UIPs proposed in GRASP [MSS96]
◮ First-UIP learning proposed in Chaff [MMZZM01]
  ◮ Multiple-UIP learning not used in recent state-of-the-art CDCL SAT solvers
  ◮ Recent results show it can be beneficial on current instances [SSS12]
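The first-UIP scheme stops conflict analysis as soon as exactly one literal of the conflict level remains in the clause. Below is a hedged sketch of that loop (illustrative data structures, not a real solver; it assumes the trail stores assigned literals and reasons are indexed by variable), run on the earlier clause-learning example where the first UIP is the decision z itself:

```python
def first_uip(conflict, trail, level, reason, conflict_level):
    """Resolve the conflict clause with reasons of conflict-level literals,
    newest first, until exactly one conflict-level literal remains."""
    clause = set(conflict)
    for lit in reversed(trail):                 # assigned literals, newest first
        if len([l for l in clause if level[abs(l)] == conflict_level]) == 1:
            break                               # single conflict-level literal: the first UIP
        r = reason[abs(lit)]
        if -lit in clause and r is not None:    # resolve out this propagated literal
            clause |= set(r)
            clause.discard(lit)
            clause.discard(-lit)
    return sorted(clause)

# Variables: x=1, y=2, z=3, a=4, b=5 (illustrative, matching the earlier example)
trail  = [1, 2, 3, 4, 5]                        # x@1, y@2, z@3, then a, b propagated @3
level  = {1: 1, 2: 2, 3: 3, 4: 3, 5: 3}
reason = {1: None, 2: None, 3: None, 4: [-1, -3, 4], 5: [-3, 5]}
learned = first_uip([-4, -5], trail, level, reason, conflict_level=3)
print(learned)   # the asserting clause (x̄ ∨ z̄)
```

Decisions have no reason (None), so analysis can never resolve past the conflict-level decision; the loop therefore always terminates with an asserting clause.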

CLAUSE MINIMIZATION I

Clauses: (ā ∨ c̄), (b̄ ∨ c), (x̄ ∨ ȳ ∨ z̄ ∨ a), (x̄ ∨ b)

Level | Decision | Unit Prop.
------|----------|-----------
  0   |    ∅     |
  1   |    x     | b
  2   |    y     |
  3   |    z     | a, c, ⊥

◮ Learn clause (x̄ ∨ ȳ ∨ z̄ ∨ b̄)
◮ Apply self-subsuming resolution (i.e. local minimization) [SB09]:
  ◮ resolving with the reason (x̄ ∨ b) of b yields (x̄ ∨ ȳ ∨ z̄), which subsumes the learned clause
◮ Learn clause (x̄ ∨ ȳ ∨ z̄)
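Self-subsuming resolution keeps the resolvent only when it is a subset of the learned clause, so a literal is simply dropped. A minimal sketch under the same integer literal encoding as before (illustrative, not the slides' implementation):

```python
def self_subsume(learned, reason, var):
    """Resolve the learned clause with a reason on var; keep the resolvent
    only if it subsumes (is a subset of) the learned clause."""
    resolvent = set(l for l in learned + reason if abs(l) != var)
    if resolvent <= set(learned):
        return sorted(resolvent)       # one literal dropped
    return sorted(set(learned))        # not self-subsuming: keep as is

# Variables: x=1, y=2, z=3, b=5 (illustrative numbering)
learned  = [-1, -2, -3, -5]    # (x̄ ∨ ȳ ∨ z̄ ∨ b̄)
reason_b = [-1, 5]             # (x̄ ∨ b), the reason of b
print(self_subsume(learned, reason_b, 5))   # (x̄ ∨ ȳ ∨ z̄)
```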

CLAUSE MINIMIZATION II

Level | Decision | Unit Prop.
------|----------|-----------
  0   |    ∅     |
  1   |    w     | a, b, c
  2   |    x     | d, e, ⊥

◮ Learn clause (w̄ ∨ x̄ ∨ c̄)
◮ Cannot apply self-subsuming resolution:
  ◮ resolving with the reason of c yields (w̄ ∨ x̄ ∨ ā ∨ b̄)
◮ Can apply recursive minimization [SB09]:
  ◮ Marked nodes: literals in the learned clause
  ◮ Trace back from c until marked nodes, new nodes, or decisions are reached
  ◮ Drop the literal if only marked nodes were visited
◮ Learn clause (w̄ ∨ x̄)
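The recursive check above can be sketched as a memoized graph traversal. This is a hedged illustration (variable-level granularity, invented example data) of the idea, not a production implementation: a literal is redundant if every antecedent of its variable is already in the learned clause or is itself redundant.

```python
def removable(v, reason, marked_vars, memo=None):
    """Variable v (negated in the learned clause) is redundant if every
    antecedent variable in its reason is marked or itself removable."""
    if memo is None:
        memo = {}
    if v in memo:
        return memo[v]
    r = reason.get(v)
    if r is None:                  # decision variable: never removable
        memo[v] = False
        return False
    memo[v] = all(abs(l) in marked_vars
                  or removable(abs(l), reason, marked_vars, memo)
                  for l in r if abs(l) != v)
    return memo[v]

# Illustrative instance: learned clause (w̄ ∨ x̄ ∨ c̄);
# reason[a] = (w̄ ∨ a), reason[c] = (ā ∨ c).  w=1, x=2, a=3, c=4
reason = {1: None, 2: None, 3: [-1, 3], 4: [-3, 4]}
marked = {1, 2, 4}                 # variables of (w̄ ∨ x̄ ∨ c̄)
print(removable(4, reason, marked))   # c̄ can be dropped: learn (w̄ ∨ x̄)
```

The memo also prevents re-exploring shared antecedents, which is what makes the check cheap in practice.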

SEARCH RESTARTS I
◮ Heavy-tailed behavior [GSK98]
  ◮ 10000 runs, branching randomization on an industrial instance
◮ Use rapid randomized restarts (search restarts)

SEARCH RESTARTS II
◮ Restart search after a number of conflicts (the cutoff)
◮ Increase the cutoff after each restart
  ◮ Guarantees completeness
  ◮ Different policies exist (see refs)
◮ Works for SAT & UNSAT instances. Why?
  ◮ Learned clauses remain effective after restart(s)
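One widely used cutoff schedule (a common policy, not necessarily the one the slides have in mind) is the Luby sequence, ported here from the well-known iterative formulation:

```python
def luby(x):
    """x-th term (0-based) of the Luby sequence 1,1,2,1,1,2,4,1,1,2,..."""
    size, seq = 1, 0
    while size < x + 1:            # find the smallest full block covering x
        seq += 1
        size = 2 * size + 1
    while size - 1 != x:           # descend into the sub-block containing x
        size = (size - 1) // 2
        seq -= 1
        x = x % size
    return 2 ** seq

print([luby(x) for x in range(9)])   # [1, 1, 2, 1, 1, 2, 4, 1, 1]
```

A solver would then restart once the number of conflicts since the last restart reaches, e.g., `base * luby(restart_count)` for some base such as 100. The unbounded growth of the largest terms is what preserves completeness.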

DATA STRUCTURES BASICS
◮ Each literal l should access the clauses containing l
  ◮ Why? Unit propagation
◮ A clause with k literals results in k references, from literals to the clause
◮ The number of clause references equals the number of literals, L
◮ Clause learning can generate large clauses
  ◮ Worst-case size: O(n)
  ◮ Worst-case number of literals: O(m n)
◮ In practice, unit propagation slows down worse than linearly as clauses are learned!
◮ For clause learning to be effective, a more efficient representation is required: watched literals
◮ Watched literals are one example of lazy data structures
  ◮ But there are others

WATCHED LITERALS [MMZZM01]
◮ Important states of a clause: satisfied, unit, conflicting, unresolved
◮ Associate 2 references (watches) with each clause; only these trigger work
◮ Deciding that a clause is unit requires traversing all its literals
◮ References are unchanged when backtracking
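The core invariant can be sketched for a single clause. This is a hedged, simplified illustration (real solvers keep per-literal watcher lists and pass clause references around; names here are invented): when a watched literal becomes false, the solver searches for a replacement watch, and only if none exists does the clause become unit or conflicting.

```python
def value(lit, assign):
    """Truth value of a literal under a partial assignment (None = unassigned)."""
    v = assign[abs(lit)]
    return None if v is None else (v if lit > 0 else not v)

def on_false(clause, assign):
    """Handle clause[0] having just become false; positions 0 and 1 are the
    watches. Returns 'ok' (new watch found or clause satisfied), 'unit'
    (propagate clause[1]), or 'conflict'."""
    for i in range(2, len(clause)):
        if value(clause[i], assign) is not False:     # unassigned or true
            clause[0], clause[i] = clause[i], clause[0]
            return "ok"                               # watch moved, clause dormant again
    other = value(clause[1], assign)
    if other is True:
        return "ok"
    return "conflict" if other is False else "unit"

c = [-1, 3, 2]                         # (x̄1 ∨ x3 ∨ x2), watching x̄1 and x3
assign = {1: True, 2: None, 3: None}   # x1 := True falsifies the watch x̄1
print(on_false(c, assign))             # a replacement watch (x2) is found
assign[2] = False                      # now the watch x2 is falsified too
print(on_false(c, assign))             # clause is unit: x3 must be propagated
```

Because the watches are simply left in place on backtracking, undoing an assignment costs nothing for the clause database, which is the point of the lazy scheme.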

ADDITIONAL KEY TECHNIQUES
◮ Lightweight branching [e.g. MMZZM01]
  ◮ Use conflicts to bias which variables to branch on; associate a score with each variable
  ◮ Prefer recent conflicts by regularly decreasing variable scores
◮ Clause deletion policies
  ◮ Not practical to keep all learned clauses
  ◮ Delete less-used clauses [e.g. MSS96,GN02,ES03]
◮ Proven recent techniques:
  ◮ Phase saving [PD07]
  ◮ Literal blocks distance [AS09]
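The lightweight branching bullet describes VSIDS-style scoring. The sketch below shows one common variant (details differ per solver; the class and constants here are illustrative): bump variables seen during conflict analysis, and implement decay by growing the bump increment instead of touching every score.

```python
class VSIDS:
    """Minimal VSIDS-style activity heuristic (one common variant)."""

    def __init__(self, nvars, decay=0.95):
        self.score = {v: 0.0 for v in range(1, nvars + 1)}
        self.inc = 1.0
        self.decay = decay

    def bump(self, v):
        self.score[v] += self.inc
        if self.score[v] > 1e100:          # rescale everything to avoid overflow
            for u in self.score:
                self.score[u] *= 1e-100
            self.inc *= 1e-100

    def on_conflict(self, vars_in_conflict):
        for v in vars_in_conflict:
            self.bump(v)
        self.inc /= self.decay             # future bumps count more: implicit decay

    def pick(self, assigned):
        free = [v for v in self.score if v not in assigned]
        return max(free, key=self.score.get) if free else None

h = VSIDS(4)
h.on_conflict([2, 3])
h.on_conflict([3])
print(h.pick(set()))   # variable 3: bumped most often and most recently
```

Dividing the increment by the decay factor is equivalent to multiplying all scores by it, but costs O(1) per conflict instead of O(n).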

What’s hot in SAT

WHAT’S HOT IN SAT?
◮ Clause learning techniques [e.g. ABHJS08,AS09]
  ◮ Clause learning is the key technique in CDCL SAT solvers
  ◮ Many recent papers propose improvements to the basic clause learning approach
◮ Preprocessing & inprocessing
  ◮ Many recent papers [e.g. JHB12,HJB11]
◮ Lots of recent work on symmetry exploitation (static/dynamic) [e.g. DBB17,JKKK17]
  ◮ Essential in some applications

WHAT’S HOT IN SAT?
◮ Proofs
  ◮ Proof logging (RUP, RAT, DRAT) [HHKW17]
  ◮ Proof complexity [VEGGN18]
◮ Other inference methods
  ◮ (Probabilistic) model counting [e.g. AHT18]
  ◮ Optimisation (e.g. MaxSAT, more later) [e.g. LM09]
  ◮ Enumeration
  ◮ MUSes / MCSes
◮ Applications
  ◮ In various domains

Tentacles of CDCL

SOME TENTACLES OF CDCL
◮ Lazy clause generation for constraint solving and SAT modulo theories
◮ Conflict-driven pseudo-Boolean solving
◮ Incremental SAT solving, e.g. for MaxSAT & QBF

SAT ENCODINGS
◮ Many different problems can easily be encoded into SAT
  ◮ For instance, finite-domain constraint solving
◮ Various encoding options:
  ◮ Equality: encode variable X ∈ [−100, 100] by Boolean variables ⟦X = −100⟧, ⟦X = −99⟧, ... with uniqueness constraints
  ◮ Bound: encode variable X ∈ [−100, 100] by Boolean variables ⟦X ≤ −100⟧, ⟦X ≤ −99⟧, ... with ordering clauses (¬⟦X ≤ −100⟧ ∨ ⟦X ≤ −99⟧), (¬⟦X ≤ −99⟧ ∨ ⟦X ≤ −98⟧), ...
  ◮ Log: encode variable X ∈ [−100, 100] by means of bitvectors
◮ This talk assumes the Bound encoding
◮ For each type of constraint, an encoding has to be invented

SAT ENCODINGS – EXAMPLE

X, Y, Z, U, V ∈ [−100, 100]   (1)
4U − X − Y ≥ 4                (2)
V ≥ U                         (3)
Z ≥ 5V                        (4)
Y + Z ≤ 24                    (5)

Fragments of the Bound encoding:
(¬⟦X ≤ −100⟧ ∨ ⟦X ≤ −99⟧) ∧ (¬⟦X ≤ −99⟧ ∨ ⟦X ≤ −98⟧) ∧ ⋯ ∧ (¬⟦Y ≤ −100⟧ ∨ ⟦Y ≤ −99⟧) ∧ ⋯        (from 1)
⋯ ∧ (⟦X ≤ −3⟧ ∨ ⟦Y ≤ 9⟧ ∨ ¬⟦U ≤ 2⟧) ∧ ⋯ ∧ (⟦X ≤ 9⟧ ∨ ⟦Y ≤ 9⟧ ∨ ¬⟦U ≤ 5⟧) ∧ ⋯                  (from 2)
(¬⟦V ≤ 100⟧ ∨ ⟦U ≤ 100⟧) ∧ (¬⟦V ≤ 99⟧ ∨ ⟦U ≤ 99⟧) ∧ ⋯ ∧ (¬⟦V ≤ 5⟧ ∨ ⟦U ≤ 5⟧) ∧ ⋯              (from 3)
⋯ ∧ (⟦V ≤ 0⟧ ∨ ¬⟦Z ≤ 4⟧) ∧ ⋯ ∧ (⟦V ≤ 2⟧ ∨ ¬⟦Z ≤ 14⟧) ∧ ⋯                                      (from 4)
⋯ ∧ (⟦Y ≤ 9⟧ ∨ ⟦Z ≤ 14⟧) ∧ ⋯                                                                   (from 5)
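Generating the ordering clauses of the Bound encoding is mechanical. A hedged sketch (illustrative string literals, with a leading '-' marking negation; not a full encoder for the arithmetic constraints):

```python
def bound_encode(name, lo, hi):
    """Return the ordering clauses of the Bound encoding of name in [lo, hi]:
    [name<=k] implies [name<=k+1], plus the unit clause [name<=hi]."""
    atom = lambda k: f"[{name}<={k}]"
    clauses = [["-" + atom(k), atom(k + 1)] for k in range(lo, hi)]
    clauses.append([atom(hi)])            # the upper bound always holds
    return clauses

cls = bound_encode("X", -2, 1)
print(len(cls))    # 3 ordering clauses + 1 unit clause
print(cls[0])      # X <= -2 implies X <= -1
```

For a domain of size d this costs O(d) variables and clauses per integer variable, which is why the encoding of a large CP model may only barely fit in memory, motivating the lazy generation discussed next.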

CONSTRAINT PROGRAMMING USING SAT
◮ If the SAT encoding of a CP program is not too large (at least: fits in memory), we can create it eagerly and use a CDCL solver to solve it
◮ But... we can also generate it lazily = lazy clause generation (LCG)
◮ Many constraint propagators work by search + domain propagation
◮ Idea: generate parts of the encoding only when the CDCL solver needs them:
  ◮ During propagation
  ◮ During explanation
◮ Can use structure in constraints to learn better clauses! (Example on blackboard)
◮ Many more interesting phenomena going on in LCG (see you next week!)

PSEUDO-BOOLEAN SOLVING
Observations:
◮ The resolution proof system is weak (cf. the pigeonhole principle)
◮ Stronger proof systems exist; for instance, cutting planes makes use of (linear) pseudo-Boolean constraints, i.e. linear constraints over literals [CCT87]
◮ A clause a ∨ b̄ ∨ c corresponds to the PB constraint a + b̄ + c ≥ 1
◮ A PB constraint a + b̄ + 2·c + d ≥ 2 cannot be translated into a single clause

CUTTING PLANE PROOF SYSTEM

Literal axiom:

    ──────
    l ≥ 0

Linear combination (c, d ≥ 0):

    Σᵢ aᵢ lᵢ ≥ A        Σᵢ bᵢ lᵢ ≥ B
    ─────────────────────────────────
    Σᵢ (c·aᵢ + d·bᵢ) lᵢ ≥ c·A + d·B

Division (c > 0):

    Σᵢ aᵢ lᵢ ≥ A
    ─────────────────────
    Σᵢ ⌈aᵢ/c⌉ lᵢ ≥ ⌈A/c⌉
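The two derivation rules can be exercised on a tiny example. The sketch below (an illustration, using the same ± integer literal convention as earlier blocks, plus the simplification l + l̄ = 1) derives a ≥ 1 from the clauses (a ∨ b) and (a ∨ b̄), something plain resolution needs a resolution step plus unit propagation to match:

```python
from math import ceil

def combine(c1, b1, c2, b2, k1=1, k2=1):
    """Linear combination k1*(c1 >= b1) + k2*(c2 >= b2) over literals,
    then cancel opposite literals using l + negated(l) = 1."""
    coef = {}
    bound = k1 * b1 + k2 * b2
    for c, k in ((c1, k1), (c2, k2)):
        for lit, a in c.items():
            coef[lit] = coef.get(lit, 0) + k * a
    for lit in [l for l in coef if l > 0 and -l in coef]:
        m = min(coef[lit], coef[-lit])   # m copies of (l + negated(l)) = m
        coef[lit] -= m
        coef[-lit] -= m
        bound -= m
    return {l: a for l, a in coef.items() if a}, bound

def divide(coef, bound, d):
    """Division rule: divide by d and round coefficients and bound up."""
    return {l: ceil(a / d) for l, a in coef.items()}, ceil(bound / d)

# (a or b) is a + b >= 1; (a or not-b) is a + negated(b) >= 1; a=1, b=2
c, b = combine({1: 1, 2: 1}, 1, {1: 1, -2: 1}, 1)   # gives 2a >= 1
print(divide(c, b, 2))                              # a >= 1: a must be true
```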

CUTTING PLANES VS RESOLUTION
◮ In theory, learning cutting planes could allow deriving unsatisfiability proofs much faster
◮ In practice, CDCL solvers seem to outperform cutting-plane solvers
◮ Very recently, new cutting-plane solvers inspired by CDCL are arising [GNY19]
◮ Various issues show up: generalizing CDCL, 1UIP, ... is far from obvious!

INCREMENTAL SAT SOLVING & SAT ORACLES
◮ Incremental SAT solving [ES01]:
  ◮ Allow calling a solver with a set of assumptions
  ◮ Variables whose value is set before the search starts (never backtrack over them!)
◮ Often used: replace each clause Cᵢ with Cᵢ ∨ ¬aᵢ
  ◮ Assume aᵢ = 1 to activate clause Cᵢ
  ◮ Assume aᵢ = 0 to deactivate clause Cᵢ
◮ Enables clause reuse
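The selector-literal transformation is a purely syntactic step that can be sketched directly (illustrative helper, again with ± integer literals; the actual assumption interface is solver-specific):

```python
def add_selectors(clauses, next_var):
    """Give each clause C_i a fresh selector a_i, turning it into C_i or not-a_i.
    next_var is the first unused variable id. Returns (augmented, selectors)."""
    selectors, out = [], []
    for c in clauses:
        a = next_var
        next_var += 1
        selectors.append(a)
        out.append(c + [-a])      # assuming a = True activates the clause
    return out, selectors

aug, sel = add_selectors([[1, 2], [-1, 3]], next_var=4)
print(aug)   # each clause extended with its negated selector
print(sel)   # pass these (or a subset) as assumptions to activate clauses
```

Because deactivated clauses are satisfied by aᵢ = 0 rather than deleted, all clauses learned from them in earlier calls remain sound and can be reused across solver invocations.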
