Counting and Sampling Solutions of SAT/SMT Constraints


  1. Counting and Sampling Solutions of SAT/SMT Constraints Supratik Chakraborty (IIT Bombay) Joint work with Kuldeep S. Meel and Moshe Y. Vardi (Rice University) [Extended version of slides presented at SAT/SMT/AR Summer School 2016, Lisbon]

  2. Problem Definition
  • Given
    - X1, …, Xn: variables with finite discrete domains D1, …, Dn
    - Constraint (logical formula) F over X1, …, Xn
    - Weight function W: D1 × … × Dn → ℝ≥0
  • Let RF: set of assignments of X1, …, Xn that satisfy F
  • Discrete Integration (Model Counting): determine W(RF) = Σ_{y ∈ RF} W(y)
    - If W(y) = 1 for all y, then W(RF) = |RF|
  • Discrete Sampling: randomly sample from RF such that Pr[y is sampled] ∝ W(y)
    - If W(y) = 1 for all y, then uniformly sample from RF
  • Suffices to consider all domains as {0, 1}: assumed for this tutorial
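
A brute-force sketch of both definitions in code (an illustration only: F is assumed to be given as a Python predicate over {0,1}^n tuples and W as a per-assignment weight function, and the enumeration is exponential in n):

    import itertools, random

    def weighted_count(F, W, n):
        """Discrete integration: sum W(y) over all satisfying assignments y of F."""
        return sum(W(y) for y in itertools.product([0, 1], repeat=n) if F(y))

    def weighted_sample(F, W, n):
        """Discrete sampling: draw y from R_F with Pr[y] proportional to W(y)."""
        sols = [y for y in itertools.product([0, 1], repeat=n) if F(y)]
        weights = [W(y) for y in sols]
        return random.choices(sols, weights=weights, k=1)[0]

    # Example: F = (x1 OR x2) with unit weights -> count is 3, sampling is uniform.
    F = lambda y: y[0] or y[1]
    W = lambda y: 1
    assert weighted_count(F, W, 2) == 3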

  3. Discrete Integration: An Application
  • Probabilistic Inference
    - An alarm rings if it's in a working state when an earthquake happens or a burglary happens
    - The alarm can malfunction and ring without earthquake or burglary happening
    - Given that the alarm rang, what is the likelihood that an earthquake happened?
    - Given conditional dependencies (and conditional probabilities), calculate Pr[event | evidence]
    - What is Pr[Earthquake | Alarm]?

  4. Discrete Integration: An Application
  • Probabilistic Inference: Bayes' rule to the rescue
      Pr[event_i | evidence] = Pr[event_i ∩ evidence] / Pr[evidence]
                             = Pr[evidence | event_i] · Pr[event_i] / Σ_j ( Pr[evidence | event_j] · Pr[event_j] )
  • How do we represent conditional dependencies efficiently, and calculate these probabilities?
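
For example, with hypothetical numbers Pr[A ∩ E] = 0.091 and Pr[A] = 0.28 (the same made-up values used in the sketch following slide 6 below), Bayes' rule gives Pr[E | A] = 0.091 / 0.28 ≈ 0.325.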

  5. Discrete Integration: An Application
  • Probabilistic Graphical Models
    - [Figure: Bayesian network with nodes B (Burglary), E (Earthquake) and A (Alarm), edges B → A and E → A, and a Conditional Probability Table (CPT) Pr(A | E, B) attached to A]

  6. Discrete Integration: An Application
  • [Same Bayesian network as on the previous slide, with CPT Pr(A | E, B)]
  • Pr[A ∩ E] = Pr[E] · Pr[¬B] · Pr[A | E, ¬B] + Pr[E] · Pr[B] · Pr[A | E, B]
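
A sketch of this computation with placeholder numbers (the prior and CPT values below are made up for illustration, not taken from the slides):

    # Hypothetical prior and CPT values (placeholders, not from the slides).
    Pr_E, Pr_B = 0.1, 0.2                 # Pr[Earthquake], Pr[Burglary]
    Pr_A_given = {                        # Pr[Alarm | E, B]
        (True, True): 0.95, (True, False): 0.9,
        (False, True): 0.85, (False, False): 0.05,
    }

    # Pr[A and E] = Pr[E]*Pr[not B]*Pr[A|E,not B] + Pr[E]*Pr[B]*Pr[A|E,B]
    pr_A_and_E = Pr_E * (1 - Pr_B) * Pr_A_given[(True, False)] \
               + Pr_E * Pr_B * Pr_A_given[(True, True)]

    # Pr[A] by summing over all (E, B) combinations, then Bayes' rule for Pr[E|A].
    pr_A = sum((Pr_E if e else 1 - Pr_E) * (Pr_B if b else 1 - Pr_B) * Pr_A_given[(e, b)]
               for e in (True, False) for b in (True, False))
    print(pr_A_and_E, pr_A, pr_A_and_E / pr_A)   # 0.091, 0.28, Pr[Earthquake | Alarm]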

  7. Discrete Integration: An Application
  • Probabilistic Inference: From probabilities to logic
    - V = {vA, v~A, vB, v~B, vE, v~E}: propositional vars corresponding to events
    - T = {t_A|B,E, t_~A|B,E, t_A|B,~E, …}: propositional vars corresponding to CPT entries
    - Formula encoding the probabilistic graphical model (φ_PGM):
        (vA ⊕ v~A) ∧ (vB ⊕ v~B) ∧ (vE ⊕ v~E)    (exactly one of vA and v~A is true, and similarly for B, E)
        ∧ (t_A|B,E ↔ vA ∧ vB ∧ vE) ∧ (t_~A|B,E ↔ v~A ∧ vB ∧ vE) ∧ …    (if vA, vB, vE are true, so must be t_A|B,E, and vice versa)
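
A sketch of this encoding in z3py (an assumption of this write-up, not necessarily the tool used in the tutorial; only two of the CPT biconditionals are spelled out, and variable names follow the slide):

    from z3 import Bools, Xor, And, Solver

    # Indicator variables for each event and its complement.
    vA, vnA, vB, vnB, vE, vnE = Bools('vA vnA vB vnB vE vnE')
    # Parameter variables for CPT entries; only t_{A|B,E} and t_{~A|B,E} shown.
    tABE, tnABE = Bools('tABE tnABE')

    phi_PGM = And(
        Xor(vA, vnA), Xor(vB, vnB), Xor(vE, vnE),  # exactly one of each pair is true
        tABE == And(vA, vB, vE),                   # t_{A|B,E}  <-> vA & vB & vE
        tnABE == And(vnA, vB, vE),                 # t_{~A|B,E} <-> vnA & vB & vE
        # ... one biconditional per remaining CPT entry
    )

    s = Solver()
    s.add(phi_PGM, vA, vE)      # F = phi_PGM & vA & vE (used on a later slide)
    print(s.check())            # sat: the encoding is consistent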

  8. Discrete Integration: An Application
  • Probabilistic Inference: From probabilities to logic and weights
    - V = {vA, v~A, vB, v~B, vE, v~E}    T = {t_A|B,E, t_~A|B,E, t_A|B,~E, …}
    - W(v~B) = 0.2, W(vB) = 0.8; W(v~E) = 0.1, W(vE) = 0.9    (probabilities of independent events are weights of +ve literals)
    - W(t_A|B,E) = 0.3, W(t_~A|B,E) = 0.7, …    (CPT entries are weights of +ve literals)
    - W(vA) = W(v~A) = 1    (weights of vars corresponding to dependent events)
    - W(¬v~B) = W(¬vB) = W(¬t_A|B,E) = … = 1    (weights of -ve literals are all 1)
    - Weight of assignment (vA = 1, v~A = 0, t_A|B,E = 1, …) = W(vA) · W(¬v~A) · W(t_A|B,E) · …    (product of weights of the literals in the assignment)
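
A small sketch of the weight of an assignment as a product of literal weights, with the slide's numbers hard-coded as a hypothetical table:

    # Weights of positive literals, mirroring the slide; negative literals weigh 1.
    pos_weight = {'vB': 0.8, 'vnB': 0.2, 'vE': 0.9, 'vnE': 0.1,
                  'vA': 1.0, 'vnA': 1.0, 'tABE': 0.3, 'tnABE': 0.7}

    def assignment_weight(assignment):
        """Product over all variables of the weight of the literal the assignment sets.

        assignment maps each variable name to True/False; W(~x) = 1 for every x here,
        so variables set to False contribute a factor of 1.
        """
        w = 1.0
        for var, value in assignment.items():
            w *= pos_weight[var] if value else 1.0
        return w

    # e.g. (vA = 1, vnA = 0, tABE = 1, vB = 1, vE = 1, the rest 0) has weight
    # W(vA) * W(~vnA) * W(tABE) * W(vB) * W(vE) * ... = 1 * 1 * 0.3 * 0.8 * 0.9 = 0.216
    print(assignment_weight({'vA': True, 'vnA': False, 'tABE': True, 'tnABE': False,
                             'vB': True, 'vnB': False, 'vE': True, 'vnE': False}))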

  9. Discrete Integration: An Application
  • Probabilistic Inference: From probabilities to logic and weights
    - V = {vA, v~A, vB, v~B, vE, v~E}    T = {t_A|B,E, t_~A|B,E, t_A|B,~E, …}
    - Formula encoding the combination of events in the probabilistic model (Alarm and Earthquake): F = φ_PGM ∧ vA ∧ vE
    - Set of satisfying assignments of F: RF = { (vA = 1, vE = 1, vB = 1, t_A|B,E = 1, all else 0), (vA = 1, vE = 1, v~B = 1, t_A|~B,E = 1, all else 0) }
    - Weight of satisfying assignments of F:
        W(RF) = W(vA) · W(vE) · W(vB) · W(t_A|B,E) + W(vA) · W(vE) · W(v~B) · W(t_A|~B,E)
              = 1 · Pr[E] · Pr[B] · Pr[A | B,E] + 1 · Pr[E] · Pr[~B] · Pr[A | ~B,E]
              = Pr[A ∩ E]

  10. Discrete Integration: An Application
  • From probabilistic inference to unweighted model counting
    - [Figure: pipeline from Pr[E | A] via Weighted Model Counting (Roth 1996) to Unweighted Model Counting (IJCAI 2015)]
    - Reduction polynomial in #bits representing CPT entries
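
The chain-formula construction of the IJCAI 2015 paper is not reproduced here; the following is a simpler illustrative gadget, assuming every positive-literal weight is a dyadic rational k/2^m, that conveys why the reduction needs only polynomially many bits per CPT entry:

    import itertools

    def weight_to_count_gadget(F, weights, m=4):
        """Illustrative reduction from weighted to unweighted model counting.

        Assumes W(v) = k/2^m for positive literals and W(~v) = 1 for negative
        literals, matching the weight scheme of the previous slides.  For each
        weighted variable v we add m fresh bits b and require: if v is true,
        the number encoded by b is < k.  Then
            #models(F') = 2^(m * #weighted vars) * W(R_F).
        This is not the chain-formula construction cited on the slide, just a
        simple gadget that conveys the idea.
        """
        names = sorted(weights)
        ks = {v: round(weights[v] * 2 ** m) for v in names}

        def F_prime(assignment, fresh_bits):
            if not F(assignment):
                return False
            for i, v in enumerate(names):
                b_val = int(''.join(str(bit) for bit in fresh_bits[i * m:(i + 1) * m]), 2)
                if assignment[v] and not b_val < ks[v]:
                    return False
            return True

        return F_prime, names

    # Tiny check: F = x OR y with W(x) = 0.5, W(y) = 0.25, W(~x) = W(~y) = 1,
    # so W(R_F) = 0.5*1 + 1*0.25 + 0.5*0.25 = 0.875.
    F = lambda a: a['x'] or a['y']
    F_prime, _ = weight_to_count_gadget(F, {'x': 0.5, 'y': 0.25}, m=4)
    count = sum(F_prime({'x': x, 'y': y}, bits)
                for x in (0, 1) for y in (0, 1)
                for bits in itertools.product((0, 1), repeat=8))
    assert count == round(0.875 * 2 ** 8)   # 224 = 2^(4+4) * W(R_F)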

  11. Discrete Sampling: An Application (Functional Verification)
  • Formal verification
    - Challenges: formal requirements, scalability
    - ~10-15% of verification effort
  • Dynamic verification: dominant approach

  12. Discrete Sampling: An Application
  • Design is simulated with test vectors
    - Test vectors represent different verification scenarios
  • Results from simulation compared to intended results
  • How do we generate test vectors?
    - Challenge: exceedingly large test input space! Can't try all input combinations
    - 2^128 combinations for a 64-bit binary operator!!!

  13. Discrete Sampling: An Application (Sources for Constraints)
  • [Figure: circuit with 64-bit inputs a, b and 64-bit output c = f(a, b)]
  • Designers:
    1. a +_64 11 *_32 b = 12
    2. a <_64 (b >> 4)
  • Past experience:
    1. 40 <_64 34 + a <_64 5050
    2. 120 <_64 b <_64 230
  • Users:
    1. 232 *_32 a + b != 1100
    2. 1020 <_64 (b /_64 2) +_64 a <_64 2200
  • Test vectors: solutions of constraints (a z3py sketch follows below)
  • Proposed by Lichtenstein, Malka, Aharon (IAAI 94)
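
A sketch of how such constraints can be handed to an SMT solver, using z3py as a stand-in (the width-annotated operators of the slide are approximated with plain 64-bit bit-vector arithmetic and unsigned comparisons, and only the first few constraints are encoded):

    from z3 import BitVec, Solver, ULT, sat

    # 64-bit operands from the slide.
    a, b = BitVec('a', 64), BitVec('b', 64)

    s = Solver()
    s.add(a + 11 * b == 12)                      # designer constraint 1
    s.add(ULT(a, b >> 4))                        # designer constraint 2 (unsigned <)
    s.add(ULT(40, 34 + a), ULT(34 + a, 5050))    # past-experience constraint 1

    if s.check() == sat:
        m = s.model()
        print('test vector:', m[a], m[b])        # one solution = one test vector

Each check()/model() call yields one test vector; generating many vectors with a meaningful distribution is the problem discussed on the next slides.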

  14. Discrete Sampling: An Application
  • Constraints: same circuit and designer / past-experience / user constraints as on the previous slide
  • Modern SAT/SMT solvers are complex systems
    - Efficiency stems from the solver automatically "biasing" search
    - Fails to give unbiased or user-biased distribution of test vectors

  15. Discrete Sampling: An Application (Constrained Random Verification)
  • [Figure: circuit with 64-bit inputs a, b and 64-bit output c = f(a, b)]
  • Set of constraints, encoded as a SAT formula
  • Sample satisfying assignments uniformly at random
  • Scalable Uniform Generation of SAT Witnesses
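
As a baseline, a minimal rejection sampler (assuming F is available as a Python predicate over Boolean tuples) is exactly uniform over RF but needs about 2^n / |RF| tries per sample, which is why scalable uniform generation is the actual challenge:

    import random

    def rejection_sample(F, n, max_tries=10**6):
        """Draw a uniformly random satisfying assignment of F over n Boolean vars.

        Rejection sampling: draw uniform assignments until one satisfies F.
        Exactly uniform over R_F, but the expected number of tries is 2^n / |R_F|,
        so it does not scale when solutions are rare.
        """
        for _ in range(max_tries):
            y = tuple(random.randint(0, 1) for _ in range(n))
            if F(y):
                return y
        return None

    # Example: uniform test vectors for F = (x1 XOR x2).
    print(rejection_sample(lambda y: y[0] != y[1], 2))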

  16. Discrete Integration and Sampling • Many, many more applications  Physics, economics, network reliability estimation, … • Discrete integration and discrete sampling are closely related  Insights into solving one efficiently and approximately can often be carried over to solving the other  More coming in subsequent slides … 15

  17. Agenda (Part I)
  • Hardness of counting/integration and sampling
  • Early work on counting and sampling
  • Universal hashing
  • Universal-hashing based algorithms: an overview

  18. How Hard is it to Count/Sample?
  • Trivial if we could enumerate RF: almost always impractical
  • Computational complexity of counting (discrete integration):
    - Exact unweighted counting: #P-complete [Valiant 1978]
    - Approximate unweighted counting:
      Deterministic: polynomial-time deterministic Turing machine with a Σ2^p oracle [Stockmeyer 1983]
        |RF| / (1+ε) ≤ DetEstimate(F, ε) ≤ |RF| · (1+ε), for ε > 0
      Randomized: polynomial-time probabilistic Turing machine with an NP oracle [Stockmeyer 1983; Jerrum, Valiant, Vazirani 1986]
        Pr[ |RF| / (1+ε) ≤ RandEstimate(F, ε, δ) ≤ |RF| · (1+ε) ] ≥ 1 - δ, for ε > 0, 0 < δ ≤ 1
        Probably Approximately Correct (PAC) algorithm
    - Weighted versions of counting: Exact: #P-complete [Roth 1996]; Approximate: same class as unweighted version [follows from Roth 1996]
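
To make the randomized guarantee concrete, here is the most naive estimator one could write without an NP oracle: sample assignments uniformly and scale the satisfying fraction by 2^n. The sample size comes from the standard multiplicative Chernoff bound (an addition here, not from the slides) and is only valid if the unknown solution density is at least an assumed p_lower, which is exactly the limitation that hashing-based methods remove:

    import math, random

    def monte_carlo_count(F, n, eps=0.2, delta=0.1, p_lower=0.01):
        """Naive (eps, delta) estimate of |R_F| by uniform sampling.

        The Chernoff bound gives the guarantee only if the (unknown) density
        |R_F| / 2^n is at least p_lower; for formulas with rare solutions the
        sample size blows up.
        """
        N = math.ceil(3 * math.log(2 / delta) / (eps ** 2 * p_lower))
        hits = sum(F(tuple(random.randint(0, 1) for _ in range(n))) for _ in range(N))
        return (hits / N) * 2 ** n

    # Example: F = (x1 OR x2 OR x3) has 7 models out of 8.
    print(monte_carlo_count(lambda y: any(y), 3))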

  19. How Hard is it to Count/Sample?
  • Computational complexity of sampling:
    - Uniform sampling: polynomial-time probabilistic Turing machine with an NP oracle [Bellare, Goldreich, Petrank 2000]
        Pr[y = UniformGenerator(F)] = c if y ∈ RF, and 0 if y ∉ RF, where c > 0 and independent of y
    - Almost-uniform sampling: polynomial-time probabilistic Turing machine with an NP oracle [Jerrum, Valiant, Vazirani 1986; also follows from Bellare, Goldreich, Petrank 2000]
        c / (1+ε) ≤ Pr[y = AUGenerator(F, ε)] ≤ c · (1+ε) if y ∈ RF, and 0 if y ∉ RF, where c > 0 and independent of y
    - Pr[Algorithm outputs some y] ≥ ½, if F is satisfiable
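
A crude empirical diagnostic for these definitions (a sketch, not a proof of uniformity): run a candidate generator many times and compare the observed frequency of each solution against 1/|RF|:

    from collections import Counter

    def check_uniformity(sampler, F, n, trials=10000):
        """Print the observed frequency of each sampled solution of F.

        If the sampler is (almost) uniform, every solution should appear with
        frequency close to 1/|R_F|.
        """
        counts = Counter(sampler(F, n) for _ in range(trials))
        for y, c in sorted(counts.items()):
            print(y, c / trials)

    # e.g. with the rejection sampler sketched earlier and F = (x1 XOR x2):
    # check_uniformity(rejection_sample, lambda y: y[0] != y[1], 2)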

  20. Exact Counters
  • DPLL-based counters [CDP: Birnbaum, Lozinski 1999]
    - DPLL branching search procedure, with partial truth assignments
    - Once a branch is found satisfiable, if t out of n variables assigned, add 2^(n-t) to model count, backtrack to last decision point, flip decision and continue
    - Requires data structure to check if all clauses are satisfied by partial assignment (usually not implemented in modern DPLL SAT solvers)
    - Can output a lower bound at any time
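
A minimal sketch of the CDP idea on DIMACS-style clause lists (illustrative only; a real implementation adds the data structures and backtracking machinery described above):

    def dpll_count(clauses, n, assignment=None):
        """CDP-style counting sketch: branch on variables; when every clause is
        already satisfied by the partial assignment of t variables, add 2^(n-t).

        clauses: list of clauses, each a list of non-zero ints (DIMACS-style
        literals: variable i is literal i, its negation is -i).
        """
        if assignment is None:
            assignment = {}
        all_sat = True
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                          # clause satisfied
            all_sat = False
            if all(abs(l) in assignment for l in clause):
                return 0                          # clause falsified: dead branch
        if all_sat:
            return 2 ** (n - len(assignment))     # free variables can take any value
        v = next(i for i in range(1, n + 1) if i not in assignment)
        return (dpll_count(clauses, n, {**assignment, v: True}) +
                dpll_count(clauses, n, {**assignment, v: False}))

    # (x1 or x2) and (not x1 or x3) over 3 variables has 4 models.
    assert dpll_count([[1, 2], [-1, 3]], 3) == 4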

  21. Exact Counters
  • DPLL + component analysis [RelSat: Bayardo, Pehoushek 2000]
    - Constraint graph G: variables of F are vertices; an edge connects two vertices if the corresponding variables appear in some clause of F
    - Disjoint components of G lazily identified during DPLL search
    - F1, F2, …, Fn: subformulas of F corresponding to components; |RF| = |RF1| · |RF2| · |RF3| · …
    - Heuristic optimizations: solve most constrained sub-problems first; solve sub-problems in interleaved manner
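
A sketch of the component idea (done eagerly and up front here, whereas RelSat identifies components lazily during search); count_fn can be any exact counter for a sub-formula, for instance the dpll_count sketch above:

    def component_count(clauses, n, count_fn):
        """Split the constraint graph into connected components and multiply
        the model counts of the corresponding sub-formulas.

        count_fn(sub_clauses, num_vars) counts models of a sub-formula whose
        variables have been renamed to 1..num_vars.
        """
        # Union-find over variables that share a clause.
        parent = list(range(n + 1))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for clause in clauses:
            vs = [abs(l) for l in clause]
            for u, v in zip(vs, vs[1:]):
                parent[find(u)] = find(v)

        # Group clauses by component and count each independently.
        comps = {}
        for clause in clauses:
            comps.setdefault(find(abs(clause[0])), []).append(clause)
        total, used_vars = 1, set()
        for comp_clauses in comps.values():
            comp_vars = sorted({abs(l) for c in comp_clauses for l in c})
            used_vars.update(comp_vars)
            remap = {v: i + 1 for i, v in enumerate(comp_vars)}   # rename vars 1..k
            sub = [[(1 if l > 0 else -1) * remap[abs(l)] for l in c] for c in comp_clauses]
            total *= count_fn(sub, len(comp_vars))
        return total * 2 ** (n - len(used_vars))   # variables in no clause are free

    # With the dpll_count sketch from the previous slide as count_fn:
    # component_count([[1, 2], [3, -4]], 4, dpll_count)  ->  3 * 3 = 9 models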
