Relational reasoning via probabilistic coupling

  1. Relational reasoning via probabilistic coupling Gilles Barthe, Thomas Espitau, Benjamin Grégoire, Justin Hsu, Léo Stefanesco, Pierre-Yves Strub IMDEA Software, ENS Cachan, ENS Lyon, Inria, University of Pennsylvania November 28, 2015 1

  2. Relational properties Properties about two runs of the same program ◮ Assume inputs are related by Ψ ◮ Want to prove the outputs are related by Φ 2

  3. Examples Monotonicity ◮ Ψ : in1 ≤ in2 ◮ Φ : out1 ≤ out2 ◮ “Bigger inputs give bigger outputs” 3

  4. Examples Monotonicity ◮ Ψ : in1 ≤ in2 ◮ Φ : out1 ≤ out2 ◮ “Bigger inputs give bigger outputs” Non-interference ◮ Ψ : low1 = low2 ◮ Φ : out1 = out2 ◮ “If low-security inputs are the same, then outputs are the same” 3
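
A minimal sketch (with a hypothetical toy program, not from the talk) of what a relational check looks like: run the same program twice on inputs related by Ψ and compare the outputs under Φ, here for non-interference.

    def program(low, high):
        # Toy program whose output depends only on the low-security input.
        return 2 * low + 0 * high

    def noninterference_at(low, high1, high2):
        # Precondition Psi: low1 = low2 (same low input; high inputs may differ).
        # Postcondition Phi: out1 = out2.
        return program(low, high1) == program(low, high2)

    assert noninterference_at(low=3, high1=10, high2=99)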

  5. Probabilistic relational properties Richer properties ◮ Differential privacy ◮ Cryptographic indistinguishability 4

  6. Probabilistic relational properties Richer properties ◮ Differential privacy ◮ Cryptographic indistinguishability Verification tool: pRHL [BGZ-B] ◮ Imperative while language + command for random sampling ◮ Deterministic input, randomized output ◮ Hoare-style logic 4

  7. Inspiration from probability theory Probabilistic couplings ◮ Used by mathematicians for proving relational properties ◮ Applications: Markov chains, probabilistic processes Idea ◮ Place two processes in the same probability space ◮ Coordinate the sampling 5

  8. Our results Main observation The logic pRHL internalizes coupling 6

  9. Our results Main observation The logic pRHL internalizes coupling Consequences ◮ Constructing pRHL proof → constructing a coupling ◮ Can verify classic examples of couplings in mathematics with proof assistant EasyCrypt (built on pRHL) 6

  10. The plan Today ◮ Introducing probabilistic couplings ◮ Introducing the relational logic pRHL ◮ Example: convergence of random walks 7

  11. Probabilistic couplings 8

  12. Introduction to probabilistic couplings Basic ingredients ◮ Given: two distributions X1, X2 over set A ◮ Produce: joint distribution Y over A × A – Distribution over the first component is X1 – Distribution over the second component is X2 9

  13. Introduction to probabilistic couplings Basic ingredients ◮ Given: two distributions X1, X2 over set A ◮ Produce: joint distribution Y over A × A – Distribution over the first component is X1 – Distribution over the second component is X2 Definition Given two distributions X1, X2 over a set A, a coupling Y is a distribution over A × A such that π1(Y) = X1 and π2(Y) = X2. 9
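
A minimal sketch (illustrative, with distributions represented as Python dicts) of this definition: a coupling is a joint distribution whose two projections recover X1 and X2, and the same pair of distributions can have many couplings.

    from collections import defaultdict

    def marginals(joint):
        # Project a joint distribution {(a1, a2): prob} onto its two components.
        m1, m2 = defaultdict(float), defaultdict(float)
        for (a1, a2), p in joint.items():
            m1[a1] += p
            m2[a2] += p
        return dict(m1), dict(m2)

    def is_coupling(joint, x1, x2, tol=1e-9):
        # Check pi1(Y) = X1 and pi2(Y) = X2.
        m1, m2 = marginals(joint)
        ok1 = all(abs(m1.get(a, 0.0) - x1.get(a, 0.0)) < tol for a in set(m1) | set(x1))
        ok2 = all(abs(m2.get(a, 0.0) - x2.get(a, 0.0)) < tol for a in set(m2) | set(x2))
        return ok1 and ok2

    fair = {0: 0.5, 1: 0.5}
    # Two couplings of a fair coin with itself: "always equal" and "independent".
    assert is_coupling({(0, 0): 0.5, (1, 1): 0.5}, fair, fair)
    assert is_coupling({(a, b): 0.25 for a in (0, 1) for b in (0, 1)}, fair, fair)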

  14. Example: mirrored random walks Simple random walk on integers ◮ Start at position p = 0 ◮ Each step, flip coin x ←$ flip ◮ Heads: p ← p + 1 ◮ Tails: p ← p − 1 10

  15. Example: mirrored random walks Simple random walk on integers ◮ Start at position p = 0 ◮ Each step, flip coin x ←$ flip ◮ Heads: p ← p + 1 ◮ Tails: p ← p − 1 Figure: Simple random walk (each step moves left or right with probability 1/2) 10

  16. Coupling the walks to meet Case p1 = p2 : Walks have met ◮ Arrange samplings x1 = x2 ◮ Continue to have p1 = p2 11

  17. Coupling the walks to meet Case p1 = p2 : Walks have met ◮ Arrange samplings x1 = x2 ◮ Continue to have p1 = p2 Case p1 ≠ p2 : Walks have not met ◮ Arrange samplings x1 = ¬x2 ◮ Walks make mirror moves 11

  18. Coupling the walks to meet Case p1 = p2 : Walks have met ◮ Arrange samplings x1 = x2 ◮ Continue to have p1 = p2 Case p1 ≠ p2 : Walks have not met ◮ Arrange samplings x1 = ¬x2 ◮ Walks make mirror moves Under the coupling, once the walks meet, they move together 11
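
A minimal sketch (illustrative) of the mirrored coupling as a simulation: both walks live in one probability space and are driven by a single coin per step, with equal flips once they have met and negated flips while they are apart.

    import random

    def coupled_step(p1, p2):
        # One step of both walks in a single probability space: one coin flip.
        coin = random.random() < 0.5
        if p1 == p2:
            x1, x2 = coin, coin           # walks have met: identical flips
        else:
            x1, x2 = coin, not coin       # walks apart: mirrored flips
        p1 += 1 if x1 else -1
        p2 += 1 if x2 else -1
        return p1, p2

    p1, p2 = 0, 4                         # start an even distance apart
    for step in range(200):
        p1, p2 = coupled_step(p1, p2)
        if p1 == p2:
            print("met after", step + 1, "steps; from now on they move together")
            break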

  19. Why is this interesting? Goal: memorylessness ◮ Start two random walks at w and w + 2k ◮ To show: position distributions converge as we take more steps 12

  20. Why is this interesting? Goal: memorylessness ◮ Start two random walks at w and w + 2k ◮ To show: position distributions converge as we take more steps Coupling bounds distance between distributions ◮ Once walks meet, they stay equal ◮ Distance is at most probability walks don’t meet 12

  21. Why is this interesting? Goal: memorylessness ◮ Start two random walks at w and w + 2k ◮ To show: position distributions converge as we take more steps Coupling bounds distance between distributions ◮ Once walks meet, they stay equal ◮ Distance is at most probability walks don’t meet Theorem If Y is a coupling of two distributions (X1, X2), then ‖X1 − X2‖TV = (1/2) Σ_{a∈A} |X1(a) − X2(a)| ≤ Pr_{(y1,y2)∼Y}[y1 ≠ y2]. 12
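
A minimal sketch (illustrative) that checks this bound empirically, assuming the usual 1/2-normalized total variation distance; because mirroring preserves each walk's marginal, the coupled runs can be used both to estimate the two position distributions and to estimate the probability that the walks have not met.

    import random
    from collections import Counter

    def coupled_walk(start1, start2, steps):
        # Mirrored coupling: one coin per step drives both walks.
        p1, p2 = start1, start2
        for _ in range(steps):
            coin = random.random() < 0.5
            x1, x2 = (coin, coin) if p1 == p2 else (coin, not coin)
            p1 += 1 if x1 else -1
            p2 += 1 if x2 else -1
        return p1, p2

    def estimate(k=1, steps=50, trials=20000):
        d1, d2, differ = Counter(), Counter(), 0
        for _ in range(trials):
            p1, p2 = coupled_walk(0, 2 * k, steps)
            d1[p1] += 1
            d2[p2] += 1
            differ += (p1 != p2)
        support = set(d1) | set(d2)
        tv = 0.5 * sum(abs(d1[a] - d2[a]) / trials for a in support)
        return tv, differ / trials

    tv, pr_differ = estimate()
    print(tv, "<=", pr_differ)   # empirical check of the coupling inequality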

  22. The logic pRHL 13

  23. The program logic pRHL Probabilistic Relational Hoare Logic ◮ Hoare-style logic for probabilistic relational properties ◮ Proposed by Barthe, Grégoire, Zanella-Béguelin ◮ Implemented in the EasyCrypt proof assistant for crypto proofs 14

  24. Language and judgments The pWhile imperative language c ::= x ← e | x ←$ d | if e then c else c | while e do c | skip | c; c 15

  26. Language and judgments The pWhile imperative language c ::= x ← e | x ←$ d | if e then c else c | while e do c | skip | c; c Basic pRHL judgments ⊨ c1 ∼ c2 : Ψ ⇒ Φ ◮ Ψ and Φ are formulas over labeled program variables x1, x2 ◮ Ψ is precondition, Φ is postcondition 15
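
A minimal sketch (illustrative; the constructor names are ours, not the paper's) of the pWhile command grammar as a small abstract syntax, with one iteration of the random-walk loop body built as an example term.

    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Assign:        # x <- e
        var: str
        expr: str

    @dataclass
    class Sample:        # x <-$ d
        var: str
        dist: str

    @dataclass
    class If:            # if e then c1 else c2
        cond: str
        then_cmd: "Cmd"
        else_cmd: "Cmd"

    @dataclass
    class While:         # while e do c
        cond: str
        body: "Cmd"

    @dataclass
    class Skip:          # skip
        pass

    @dataclass
    class Seq:           # c1 ; c2
        c1: "Cmd"
        c2: "Cmd"

    Cmd = Union[Assign, Sample, If, While, Skip, Seq]

    # One iteration of the random-walk loop body as a term of this grammar.
    body = Seq(Sample("b", "flip"),
               If("b", Assign("pos", "pos + 1"), Assign("pos", "pos - 1")))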

  27. Interpreting the judgment ⊨ c1 ∼ c2 : Ψ ⇒ Φ 16

  28. Interpreting the judgment ⊨ c1 ∼ c2 : Ψ ⇒ Φ Interpreting pre- and post-conditions ◮ Ψ interpreted as a relation on two memories ◮ Φ interpreted as a relation Φ† on distributions over memories 16

  29. Interpreting the judgment ⊨ c1 ∼ c2 : Ψ ⇒ Φ Interpreting pre- and post-conditions ◮ Ψ interpreted as a relation on two memories ◮ Φ interpreted as a relation Φ† on distributions over memories Definition (Couplings in disguise!) If Φ is a relation on A, the lifted relation Φ† is a relation on Distr(A) where µ1 Φ† µ2 if there exists µ ∈ Distr(A × A) with ◮ supp(µ) ⊆ Φ; and ◮ π1(µ) = µ1 and π2(µ) = µ2. 16
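
A minimal sketch (illustrative) of the lifted relation as a checkable witness: µ1 Φ† µ2 holds when some joint distribution puts all its mass inside Φ and has µ1 and µ2 as its marginals, i.e. exactly a coupling whose support respects Φ.

    from collections import defaultdict

    def lifts(joint, mu1, mu2, phi, tol=1e-9):
        # Support condition: every pair with positive probability satisfies phi.
        if any(p > tol and not phi(a1, a2) for (a1, a2), p in joint.items()):
            return False
        # Marginal conditions: pi1(joint) = mu1 and pi2(joint) = mu2.
        m1, m2 = defaultdict(float), defaultdict(float)
        for (a1, a2), p in joint.items():
            m1[a1] += p
            m2[a2] += p
        ok1 = all(abs(m1[a] - mu1.get(a, 0.0)) < tol for a in set(m1) | set(mu1))
        ok2 = all(abs(m2[a] - mu2.get(a, 0.0)) < tol for a in set(m2) | set(mu2))
        return ok1 and ok2

    fair = {0: 0.5, 1: 0.5}
    # The "always equal" coupling witnesses the lifted equality relation.
    assert lifts({(0, 0): 0.5, (1, 1): 0.5}, fair, fair, phi=lambda a, b: a == b)
    # The independent coupling does not: its support contains unequal pairs.
    assert not lifts({(a, b): 0.25 for a in (0, 1) for b in (0, 1)},
                     fair, fair, phi=lambda a, b: a == b)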

  30. Proof rules The key rule: Sampling
        f ∈ T →1-1 T        ∀ v ∈ T. d1(v) = d2(f v)
      ───────────────────────────────────────────────────── [Sample]
        ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : ∀ v, Φ[v/x1, f(v)/x2] ⇒ Φ
      Notes 17

  32. Proof rules The key rule: Sampling
        f ∈ T →1-1 T        ∀ v ∈ T. d1(v) = d2(f v)
      ───────────────────────────────────────────────────── [Sample]
        ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : ∀ v, Φ[v/x1, f(v)/x2] ⇒ Φ
      Notes ◮ Bijection f : specifies how to coordinate the samples 17

  34. Proof rules The key rule: Sampling
        f ∈ T →1-1 T        ∀ v ∈ T. d1(v) = d2(f v)
      ───────────────────────────────────────────────────── [Sample]
        ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : ∀ v, Φ[v/x1, f(v)/x2] ⇒ Φ
      Notes ◮ Bijection f : specifies how to coordinate the samples ◮ Side condition: marginals are preserved under f 17

  36. Proof rules The key rule: Sampling
        f ∈ T →1-1 T        ∀ v ∈ T. d1(v) = d2(f v)
      ───────────────────────────────────────────────────── [Sample]
        ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : ∀ v, Φ[v/x1, f(v)/x2] ⇒ Φ
      Notes ◮ Bijection f : specifies how to coordinate the samples ◮ Side condition: marginals are preserved under f ◮ Assume: samples coupled when proving postcondition Φ 17
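
A minimal sketch (illustrative) of the coupling this rule builds: pairing each sample v of d1 with f(v) yields a joint distribution with marginals d1 and d2 whose support satisfies x2 = f(x1), shown here with the fair coin and f = negation.

    def sample_rule_coupling(d1, d2, f, tol=1e-9):
        # Side condition of the rule: the marginal is preserved under f.
        assert all(abs(d1[v] - d2[f(v)]) < tol for v in d1)
        # The coupling: sample v from d1 and return the pair (v, f(v)).
        return {(v, f(v)): p for v, p in d1.items()}

    fair = {True: 0.5, False: 0.5}
    # Mirrored walks, walks apart: couple the two flips through negation.
    joint = sample_rule_coupling(fair, fair, f=lambda b: not b)
    print(joint)   # {(True, False): 0.5, (False, True): 0.5}
    # First marginal d1, second marginal d2, and every pair in the support
    # satisfies the chosen relation x2 = f(x1), i.e. x1 = ¬x2 here.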

  37. Examples 18

  38. Example: mirroring random walks in pRHL The code
      pos ← start;              // Start position
      i ← 0; H ← [];            // Ghost code
      while i < N do
        b ←$ flip;
        H ← b :: H;             // Ghost code
        if b then pos ← pos + 1; else pos ← pos − 1; fi
        i ← i + 1;
      end
      return pos                // Final position
  19

  39. Example: mirroring random walks in pRHL The code
      pos ← start;              // Start position
      i ← 0; H ← [];            // Ghost code
      while i < N do
        b ←$ flip;
        H ← b :: H;             // Ghost code
        if b then pos ← pos + 1; else pos ← pos − 1; fi
        i ← i + 1;
      end
      return pos                // Final position
  Goal: couple two walks via mirroring 19
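
A minimal sketch (illustrative): a direct Python transcription of the program above, keeping the ghost history H that exists only for the proof.

    import random

    def random_walk(start, N):
        pos = start                       # start position
        i = 0
        H = []                            # ghost code: history of flips
        while i < N:
            b = random.random() < 0.5     # b <-$ flip
            H = [b] + H                   # ghost code: H <- b :: H
            if b:
                pos = pos + 1
            else:
                pos = pos - 1
            i = i + 1
        return pos                        # final position (H is proof-only)

    print(random_walk(start=0, N=100))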

  40. Record the history H stores the history of flips ◮ Σ(H) is the net distance that the first process moves to the right ◮ Meet(H) holds if there is a prefix H′ of H with Σ(H′) = k 20
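
A minimal sketch (illustrative) of the ghost quantities; since H stores the newest flip first (H ← b :: H), the prefix in Meet is taken over the flips in chronological order.

    def net_right(flips):
        # Sigma: net distance the first walk moves to the right over these flips.
        return sum(1 if b else -1 for b in flips)

    def met(H, k):
        # Meet(H): some chronological prefix of the flips has Sigma equal to k,
        # i.e. the mirrored walks started 2k apart have met at some point.
        flips = list(reversed(H))          # H has the newest flip first
        return any(net_right(flips[:i]) == k for i in range(len(flips) + 1))

    assert met([True], 1) and not met([False, True], 2)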

  41. Specify the coupling Sampling rule
        f ∈ T →1-1 T        ∀ v ∈ T. d1(v) = d2(f v)
      ───────────────────────────────────────────────────── [Sample]
        ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : ∀ v, Φ[v/x1, f(v)/x2] ⇒ Φ
  21
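
A minimal sketch (illustrative) of the bijection supplied to the Sample rule at each loop iteration of the coupled walks: the identity once the walks have met and negation while they are apart; both choices preserve the fair coin, so the rule's side condition holds.

    def choose_f(pos1, pos2):
        # The bijection on booleans passed to the Sample rule this iteration.
        if pos1 == pos2:
            return lambda b: b            # walks have met: identical flips
        return lambda b: not b            # walks apart: mirrored flips

    fair = {True: 0.5, False: 0.5}
    f = choose_f(0, 4)
    # Side condition of the rule: the fair coin is preserved under f.
    assert all(abs(fair[v] - fair[f(v)]) < 1e-9 for v in fair)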
