Metrics for Differential Privacy in Concurrent Systems



1. Title: Metrics for Differential Privacy in Concurrent Systems. Lili Xu (1,3,4), Konstantinos Chatzikokolakis (2,3), Huimin Lin (4), Catuscia Palamidessi (1,3). (1) INRIA, (2) CNRS, (3) École Polytechnique, (4) Institute of Software, Chinese Academy of Sciences. HotSpot 2014.

2. Outline. Part 1, Introduction: Concurrent Systems; Differential Privacy; The Verification Framework. Part 2, Three Pseudometrics: The Accumulative Bijection Pseudometric; The Amortised Bijection Pseudometric; A Multiplicative Variant of the Kantorovich Pseudometric; Comparison.


4. Motivation. The model: concurrent systems, modeled as probabilistic automata. The measure of the level of privacy: differential privacy. Goal: to verify differential privacy properties for concurrent systems.


6. Our Model. A probabilistic automaton is a tuple (S, s, A, D) where: S is a finite set of states; s ∈ S is the start state; A is a finite set of action labels; and D ⊆ S × A × Disc(S) is a transition relation, Disc(S) being the set of discrete probability distributions over S. We also write s −a→ μ for (s, a, μ) ∈ D. Definition (Concurrent Systems with Secret Information): let U be a set of secrets. A concurrent system with secret information A is a mapping from secrets to probabilistic automata, where A(u), for u ∈ U, is the automaton modelling the behavior of the system when running on u.
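The tuple above can be encoded concretely. A minimal Python sketch, assuming a dictionary representation (state → list of (action, distribution) choices) and an illustrative reconstruction of the PIN-checking automata from the later slides; all names here are mine, not the paper's:

```python
# A probabilistic automaton (S, s, A, D), encoded minimally: states and
# actions are strings, a distribution in Disc(S) is a dict from successor
# states to probabilities, and the transition relation maps each state to
# its list of nondeterministic choices (action, distribution).

def make_pin_automaton(p_ok: float) -> dict:
    """Illustrative automaton A(u) for one secret of the PIN-checking
    example: from the start state, nondeterministic branches a1 and a2;
    under a1 the check answers ok with probability p_ok, no otherwise.
    (The exact shape is an assumption reconstructed from the slides.)"""
    return {
        "start": [
            ("a1", {"s1": p_ok, "s2": 1.0 - p_ok}),
            ("a2", {"s3": 1.0}),
        ],
        "s1": [("ok", {"end": 1.0})],  # accepting branch
        "s2": [("no", {"end": 1.0})],  # rejecting branch
        "s3": [("no", {"end": 1.0})],  # the a2 branch always rejects here
        "end": [],                     # terminal state: no transitions
    }

# A concurrent system with secret information: a mapping secrets -> automata.
A = {"u1": make_pin_automaton(0.6), "u2": make_pin_automaton(0.4)}
```

Under this encoding the two automata differ only in the probability of the ok branch, which is exactly what the differential-privacy analysis later compares.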

7. How to Reason about Probabilistic Observations? A scheduler ζ resolves the nondeterminism based on the history of a computation, inducing a probability measure over traces.

8. How to Reason about Probabilistic Observations? A scheduler ζ resolves the nondeterminism based on the history of a computation, inducing a probability measure over traces. Probabilities of finite traces: let α be the history up to the current state s. The probability of observing a finite trace t starting from α, denoted Pr_ζ[α ⊲ t], is defined recursively:
  Pr_ζ[α ⊲ t] = 1   if t is empty;
  Pr_ζ[α ⊲ t] = 0   if t = a t′, ζ(α) = s −b→ μ and b ≠ a;
  Pr_ζ[α ⊲ t] = Σ_{s_i} μ(s_i) · Pr_ζ[α a s_i ⊲ t′]   if t = a t′ and ζ(α) = s −a→ μ.
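The recursion above translates directly into code. A minimal Python sketch, assuming a scheduler is a function from histories (alternating states and actions, ending in the current state) to the (action, distribution) pair it picks; the `None` case for states where nothing is scheduled is my own addition:

```python
# Pr_zeta[alpha |> t], following the recursive definition above.

def trace_prob(scheduler, history, trace):
    if not trace:                    # empty trace: probability 1
        return 1.0
    choice = scheduler(history)      # the scheduler resolves nondeterminism
    if choice is None:               # no transition scheduled (terminal state)
        return 0.0
    action, dist = choice
    if action != trace[0]:           # scheduled action b differs from observed a
        return 0.0
    # otherwise sum over successors s_i: mu(s_i) * Pr[alpha a s_i |> t']
    return sum(p * trace_prob(scheduler, history + [action, s], trace[1:])
               for s, p in dist.items())

# Usage on a small two-step automaton in the spirit of the PIN example:
# from 'start', action 'a' leads to s1 (prob 0.6) or s2 (prob 0.4);
# s1 then emits 'ok' and s2 emits 'no'.
aut = {"start": [("a", {"s1": 0.6, "s2": 0.4})],
       "s1": [("ok", {"end": 1.0})],
       "s2": [("no", {"end": 1.0})],
       "end": []}
sched = lambda h: aut[h[-1]][0] if aut[h[-1]] else None
p = trace_prob(sched, ["start"], ["a", "ok"])   # 0.6 * 1.0 + 0.4 * 0.0 = 0.6
```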

9. An Example: A PIN-Checking System. [Diagram: two automata A(u1) and A(u2), each offering nondeterministic branches a1 and a2 that lead to probabilistic choices over states (s1, s2, s3 and t1, t2, t3) labelled ok or no.] Example: the scheduler executes the a1-branch. Then Pr_ζ[A(u1) ⊲ a1 ok] = 0.6 and Pr_ζ[A(u2) ⊲ a1 ok] = 0.4; Pr_ζ[A(u1) ⊲ a1 no] = 0.4 and Pr_ζ[A(u2) ⊲ a1 no] = 0.6; and Pr_ζ[A(u1) ⊲ a2 ok] = Pr_ζ[A(u2) ⊲ a2 ok] = Pr_ζ[A(u1) ⊲ a2 no] = Pr_ζ[A(u2) ⊲ a2 no] = 0.


11. How to Quantify the Amount of Privacy? Definition (Standard Definition of Differential Privacy): a query mechanism A is ε-differentially private if for any two adjacent databases u1 and u2, i.e. databases differing in only one individual, and any property Z, the probability distributions of A(u1) and A(u2) differ on Z by at most a factor of e^ε, namely Pr[A(u1) ∈ Z] ≤ e^ε · Pr[A(u2) ∈ Z]. The lower the value of ε, the better the privacy is protected.

12. Some Merits of Differential Privacy: a strong notion of privacy; independence from the adversary's side knowledge; robustness to attacks that combine various sources of information; looser restrictions between non-adjacent secrets.

13. Differential Privacy in the Context of Concurrent Systems. An unrestricted scheduler can easily break many security and privacy properties, so we consider a restricted class of schedulers, called admissible schedulers, which are unable to distinguish between secrets in the histories. Definition (Differential Privacy in Our Setting): a concurrent system A satisfies ε-differential privacy (DP) iff for any two adjacent secrets u, u′, all finite traces t and all admissible schedulers ζ: Pr_ζ[A(u) ⊲ t] ≤ e^ε · Pr_ζ[A(u′) ⊲ t].

14. The PIN-Checking System Revisited. Recall that A satisfies ε-DP iff for any two adjacent secrets u, u′, all finite traces t and all admissible schedulers ζ: Pr_ζ[A(u) ⊲ t] ≤ e^ε · Pr_ζ[A(u′) ⊲ t]. Example: Pr_ζ[A(u1) ⊲ a1 ok] = 0.6 and Pr_ζ[A(u2) ⊲ a1 ok] = 0.4; Pr_ζ[A(u1) ⊲ a1 no] = 0.4 and Pr_ζ[A(u2) ⊲ a1 no] = 0.6; all a2-traces have probability 0. In this case, the level of differential privacy is ε = ln(3/2).
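The value ε = ln(3/2) is the maximum absolute log-ratio of matching trace probabilities. A minimal Python sketch checking that arithmetic (the helper name `dp_level` is mine; only nonzero-probability traces are compared, since the a2-traces impose no constraint):

```python
import math

# Trace probabilities from the slide, under the scheduler that plays
# the a1-branch (a2-traces all have probability 0 on both sides).
pr_u1 = {"a1 ok": 0.6, "a1 no": 0.4}
pr_u2 = {"a1 ok": 0.4, "a1 no": 0.6}

def dp_level(p, q):
    """Smallest eps with p[t] <= e^eps * q[t] and q[t] <= e^eps * p[t]
    for every trace t, i.e. the maximum absolute log-ratio."""
    return max(abs(math.log(p[t] / q[t])) for t in p)

eps = dp_level(pr_u1, pr_u2)   # ln(0.6 / 0.4) = ln(3/2), about 0.405
```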


16. Neighboring processes have neighboring behaviors. For example, behavioural equivalences: A(u) ≃ A(u′) ⟹ Secrecy [Abadi and Gordon, the spi-calculus]. The property of differential privacy requires that the observations generated by two adjacent secrets are probabilistically close.
