Metrics for Differential Privacy in Concurrent Systems


1. Metrics for Differential Privacy in Concurrent Systems
Lili Xu 1,3,4    Konstantinos Chatzikokolakis 2,3    Huimin Lin 4    Catuscia Palamidessi 1,3
1 INRIA    2 CNRS    3 Ecole Polytechnique    4 Inst. of Software, Chinese Acad. of Sci.
Workshop of the ANR-NSFC project LOCALI, 2013

2. Outline
1 Introduction
  Concurrent Systems
  Differential Privacy
  The Verification Framework
2 Three Pseudometrics
  The Accumulative Bijection Pseudometric
  The Amortized Bijection Pseudometric
  A Multiplicative Variant of the Kantorovich Pseudometric
3 Comparison of the Three Pseudometrics

3. Motivation
The model: concurrent systems modeled as probabilistic automata.
The measure of the level of privacy: differential privacy.

4. Motivation
The model: concurrent systems modeled as probabilistic automata.
The measure of the level of privacy: differential privacy.
Goal: how to verify differential privacy properties for concurrent systems?
Neighboring processes have neighboring behaviors. For example, behavioural equivalences:
A(u) ≃ A(u′) ⟹ Secrecy   [Abadi and Gordon, the Spi-calculus]

5. Motivation
The model: concurrent systems modeled as probabilistic automata.
The measure of the level of privacy: differential privacy.
Goal: how to verify differential privacy properties for concurrent systems?
Neighboring processes have neighboring behaviors. For example, behavioural equivalences:
A(u) ≃ A(u′) ⟹ Secrecy   [Abadi and Gordon, the Spi-calculus]
Verification technique: behavioural approximation via pseudometrics on states:
m(A(u), A(u′)) ⟹ Differential Privacy

6. Outline
1 Introduction
  Concurrent Systems
  Differential Privacy
  The Verification Framework
2 Three Pseudometrics
  The Accumulative Bijection Pseudometric
  The Amortized Bijection Pseudometric
  A Multiplicative Variant of the Kantorovich Pseudometric
3 Comparison of the Three Pseudometrics

7. Our Model
A probabilistic automaton is a tuple (S, s, A, D) where
  S is a finite set of states;
  s ∈ S is the start state;
  A is a finite set of action labels;
  D ⊆ S × A × Disc(S) is a weak transition relation; we also write s =a⇒ µ for (s, a, µ) ∈ D.

Definition (Concurrent Systems with Secret Information)
Let U be a set of secrets. A concurrent system with secret information A is a mapping of secrets to probabilistic automata, where A(u), u ∈ U, is the automaton modelling the behavior of the system when running on u.
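To make the model concrete, here is a minimal sketch of how a probabilistic automaton and a concurrent system with secret information could be represented. This is an illustration only, not the paper's formalization; the names Automaton, Dist and ConcurrentSystem are hypothetical, and the weak transition relation D is stored as a map from a (state, action label) pair to a discrete distribution over successor states.

```python
from dataclasses import dataclass
from typing import Dict, Set, Tuple

State = str
Action = str
Secret = str
Dist = Dict[State, float]            # a discrete distribution in Disc(S): state -> probability


@dataclass
class Automaton:
    """A probabilistic automaton (S, s, A, D).

    trans[(s, a)] = mu encodes the weak transition s =a=> mu; pairs absent
    from trans have no a-labelled weak transition from s.
    """
    states: Set[State]
    start: State
    actions: Set[Action]
    trans: Dict[Tuple[State, Action], Dist]


# A concurrent system with secret information: a mapping of secrets to automata.
ConcurrentSystem = Dict[Secret, Automaton]
```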

8. How to Reason about Probabilistic Observations?
A scheduler ζ resolves the non-determinism based on the history of a computation, inducing a probability measure over traces.
For each scheduler we get a fully probabilistic automaton, in which the probability of events (sets of traces) is defined in a standard way: construction of a σ-algebra (for dealing with infinity), whose basis is given by the finite traces and their probabilities.

9. How to Reason about Probabilistic Observations?
A scheduler ζ resolves the non-determinism based on the history of a computation, inducing a probability measure over traces.
For each scheduler we get a fully probabilistic automaton, in which the probability of events (sets of traces) is defined in a standard way: construction of a σ-algebra (for dealing with infinity), whose basis is given by the finite traces and their probabilities.

Probabilities of finite traces
Let α be the history up to the current state s. The probability of observing a finite trace t⃗ starting from α, denoted Pr_ζ[α ▷ t⃗], is defined recursively as follows:

$$
\Pr_\zeta[\alpha \rhd \vec{t}\,] =
\begin{cases}
1 & \text{if } \vec{t} \text{ is empty,}\\
0 & \text{if } \vec{t} = a\,\vec{t}\,',\ \zeta(\alpha) = s \stackrel{b}{\Longrightarrow} \mu \text{ and } b \neq a,\\
\sum_{s_i} \mu(s_i)\,\Pr_\zeta[\alpha\, a\, s_i \rhd \vec{t}\,'] & \text{if } \vec{t} = a\,\vec{t}\,' \text{ and } \zeta(\alpha) = s \stackrel{a}{\Longrightarrow} \mu.
\end{cases}
$$
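The recursive definition above maps almost directly onto code. Below is a small sketch, assuming the Automaton representation from the previous sketch; a scheduler is modelled as a function from the history α (the alternating sequence of states and actions seen so far, ending in the current state) to the action label it chooses. The name trace_prob is illustrative, not from the paper.

```python
from typing import Callable, List

History = List[str]                      # e.g. ["s0", "a1", "s1"]: states and actions, ending in a state
Scheduler = Callable[[History], Action]  # chooses the next action label from the history


def trace_prob(aut: Automaton, alpha: History, trace: List[Action],
               zeta: Scheduler) -> float:
    """Pr_zeta[alpha |> trace], following the recursive definition on the slide."""
    if not trace:                        # the trace is empty
        return 1.0
    a, rest = trace[0], trace[1:]
    s = alpha[-1]                        # current state
    b = zeta(alpha)                      # label chosen by the scheduler for this history
    mu = aut.trans.get((s, b))
    if mu is None or b != a:             # scheduler chooses a different label (or none enabled)
        return 0.0
    # sum over successor states s_i of mu(s_i) * Pr_zeta[alpha a s_i |> rest]
    return sum(p * trace_prob(aut, alpha + [a, s_i], rest, zeta)
               for s_i, p in mu.items())
```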

10. An Example: A PIN-Checking System
[Figure: the transition trees of A(u1) and A(u2); from the start state each offers the guesses a1 and a2, which lead with probabilities 0.4/0.6 to states answering ok or no.]

Example: the scheduler executes the a1-branch.
Pr_ζ[A(u1) ▷ a1 ok] = 0.6     Pr_ζ[A(u2) ▷ a1 ok] = 0.4
Pr_ζ[A(u1) ▷ a1 no] = 0.4     Pr_ζ[A(u2) ▷ a1 no] = 0.6
Pr_ζ[A(u1) ▷ a2 ok] = 0       Pr_ζ[A(u2) ▷ a2 ok] = 0
Pr_ζ[A(u1) ▷ a2 no] = 0       Pr_ζ[A(u2) ▷ a2 no] = 0
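As a sanity check, the PIN-checking example can be instantiated with the sketches above. The encoding below is illustrative (the state names, the helper pin_automaton and the intermediate answer states are guesses, and the a2-branch is omitted since the chosen scheduler never exercises it); under the scheduler that always executes the a1-branch, the computed trace probabilities match the figures on the slide.

```python
def pin_automaton(p_ok: float) -> Automaton:
    """a1 leads to an ok-answering state with probability p_ok, otherwise to a no-answering state."""
    return Automaton(
        states={"r", "s_ok", "s_no", "done"},
        start="r",
        actions={"a1", "a2", "ok", "no"},
        trans={("r", "a1"):    {"s_ok": p_ok, "s_no": 1.0 - p_ok},
               ("s_ok", "ok"): {"done": 1.0},
               ("s_no", "no"): {"done": 1.0}})


system: ConcurrentSystem = {"u1": pin_automaton(0.6), "u2": pin_automaton(0.4)}


def zeta(alpha: History) -> Action:      # the admissible scheduler executing the a1-branch
    return {"r": "a1", "s_ok": "ok", "s_no": "no"}.get(alpha[-1], "a1")


for u, aut in system.items():
    for t in (["a1", "ok"], ["a1", "no"], ["a2", "ok"], ["a2", "no"]):
        print(f"Pr[A({u}) |> {' '.join(t)}] = {trace_prob(aut, [aut.start], t, zeta):.1f}")
# Pr[A(u1) |> a1 ok] = 0.6    Pr[A(u2) |> a1 ok] = 0.4
# Pr[A(u1) |> a1 no] = 0.4    Pr[A(u2) |> a1 no] = 0.6
# every a2-trace has probability 0.0 under this scheduler
```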

11. Outline
1 Introduction
  Concurrent Systems
  Differential Privacy
  The Verification Framework
2 Three Pseudometrics
  The Accumulative Bijection Pseudometric
  The Amortized Bijection Pseudometric
  A Multiplicative Variant of the Kantorovich Pseudometric
3 Comparison of the Three Pseudometrics

12. How To Quantify the Amount of Privacy?
Definition (Standard Definition of Differential Privacy)
A query mechanism A is ε-differentially private if for any two adjacent databases u1 and u2 (i.e. databases that differ in only one individual) and any property Z, the probability distributions of A(u1) and A(u2) differ on Z by at most a factor of e^ε, namely
Pr[A(u1) ∈ Z] ≤ e^ε · Pr[A(u2) ∈ Z].
The lower the value of ε, the better the privacy is protected.

13. How To Quantify the Amount of Privacy?
Definition (Standard Definition of Differential Privacy)
A query mechanism A is ε-differentially private if for any two adjacent databases u1 and u2 (i.e. databases that differ in only one individual) and any property Z, the probability distributions of A(u1) and A(u2) differ on Z by at most a factor of e^ε, namely
Pr[A(u1) ∈ Z] ≤ e^ε · Pr[A(u2) ∈ Z].
The lower the value of ε, the better the privacy is protected.

Some Merits of Differential Privacy
- A strong notion of privacy.
- Independence from side knowledge.
- Robustness to attacks based on combining various sources of information.
- Looser restrictions between non-adjacent secrets.
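For a mechanism with discrete outputs, the per-outcome inequality p_{A(u1)}(z) ≤ e^ε · p_{A(u2)}(z) implies the inequality for every property Z (sum it over the outcomes in Z), so the tightest ε for a given pair of adjacent databases is the largest absolute log-ratio over single outcomes. The standalone sketch below computes it; the function name and the example distributions are illustrative, not from the paper.

```python
import math
from typing import Dict


def tightest_epsilon(p1: Dict[str, float], p2: Dict[str, float]) -> float:
    """Smallest eps such that Pr[A(u1) in Z] <= e^eps * Pr[A(u2) in Z], and vice versa,
    holds for every event Z, given the two discrete output distributions p1 and p2."""
    eps = 0.0
    for z in set(p1) | set(p2):
        a, b = p1.get(z, 0.0), p2.get(z, 0.0)
        if a == 0.0 and b == 0.0:
            continue                    # outcome impossible under both databases
        if a == 0.0 or b == 0.0:
            return math.inf             # possible under one database only: no finite eps
        eps = max(eps, abs(math.log(a / b)))
    return eps


# two example output distributions over three outcomes
print(tightest_epsilon({"z0": 0.5, "z1": 0.3, "z2": 0.2},
                       {"z0": 0.4, "z1": 0.35, "z2": 0.25}))   # about 0.223
```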

14. Differential Privacy in the Context of Concurrent Systems
The scheduler can easily break many security and privacy properties, so we consider a restricted class of schedulers, called admissible schedulers: on related states, an admissible scheduler must choose the same transition label.

Definition (Differential Privacy in Our Setting)
A concurrent system A satisfies ε-differential privacy (DP) iff for any two adjacent secrets u, u′, all finite traces t⃗ and all admissible schedulers ζ:
Pr_ζ[A(u) ▷ t⃗] ≤ e^ε · Pr_ζ[A(u′) ▷ t⃗]
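For a single fixed scheduler, this condition can be checked exactly as in the previous sketch, with finite traces playing the role of outcomes. Using the PIN-checking trace probabilities from slide 10 together with the hypothetical tightest_epsilon above: the largest ratio is 0.6/0.4 = 1.5, so under the a1-scheduler the two adjacent secrets are distinguished by at most a factor of 1.5, i.e. ε = ln 1.5 ≈ 0.405 for that scheduler (the definition itself quantifies over all admissible schedulers and all adjacent pairs).

```python
# trace distributions of A(u1) and A(u2) under the scheduler that executes the a1-branch
traces_u1 = {"a1 ok": 0.6, "a1 no": 0.4, "a2 ok": 0.0, "a2 no": 0.0}
traces_u2 = {"a1 ok": 0.4, "a1 no": 0.6, "a2 ok": 0.0, "a2 no": 0.0}

print(tightest_epsilon(traces_u1, traces_u2))   # ln(1.5), about 0.405
```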
