SLIDE 1

Engineering privacy by design

SLIDE 2

Privacy by Design – Let's have it!

https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf

Information and Privacy Commissioner of Ontario

SLIDE 3

Privacy by Design – Let's have it!

https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf
http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679&from=EN

Information and Privacy Commissioner of Ontario

Article 25, European General Data Protection Regulation: “the controller shall [...] implement appropriate technical and organisational measures […] which are designed to implement data-protection principles [...] in order to meet the requirements of this Regulation and protect the rights of data subjects.”


SLIDE 6

This talk: Engineering Privacy by Design

PART I: Reasoning about Privacy when designing systems


SLIDE 8

PRIVACY


SLIDE 13

Engineering Privacy by Design 1.0 — The key is “data minimization”

Two case studies:

➢ anonymous e-petitions: no identity attached to petitions
➢ privacy-preserving road tolling: no fine-grained data sent to the server

but it’s not “data” that is minimized (in the system as a whole); data can be:

➢ kept in user devices
➢ sent encrypted to a server (only the client has the key)
➢ distributed over multiple servers: only the user, or colluding servers, can recover the data

“data minimization” is a bad metaphor!!!

Seda Gurses, Carmela Troncoso, Claudia Diaz. Engineering Privacy by Design. Computers, Privacy & Data Protection. 2011

SLIDE 14

Unpacking “Data Minimization”: Privacy by Design Strategies

Overarching goal: minimizing privacy risks and trust assumptions placed on other entities

Seda Gurses, Carmela Troncoso, Claudia Diaz. Engineering Privacy by Design Reloaded. Amsterdam Privacy Conference. 2015

SLIDE 22

Case study: Electronic Toll Pricing

Motivation: European Electronic Toll Service (EETS) — toll collection on European roads through On-Board Equipment. Two approaches: satellite technology / DSRC.

Starting assumptions:
1) Well-defined functionality: charge depending on driving
2) Security, privacy & service-integrity requirements: users’ location should be private; no cheating clients
3) Initial reference system

Commission Decision of 6 October 2009 on the definition of the European Electronic Toll Service and its technical elements
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32009D0750

SLIDE 26

Case study: Electronic Toll Pricing

Activity 1: Classify entities in domains
User domain: components under the control of the user, e.g., user devices
Service domain: components outside the control of the user, e.g., backend system at the provider

Activity 2: Identify necessary data for providing the service
Location data – compute the bill
Billing data – charge the user
Personal data – send the bill
Payment data – perform the payment

SLIDE 31

Case study: Electronic Toll Pricing

The service must be trusted to keep location data private → risk of a privacy breach

SLIDE 34

Case study: Electronic Toll Pricing

Location is not needed, only the amount to bill!

Service integrity?

SLIDE 40

Privacy-Preserving Electronic Toll Pricing

[Diagram: location data stays on the On-Board Unit; only billing data flows to the provider and the toll authority]

Homomorphic commitments to the billing data + ZK proofs that prices come from a correct policy
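
To make the commitment arithmetic concrete, here is a minimal sketch, assuming toy Pedersen commitments over a demo-sized prime-order group (the parameters p, q, g, h and the fee values are invented for illustration; a real deployment uses large groups and adds the zero-knowledge proofs mentioned above to show each committed fee follows the pricing policy):

```python
import random

# Toy Pedersen commitment parameters: p = 2q + 1, both prime.
# Demo-sized only! Real systems use groups of ~256-bit order.
p, q = 167, 83
g, h = 4, 9          # generators of the order-q subgroup; in practice
                     # h is chosen so that log_g(h) is unknown

def commit(m, r):
    """Pedersen commitment C = g^m * h^r mod p (hiding and binding)."""
    return (pow(g, m, p) * pow(h, r, p)) % p

# On-board unit: commit to each road-segment fee; raw locations stay local.
fees = [3, 5, 2]                                  # per-segment prices
blinds = [random.randrange(q) for _ in fees]      # random blinding factors
cs = [commit(m, r) for m, r in zip(fees, blinds)]

# At the end of the billing period the unit reveals only the total
# and the combined blinding factor.
total = sum(fees) % q
r_total = sum(blinds) % q

# Toll authority: multiplying the commitments yields a commitment to the
# sum of the fees (the homomorphic property), so the claimed bill can be
# checked without ever learning which segments were driven.
product = 1
for c in cs:
    product = (product * c) % p
assert product == commit(total, r_total)
print("bill verified:", sum(fees))
```

The homomorphism — the product of the commitments opens to the sum of the committed fees — is what lets the authority verify the monthly total while the fine-grained location record never leaves the vehicle.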

SLIDE 48

Case study: Electronic Toll Pricing

Location is not needed, only the amount to bill!

Service integrity

SLIDE 49

Case study: Electronic Toll Pricing

Privacy ENABLING Technologies

SLIDE 51

A change in our way of thinking...

The usual approach: “I want all data” → data I can collect → data-protection compliance

The PbD approach: data needed for the purpose → maintain service integrity → data I will finally collect

SLIDE 55

Other case studies: Privacy-preserving Biometrics

The usual approach: store a template t(sample) at enrollment, then check t(sample) =? t(fresh sample)

Problems:
Templates linkable across databases
Reveal the clear biometric
Not revocable
Many times not externalizable
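
A minimal sketch of the linkability and revocability problems, assuming (unrealistically, for illustration only) exact-match templates — real biometrics are noisy and need fuzzy schemes. The service names and keys are hypothetical; the keyed per-service transform is in the spirit of cancelable biometrics:

```python
import hmac, hashlib

template = bytes([18, 52, 86, 120])   # toy biometric template

# Naive approach: both services derive their reference from the raw
# template, so the stored values match and the databases can be joined.
db_a = hashlib.sha256(template).hexdigest()
db_b = hashlib.sha256(template).hexdigest()
print("linkable across databases:", db_a == db_b)        # True

# Keyed per-service transform: same biometric, unlinkable references,
# and a leaked reference is revoked by rotating the service's key.
key_a, key_b = b"service-A-key", b"service-B-key"
ref_a = hmac.new(key_a, template, hashlib.sha256).hexdigest()
ref_b = hmac.new(key_b, template, hashlib.sha256).hexdigest()
print("linkable across databases:", ref_a == ref_b)      # False

# Verification at service A: recompute the keyed transform and compare.
fresh = hmac.new(key_a, template, hashlib.sha256).hexdigest()
print("match at service A:", hmac.compare_digest(ref_a, fresh))
```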

SLIDE 59

Other case studies: Privacy-preserving Passenger Registry

The usual approach: [diagram: is this passenger in the list?]

Surveillance on all passengers


SLIDE 62

PART I: Reasoning about Privacy when designing systems
PART II: Evaluating Privacy in Privacy-Preserving systems

PRIVACY-PRESERVING SOLUTIONS: CRYPTO-BASED VS ANONYMIZATION/OBFUSCATION

SLIDE 63

PRIVACY-PRESERVING SOLUTIONS: CRYPTO-BASED VS ANONYMIZATION/OBFUSCATION

Crypto-based solutions have well-established design and evaluation methods:
– Private searches
– Private billing
– Private comparison
– Private sharing
– Private statistics computation
– Private electronic cash
– Private genomic computations
– ...

...but they are expensive and require expertise.

SLIDE 66

PRIVACY-PRESERVING SOLUTIONS: CRYPTO-BASED VS ANONYMIZATION/OBFUSCATION

Anonymization/obfuscation: cheap but... DIFFICULT TO DESIGN / EVALUATE

SLIDE 74

We need technical objectives – PRIVACY GOALS

Pseudonymity: a pseudonym as ID (still personal data!)
Anonymity: decoupling identity and action
Unlinkability: hiding the link between actions
Unobservability: hiding the very existence of actions
Plausible deniability: not possible to prove a link between identity and action
“Obfuscation”: not possible to recover a real item from a noisy item

Why is it so difficult to evaluate them?

SLIDE 75

Let's take one example: Anonymity

Art. 29 WP’s opinion on anonymization techniques — three criteria to decide a dataset is non-anonymous (merely pseudonymous):
1) is it still possible to single out an individual?
2) is it still possible to link two records within a dataset (or between two datasets)?
3) can information be inferred concerning an individual?

http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp216_en.pdf

SLIDE 76

Let's take one example: Anonymity

1) Is it still possible to single out an individual? — location

“the median size of the individual's anonymity set in the U.S. working population is 1, 21 and 34,980, for locations known at the granularity of a census block, census tract and county respectively”

SLIDE 77

Let's take one example: Anonymity

1) Is it still possible to single out an individual? — location

“if the location of an individual is specified hourly, and with a spatial resolution equal to that given by the carrier’s antennas, four spatio-temporal points are enough to uniquely identify 95% of the individuals.” [15 months, 1.5M people]

SLIDE 79

Let's take one example: Anonymity

1) Is it still possible to single out an individual? — location, web browser

“It was found that 87% (216 million of 248 million) of the population in the United States had reported characteristics that likely made them unique based only on {5-digit ZIP, gender, date of birth}”
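
A minimal sketch of this “singling out” test on a toy dataset (the records are invented for illustration; studies like the one quoted run the same grouping over census-scale data):

```python
from collections import Counter

# Toy records of (5-digit ZIP, gender, date of birth) quasi-identifiers.
records = [
    ("53715", "F", "1971-07-15"),
    ("53715", "M", "1971-07-15"),
    ("53703", "F", "1960-01-02"),
    ("53703", "F", "1960-01-02"),   # two people share this combination
    ("53706", "M", "1984-11-30"),
]

# The anonymity set of a record is everyone sharing its quasi-identifier
# values; a set of size 1 means the individual is singled out.
sizes = Counter(records)
unique = sum(1 for r in records if sizes[r] == 1)
print(f"{unique}/{len(records)} records unique on (ZIP, gender, DOB)")
for quasi_id, k in sorted(sizes.items()):
    print(quasi_id, "-> anonymity set size", k)
```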

SLIDE 80

Let's take one example: Anonymity

2) Link two records within a dataset (or datasets) — social graphs

Take two graphs representing social networks and map the nodes to each other based on the graph structure alone — no usernames, no nothing (Netflix Prize, Kaggle contest). There are techniques that automate graph de-anonymization based on machine learning — and they do not even need to know the anonymization algorithm!

SLIDE 84

Let's take one example: Anonymity

3) Infer information about an individual

“Based on GPS tracks, we identify the latitude and longitude of their homes. From these locations, we used a free Web service to do a reverse ‘white pages’ lookup, which takes a latitude and longitude coordinate as input and gives an address and name.” [172 individuals]

SLIDE 85

Let's take one example: Anonymity

3) Infer information about an individual

“We investigate the subtle cues to user identity that may be exploited in attacks on the privacy of users in web search query logs. We study the application of simple classifiers to map a sequence of queries into the gender, age, and location of the user issuing the queries.”

SLIDE 88

Let's take one example: Anonymity

Data anonymization is a weak privacy mechanism — only to be used when other (contractual, organizational) protections are also applied.
Impossible to sanitize without severely damaging usefulness.
Removing PII is not enough: any aspect could lead to re-identification.
Magical thinking! This cannot happen in general!

Risk of de-anonymization? → Probabilistic analysis: Pr[identity → action | observation]

SLIDE 89

Privacy evaluation is a probabilistic analysis: systematic reasoning to evaluate a mechanism

Anonymity – Pr[identity → action | observation]
Unlinkability – Pr[action A ↔ action B | observation]
Obfuscation – Pr[real action | observed noisy action]
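
As a minimal sketch of such an evaluation, consider a toy anonymous channel (the user set, priors, and reporting probabilities are invented for the demo) and an adversary that applies Bayes' rule to compute Pr[identity → action | observation]:

```python
# A Bayesian adversary evaluating a toy anonymity mechanism: the channel
# reports the true sender with prob. 0.6, and a uniformly random other
# user otherwise (a simple obfuscation mechanism, invented for the demo).

priors = {"alice": 0.5, "bob": 0.3, "carol": 0.2}   # who tends to send

def likelihood(observed, true_sender):
    """Pr[channel outputs `observed` | `true_sender` sent the message]."""
    if observed == true_sender:
        return 0.6
    return 0.4 / (len(priors) - 1)

def posterior(observed):
    """Bayes' rule: Pr[sender | observation], the adversary's belief."""
    joint = {u: priors[u] * likelihood(observed, u) for u in priors}
    z = sum(joint.values())
    return {u: p / z for u, p in joint.items()}

print(posterior("alice"))   # {'alice': 0.75, 'bob': 0.15, 'carol': 0.10}
# The flatter this posterior, the better the anonymity the mechanism
# provides against this adversary.
```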

SLIDE 98

“Inversion”? What do you mean?

1) Analytical mechanism inversion: given the description of the system, develop the mathematical expressions that effectively invert the system.
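
As a sketch of what inverting a mechanism analytically can look like, take randomized response as the mechanism (an illustrative choice, not one of the systems above): its input–output relation is a simple equation, and solving that equation for the hidden quantity is the inversion:

```python
import random

p = 0.75            # prob. of answering truthfully (mechanism parameter)
true_rate = 0.30    # hidden fraction with the sensitive attribute
n = 100_000

def mechanism(bit):
    """Randomized response: output the true bit with probability p."""
    return bit if random.random() < p else 1 - bit

reports = [mechanism(random.random() < true_rate) for _ in range(n)]
observed = sum(reports) / n

# The mechanism's equation: observed = p*rate + (1-p)*(1-rate).
# Solving for `rate` is the analytical inversion of the system.
estimate = (observed - (1 - p)) / (2 * p - 1)
print(f"observed={observed:.3f}, inverted estimate={estimate:.3f}")
```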

SLIDE 105

Take-aways

Privacy by design rocks! ...but realizing it is non-trivial.

PART I: Reasoning about Privacy when designing systems → explicit privacy engineering activities
PART II: Evaluating Privacy in Privacy-Preserving systems → systematic reasoning for privacy evaluation

SLIDE 109

thanks!

Any questions?

More about privacy: https://www.petsymposium.org/ http://www.degruyter.com/view/j/popets

SLIDE 116

What do we want the data for...? Statistics!

“Wouldn't it be nice if I could send complex queries to a database to extract statistics, and it returned results that are informative, but leak very little information about any individual?”

Query-based privacy: Differential Privacy!

Why is that possible (while anonymization was impossible)?
The final result depends on multiple personal records.
However, it does not depend much on any particular one (low sensitivity).
Therefore adding a little bit of noise to the result suffices to hide any record's contribution.
For full anonymization, one would instead need to add a lot of noise to all the entries.
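
A minimal sketch of this idea with the standard Laplace mechanism of differential privacy (the database and the choice of epsilon are invented for the demo): a counting query has sensitivity 1, so Laplace noise with scale 1/ε hides any single record's contribution:

```python
import random

ages = [34, 29, 41, 58, 23, 61, 45, 37]     # toy "database"

def laplace_noise(scale):
    """Laplace(0, scale) noise: difference of two i.i.d. exponentials."""
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(predicate, epsilon):
    """Noisy counting query. A count has sensitivity 1 (one person can
    change it by at most 1), so Laplace noise with scale 1/epsilon makes
    the answer epsilon-differentially private."""
    true_count = sum(1 for a in ages if predicate(a))
    return true_count + laplace_noise(1 / epsilon)

# Informative in aggregate, yet changing any single record shifts the
# output distribution by at most a factor of e^epsilon.
print(dp_count(lambda a: a > 40, epsilon=0.5))
```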
