SLIDE 1

Privacy & PETs

Simone Fischer-Hübner

SWITS PhD course, 2012 1st Session, 3rd May 2012, KTH

SLIDE 2

Overview

I. Privacy - Definition
II. EU Directives & Basic Privacy Principles
III. Privacy Issues (LBS, Social Networks, RFID...)
IV. Introduction to PETs, Terminology
V. Mix-nets

SLIDE 3

I. Definition

Warren & Brandeis 1890:

"The right to be let alone"

SLIDE 4

Definition - Alan Westin 1967

“Privacy is the claim of individuals, groups and institutions to determine for themselves, when, how and to what extent information about them is communicated to others”

SLIDE 5

Privacy Dimensions

• Informational self-determination
• Spatial privacy

SLIDE 6

II. EU Directives

EU Data Protection Directive 95/46/EC

Objectives:

• Protection of fundamental rights and freedoms of individuals
• Harmonisation of privacy legislation in Europe

Scope (Art. 3): applies to the processing of personal data wholly or partly by automatic means, and to the processing otherwise than by automatic means of personal data which form part of a filing system.

Personal data: any information relating to an identified or identifiable natural person ("data subject")

Does not apply to data processing for:

• defence, public/state security, criminal law enforcement
• purely private or household activity ("household exemption")

SLIDE 7

Basic Privacy Principles

implemented in EU Directive 95/46/EC

• Legitimisation by law, informed consent (Art. 7 EU Directive)
• Data minimisation and avoidance (Art. 6 I c, e): data must be adequate, relevant, not excessive & anonymised as soon as possible
• Purpose specification and purpose binding (Art. 6 I b): "non-sensitive" data do not exist!
SLIDE 8

Example of Purpose Misuse

• Lidl Video Monitoring Scandal

SLIDE 9

Basic Privacy Principles (II)

• No processing of "special categories of data" (Art. 8)
• Transparency, rights of data subjects:
  • to be informed (Art. 10)
  • to be notified, if data have not been obtained from the data subject (Art. 11)
  • of access to data (Art. 12 a)
  • of correction of incorrect data / erasure or blocking of illegally stored data (Art. 12 b)
  • to object to direct marketing (Art. 14)

SLIDE 10

Basic Privacy Principles (III)

• Requirement of security mechanisms (Art. 17)
• Sanctions (Art. 24)
• Restricted personal data transfer from the EU to third countries (Art. 25)

SLIDE 11

Basic Privacy Principles (IV)

• Supervision (Art. 28): supervisory authorities
  • monitor compliance
  • act upon complaints
  • are consulted when drawing up data protection regulations
  • regularly draw up reports
SLIDE 12

Privacy Principles in Practice

• The privacy policy is not directly accessible, and the referenced website did not actually exist!
• Purpose not well specified
• Is it necessary to publish photos to the whole world (instead of having restricted access for parents, students, etc.)?

SLIDE 13

EU Directive 2002/58/EC on privacy and electronic communications

• Confidentiality of communications (Art. 5):
  • No interception/surveillance without the data subject's consent
  • Protection against cookies, spyware, web bugs ("right to refuse")

SLIDE 14

EU Directive 2002/58/EC on privacy and electronic communications (cont.)

• Traffic data (Art. 6):
  • Must be erased or made anonymous upon completion of transmission
  • Processing for billing purposes is permissible
  • Processing for the purposes of value-added services/marketing only with the consent of the subscriber/user

SLIDE 15

EU Directive 2002/58/EC on privacy and electronic communications (cont.)

• Location data other than traffic data (Art. 9):
  • May only be processed when made anonymous, or with the informed consent of the user/subscriber
  • Where consent has been obtained, the user/subscriber must still have the possibility of temporarily refusing the processing of location data

Problem: location data within traffic data can also be very sensitive

SLIDE 16

EU Directive 2002/58/EC on privacy and electronic communications (cont.)

• Unsolicited communications (Art. 13): opt-in system for electronic mail used for direct marketing (so-called "spam")

Problem: the US CAN-SPAM Act of 2003 requires only an opt-out system, and most countries have no spam legislation at all

SLIDE 17

Data Retention according to EU Directives 2002/58/EC and 2006/24/EC

• Art. 15 of EU Directive 2002/58/EC allows member states to adopt laws on data retention for safeguarding security, defence and law enforcement
• Data Retention Directive 2006/24/EC requires telco companies to retain traffic and location data for 6-24 months

Problems/Questions:

• Appropriate?
• Threat to online privacy: traffic data contain mainly "fingerprints" of non-criminal users
• Criminals find ways "around" it
• Will anonymisation service providers be forced to collect more data than they would normally collect?

SLIDE 18

New e-Privacy Directive 2009/136/EC amending Directive 2002/58/EC

• Enacted on 18 Dec 2009, to be implemented by June 2011
• Main changes:
  • Privacy breach notification
  • Requirement to implement a security policy, adopt measures to restrict access to personal data, and protect against data breaches
  • Stricter spam legislation
  • Consent for the placement of cookies

SLIDE 19

Newly Proposed EU Data Protection Rules

(Data Protection Regulation proposed 25 January 2012)

• A single set of data protection rules, valid across the EU and also where data are processed abroad by companies active in the EU market. One DPA in charge.
• "Right to be forgotten"
• Right to "data portability"
• Easier exercising of data subject rights (electronically, in relation to all recipients)
• Explicitly given consent, more transparency of data handling, easy-to-understand policies
• Increased accountability, privacy breach notification, higher penalties (up to 2% of global annual turnover)
• Privacy impact assessment (PIA)
• Privacy by Design (PbD), Privacy by Default

SLIDE 20

III. Privacy Issues

• Global networks, cookies, web bugs, spyware, ...
• Location-based Services (LBS)
• Ambient Intelligence, RFID, ...
• Cloud Computing
• Social Networks
• Smart Grids
• Video Surveillance

SLIDE 21

LBS - Risks

• Unsolicited tracking of the user's position and movements
• Unsolicited profiling
• Disclosure of the user's current context
• Disclosure of social networks

Source: Lothar Fritsch & Rannenberg, GUF

SLIDE 22

Smart Grids

Picture source: Wikipedia

SLIDE 23

Smart Metering – Privacy Risks

• Each electrical appliance has its own fingerprint
• Provides information about when someone is at home, cooks, watches TV, takes a shower, etc.
• Allows real-time surveillance
• Of interest for burglars, insurance companies, law enforcement, ...

Source: Smart Metering & Privacy, Elias Leake Quinn, 2009

SLIDE 24

The RFID consumer privacy problem

[Figure: "Here's Mr. Jones in 2020...", showing the RFID-tagged items he carries: 1500 Euros in his wallet (serial numbers 597387, 389473, ...), a wig (model #4456, cheap polyester), 30 items of lingerie, Das Kapital and a Communist-party handbook, and a replacement hip (medical part #459382).]

Source: Ari Juels, RSA Laboratories

SLIDE 25

...and the tracking problem

[Figure: the same wig, now readable as serial #A817TS8.]

• Mr. Jones pays with a credit card; his RFID tags are now linked to his identity
• Mr. Jones attends a political rally; law enforcement scans his RFID tags
• Mr. Jones wins the Turing Award; he is physically tracked by paparazzi via RFID

SLIDE 26

Privacy Risks of Social Networks

• Intimate personal details about social contacts, personal life, etc.
• The Internet never forgets completely...
• Not only accessible by "friends"

SLIDE 27

Freddi Staur (ID fraudster)

SLIDE 28

Identity Theft – ”Face Rape”

SLIDE 29

Privacy Risks of Social Networks – Social Network Analysis

Social network analysis/profiling by:

• Employers
• Schools/Universities
• Tax authorities
• Law enforcement
• Insurance companies
• Hackers
• ...
SLIDE 30

Art. 29 Data Protection Working Party – Opinion 5/2009 on online social networking

Who is the data controller?

• SNS providers
• Users?
  • No: if the "household exemption" applies
  • Yes:
    • if the SNS is used beyond a purely personal/household activity (e.g., as a collaboration platform for a company)
    • when access to profile information extends beyond self-selected "friends" (e.g., access is given to all SNS members), unless exemptions apply for journalistic purposes

What are the obligations of data controllers?

• Appropriate technical and organisational security measures
  • SNS should offer privacy-friendly default settings
• Informed consent by the other individuals concerned
• Information to be provided by the SNS:
  • information about the SNS identity and purposes (Art. 10 EU Directive)
  • SNS users should be advised by the SNS to obtain informed consent before uploading information/pictures about others

SLIDE 31

IV. Introduction to Privacy-Enhancing Technologies (PETs)

• Law alone is not sufficient for protecting privacy in our Network Society
• PETs are needed for implementing the law
• PETs empower users to exercise their rights

SLIDE 32

Classifications of PETs

1. PETs for minimising/avoiding personal data (-> Art. 6 I c, e EU Directive 95/46/EC), providing anonymity, pseudonymity, unobservability, unlinkability:
  • At communication level: Mix-nets, Onion Routing, Tor, DC-nets, Crowds, ...
  • At application level: anonymous e-cash, private information retrieval, anonymous credentials, ...

2. PETs for the safeguarding of lawful processing (-> Art. 17 EU Directive 95/46/EC):
  • P3P, privacy policy languages
  • Encryption, ...

3. Combination of 1 & 2:
  • Privacy-enhancing identity management (PRIME, PrimeLife)
SLIDE 33

Definitions - Anonymity

• Anonymity: the state of being not identifiable within a set of subjects (e.g., the set of senders or recipients), the anonymity set

Source: Pfitzmann/Hansen

SLIDE 34

Definitions - Unobservability

• Unobservability ensures that a user may use a resource or service without others being able to observe that the resource or service is being used

Source: Pfitzmann/Hansen

SLIDE 35

Definitions - Unlinkability

• Unlinkability of two or more items (e.g., subjects, messages, events): within the system, from the attacker's perspective, these items are no more and no less related after the attacker's observation than they were before
• Unlinkability of sender and recipient (relationship anonymity): it is untraceable who is communicating with whom

SLIDE 36

Definitions - Pseudonymity

• Pseudonymity is the use of pseudonyms as IDs
• Pseudonymity allows providing both privacy protection and accountability

[Figure: pseudonym types ordered by decreasing linkability and increasing anonymity: person pseudonym, role pseudonym, relationship pseudonym, role-relationship pseudonym, transaction pseudonym.]

Source: Pfitzmann/Hansen

SLIDE 37

Definitions - Pseudonymity (cont.)

Source: Pfitzmann/Hansen

SLIDE 38

V. Anonymous Communication Technologies – Mix-nets

[Figure: Alice sends her messages to Bob through a single anonymising remailer.]

But now the remailer knows everything!

SLIDE 39

Mix-nets (Chaum, 1981)

[Figure: Alice sends a message to Bob through Mix 1, Mix 2 and Mix 3. She wraps it in layers: K1(A2, r1, K2(A3, r2, K3(Bob, r3, msg))), where Ki is the public key of Mixi, ri a random number, and Ai the address of Mixi. Each mix removes one layer; Mix 3 finally delivers msg to Bob.]
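To make the layered encryption concrete, here is a minimal structural sketch in Python. It is a toy model, not real cryptography: the Mix class and the onion_wrap helper are invented for illustration, and a "ciphertext" is simply a tuple tagged with the name of the mix that may open it.

```python
import os

class Mix:
    """Toy stand-in for a mix and its key pair (NOT real cryptography)."""
    def __init__(self, name):
        self.name = name

    def encrypt(self, payload):            # models E_Ki(payload)
        return ("enc", self.name, payload)

    def decrypt(self, ciphertext):         # only this mix can remove its layer
        tag, name, payload = ciphertext
        assert tag == "enc" and name == self.name, "wrong mix key"
        return payload

def onion_wrap(mixes, recipient_addr, msg):
    """Builds K1(A2, r1, K2(A3, r2, K3(Bob, r3, msg))) as on the slide."""
    next_addrs = [m.name for m in mixes[1:]] + [recipient_addr]
    layer = msg
    for mix, addr in reversed(list(zip(mixes, next_addrs))):
        r = os.urandom(4)                  # random number ri (see slides 44/45)
        layer = mix.encrypt((addr, r, layer))
    return layer

mixes = [Mix("Mix1"), Mix("Mix2"), Mix("Mix3")]
message = onion_wrap(mixes, "Bob", "msg")

for mix in mixes:                          # each mix peels one layer ...
    addr, r, message = mix.decrypt(message)  # ... and forwards to the inner address
print(addr, message)                       # -> Bob msg
```

Note how each mix only ever learns the previous and the next hop; only the last layer reveals the recipient's address.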

SLIDE 40

Functionality of a Mix Server (Mixi)

Input message Mi ->

1. Ignore repeated messages (checked against a message DB)
2. Buffer messages in a batch
3. Wait until sufficient messages from many senders have arrived
4. Recode *)
5. Reorder

-> Output message Mi+1 to Mixi+1

*) Recode: decrypt Mi = EKi[Ai+1, ri, Mi+1] with the private key of Mixi, discard the random number ri, and obtain the address Ai+1 and the encrypted message Mi+1.
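This loop can be sketched as follows, reusing the toy Mix class from the previous sketch; the batch threshold and data structures are illustrative assumptions, not taken from the slides.

```python
import random

class MixServer:
    """Toy mix-server loop: dedupe, batch, recode, reorder (steps 1-5 above)."""
    def __init__(self, mix, batch_size=100):
        self.mix = mix                  # toy Mix from the sketch above (holds the key)
        self.seen = set()               # message DB: ignore repeated messages
        self.batch = []                 # buffer messages in a batch
        self.batch_size = batch_size    # "sufficient messages from many senders?"

    def receive(self, ciphertext):
        key = repr(ciphertext)
        if key in self.seen:            # repeated message -> drop it
            return []
        self.seen.add(key)
        self.batch.append(ciphertext)
        if len(self.batch) < self.batch_size:
            return []                   # keep buffering until the batch is full
        out = []
        for c in self.batch:            # recode: strip one layer, discard ri
            addr, r, inner = self.mix.decrypt(c)
            out.append((addr, inner))   # Mi+1, to be sent to Ai+1
        random.shuffle(out)             # reorder before flushing the batch
        self.batch = []
        return out
```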
SLIDE 41

Functionality of a Mix Server (Mixi) (cont.)

Same pipeline as on the previous slide, highlighting step 1: ignoring repeated messages prevents replay attacks.

SLIDE 42

Functionality of a Mix Server (Mixi) (cont.)

Same pipeline, highlighting the batching and reordering steps: buffering messages into a batch and reordering them prevents timing correlations.

SLIDE 43

Functionality of a Mix Server (Mixi) (cont.)

Same pipeline, highlighting step 4: recoding prevents content correlations between input and output messages.

SLIDE 44

Why are random numbers needed?

If no random number ri is used: an observer sees message M leaving Mixi towards Mixi+1, whose address Ai+1 is known (Address(Mixi+1) = Ai+1). Using the public key Ki, the observer can compute EKi(M, Ai+1) himself and compare it ("= ?") with the messages that entered Mixi, thereby linking input and output.

SLIDE 45

Why are random numbers needed? (cont.)

With a random number ri inside each layer, the input to Mixi is EKi(M, Ai+1, ri). An observer who does not know ri cannot recompute this ciphertext from the output M, so input and output can no longer be linked this way.
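The attack, and why ri stops it, can be demonstrated in the toy model from the earlier sketches: the tagged tuples there are deterministic, which is exactly the weakness this slide describes.

```python
import os

mix = Mix("Mix_i")                         # toy Mix from the earlier sketch

# Without ri: "encryption" is deterministic, so an observer who sees the
# output (A_next, M) can recompute E_Ki(M, A_next) with the public key and
# match it against Mixi's input messages.
observed_input = mix.encrypt(("A_next", "M"))
observed_output = ("A_next", "M")
print(mix.encrypt(observed_output) == observed_input)   # True -> linked!

# With ri: the observer does not know the random number inside the layer,
# so the recomputed ciphertext never matches the observed input.
observed_input = mix.encrypt(("A_next", os.urandom(8), "M"))
guess = mix.encrypt(("A_next", os.urandom(8), "M"))
print(guess == observed_input)                          # False -> unlinkable
```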

SLIDE 46

Sender Anonymity with Mix-nets

Sender (Alice) chooses a mix sequence Mix1, ..., Mixn, Mixn+1, where Mixn+1 = recipient (Bob).

• Ai (i = 1..n+1): address of Mixi
• ki (i = 1..n+1): public key of Mixi
• zi: random bit strings
• M: message for the recipient
• Mi: message that Mixi will receive

The sender prepares her message as

Mn+1 = Ekn+1(M)
Mi = Eki(zi, Ai+1, Mi+1) for i = 1..n

and sends M1 to Mix1.

SLIDE 47

Sender Anonymity with Mix-nets (cont.)

Each Mixi decrypts Eki(zi, Ai+1, Mi+1), obtaining:

• Ai+1: address of the next mix
• Mi+1 = Eki+1(zi+1, Ai+2, Mi+2): the encoded message for Mixi+1
• zi: random string, to be discarded

and forwards Mi+1 to Mixi+1.

[Figure: Sender (Alice) sends Ek1(z1, A2, M2) to Mix1; via Mix2 and Mix3 the innermost layer Ekn+1(M) reaches the recipient (Bob).]

SLIDE 48

Recipient Anonymity with Mix-nets

Recipient Bob chooses a mix sequence Mix1, ..., Mixm, with Mix0 = sender Alice, and creates an anonymous return address RA:

Rm+1 = e
Rj = Ekj(cj, Aj+1, Rj+1) for j = 1..m
RA = (c0, A1, R1)

• e: label of the return address
• cj: symmetric key, used by Mixj to encode the message on the return path
• Aj (j = 0..m): address of Mixj
• kj (j = 1..m): public key of Mixj
• zj: random bit strings

Bob sends RA anonymously to sender Alice (using sender anonymity as above):

Ekm(zm, Am-1, Ekm-1(... Ek1(z1, A0, RA) ...))

SLIDE 49

Recipient Anonymity with Mix-nets (cont.)

Alice has received the anonymous return address RA = (c0, A1, R1) and replies without knowing recipient Bob: she sends c0(M), R1 to Mix1.

Each Mixj receives cj-1(...c0(M)...), Rj, decrypts Rj = Ekj(cj, Aj+1, Rj+1) to obtain (cj, Aj+1, Rj+1), and forwards cj(cj-1(...c0(M)...)), Rj+1 to Mixj+1.

Recipient Bob finally receives cm(cm-1(...c0(M)...)), e; the label e indicates to Bob which keys c0, ..., cm he has to use to decrypt M.
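A sketch of the return-address construction and the reply path in the same toy model (Mix as in the earlier sketches; the symmetric wrap cj(...) is modelled as a tagged tuple, and all helper names are assumptions for illustration):

```python
import os

def sym(c, payload):                        # models cj(...), the symmetric wrap
    return ("sym", c, payload)

def make_return_address(mixes, bob_addr, label):
    """R_{m+1} = e;  Rj = E_kj(cj, A_{j+1}, R_{j+1});  RA = (c0, A1, R1)."""
    m = len(mixes)
    c = [os.urandom(4) for _ in range(m + 1)]                 # c0 .. cm
    next_addr = [mix.name for mix in mixes[1:]] + [bob_addr]  # A2..Am, then Bob
    R = label                                                 # R_{m+1} = e
    for j in range(m, 0, -1):
        R = mixes[j - 1].encrypt((c[j], next_addr[j - 1], R))
    return c, (c[0], mixes[0].name, R)       # Bob keeps the keys; Alice gets RA

mixes = [Mix("Mix1"), Mix("Mix2"), Mix("Mix3")]
keys, (c0, A1, R1) = make_return_address(mixes, "Bob", label="e")

# Alice replies without knowing Bob: she sends (c0(M), R1) towards A1.
msg, R = sym(c0, "reply"), R1
for mix in mixes:                           # each Mixj: open Rj -> (cj, A_{j+1}, R_{j+1}),
    c_j, addr, R = mix.decrypt(R)           # wrap the reply once more with cj,
    msg = sym(c_j, msg)                     # and forward to the next address
print(addr, R)                              # -> Bob e  (e tells Bob which keys to unwrap with)
```

After the loop, msg is cm(cm-1(...c0("reply")...)), exactly the nesting the slide describes; Bob uses the label e to look up c0, ..., cm and unwrap it.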

SLIDE 50

Two-Way Anonymous Conversation

SLIDE 51

Protection Properties & Attacker Model for Mix-nets

Protection properties:

• Sender anonymity against recipients
• Recipient anonymity against senders
• Unlinkability of sender and recipient

The attacker may:

• Observe all communication lines
• Send his own messages
• Delay messages
• Operate mix servers (all but one...)

The attacker cannot:

• Break cryptographic operations
• Attack the user's personal machine

SLIDE 52

Questions?

SLIDE 53

Backup Slides

(in case there is time left)

SLIDE 54

Length-preserving Coding

(for preventing message tracing by decreasing sizes)

Messages are sent through the mix sequence Mix1, ..., Mixm. Each message has a fixed length of b blocks.

Creation of the return address ([ ] = block limits):

Rm+1 = [e]
Rj = [kj(cj, Aj+1)], cj(Rj+1) for j = 1..m

e: label; ci: symmetric keys; ki: public keys; di: private keys of Mixi

Each Mixj decrypts the first block kj(cj, Aj+1) -> cj, Aj+1, deletes the first block, encrypts the rest of Mj with cj, inserts a random block Zj before the message blocks, and forwards Mj+1 to Mixj+1.

Figure according to Pfitzmann
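A minimal sketch of the length-preserving idea only, under loud simplifications: the re-encryption of the remaining blocks with cj is omitted, the random block is appended at the end rather than inserted before the message blocks as on the slide, and the block size, names and toy Mix class (from the earlier sketches) are illustrative assumptions.

```python
import os

B = 16                                      # block size in bytes (assumption)

def recode_length_preserving(mix, blocks):
    """One hop: pop the one-block header, keep the message b blocks long."""
    c_j, next_addr = mix.decrypt(blocks[0]) # header block: kj(cj, Aj+1)
    z_j = os.urandom(B)                     # random block Zj replaces the header
    rest = blocks[1:] + [z_j]               # (re-encryption with cj omitted here)
    return next_addr, rest                  # still b blocks for the next hop
```

The point of the padding block is that an observer cannot trace a message across hops by watching its length shrink as headers are stripped.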

SLIDE 55

Length-preserving Coding providing Sender Anonymity

The recipient does not know the symmetric keys c1, ..., cm -> the sender has to encrypt the message with all ci and to create R1.

The sender creates the message H1, MC1 (MC: message content) with:

H1 = R1
MC1 = c1(c2(... (cm(km+1(MC))) ...))

km+1: public key of the recipient. Each Mixi decrypts the message blocks with ci.

SLIDE 56

Length-preserving Coding providing Recipient Anonymity

The sender does not know the symmetric keys c1, ..., cm. The sender receives RA = (c0, A1, R1), encrypts MC with c0, and thus creates H1, MC1 with:

H1 = R1
MC1 = c0(MC)

Each Mixi encrypts the message blocks with ci.