

SLIDE 1

Marc Langheinrich: Privacy in Pervasive Computing. Tutorial at Pervasive 2008, Sydney, Australia, May 22, 2008

PRIVACY IN PERVASIVE COMPUTING

Marc Langheinrich

ETH Zurich, Switzerland

Approaches to Ubicomp Privacy

Disappearing Computer Troubadour Project (10/2002 ‐ 05/2003)

Promote Absence of Protection as User Empowerment

“It’s maybe about letting them find their own ways of cheating“

Make it Someone Else’s Problem

“For [my colleague] it is more appropriate to think about [security and privacy] issues. It’s not really the case in my case“

Insist that “Good Security“ will Fix It

“All you need is really good firewalls“

Conclude it is Incompatible with Ubiquitous Computing

“I think you can’t think of privacy... it’s impossible, because if I do it, I have troubles with finding [a] Ubicomp future“

Marc Langheinrich: The DC‐Privacy Troubadour – Assessing Privacy Implications of DC‐Projects. Designing for Privacy Workshop. DC Tales Conference, Santorini, Greece, June 2003.

SLIDE 2

Today’s Menu

Understanding Privacy: Definitions

  • 1. Public policy
  • 2. Laws and regulations
  • 3. Interpersonal aspects

Technical Approaches: Challenges

  • 1. Location privacy
  • 2. RFID privacy
  • 3. Smart environments

UNDERSTANDING PRIVACY

SLIDE 3

What Is Privacy?

“The right to be let alone.“

Warren and Brandeis, 1890 (Harvard Law Review)

“Numerous mechanical devices threaten to make good the prediction that ’what is whispered in the closet shall be proclaimed from the housetops’“

Louis D. Brandeis, 1856‐1941

Technological Revolution, 1888

SLIDE 4

Information Privacy

“The desire of people to choose freely under what circumstances and to what extent they will expose themselves, their attitude and their behavior to others.“

Alan Westin, 1967

Privacy And Freedom, Atheneum


1. PRIVACY AS PUBLIC POLICY

SLIDE 5

Why Privacy?

“A free and democratic society requires respect for the autonomy of individuals, and limits on the power of both state and private organizations to intrude on that autonomy… privacy is a key value which underpins human dignity and other key values such as freedom of association and freedom of speech…“

Preamble To Australian Privacy Charter, 1994

“All this secrecy is making life harder, more expensive, dangerous and less serendipitous“

Peter Cochrane, Former Head Of BT Research

“You have no privacy anyway, get over it“

Scott McNealy, CEO Sun Microsystems, 1999


Informational Self‐Determination

“Informationelle Selbstbestimmung“

“If one cannot with sufficient surety be aware of the personal information about him that is known in certain part of his social environment, . . . [he] can be seriously inhibited in his freedom of self‐determined planning and deciding. A society in which the individual citizen would not be able to find out who knows what when about them, would not be reconcilable with the right of self‐determination over personal data. Those who are unsure if differing attitudes and actions are ubiquitously noted and permanently stored, processed, or distributed, will try not to stand out with their behavior. . . . This would not only limit the chances for individual development, but also affect public welfare, since self‐determination is an essential requirement for a democratic society that is built on the participatory powers of its citizens.“

German Federal Constitutional Court (Census Decision ’83)


SLIDE 6

Informational Self‐Determination

“Informationelle Selbstbestimmung“

“The problem is the possibility of technology taking on a life of its own, so that the actuality and inevitability of technology creates a dictatorship. Not a dictatorship of people over people with the help of technology, but a dictatorship of technology over people.“

Ernst Benda (1983), Chief Justice of the Federal Constitutional Court 1971‐1983 (*1925)


2. PRIVACY LAW PRIMER

SLIDE 7

Privacy Law History

Justices Of The Peace Act (England, 1361)

Sentences for Eavesdropping and Peeping Toms

„The poorest man may in his cottage bid defiance to all the force of the crown. It may be frail; its roof may shake; … – but the king of England cannot enter; all his forces dare not cross the threshold of the ruined tenement“

William Pitt the Elder (1708‐1778)

First Modern Privacy Law in the German State Hesse, 1970


Fair Information Principles (FIP)

Drawn up by the OECD, 1980

“Organisation for Economic Co‐operation and Development“
Voluntary guidelines for member states
Goal: Ease transborder flow of goods (and information!)

Five Principles (simplified): core principles of modern privacy laws world‐wide

  • 1. Openness
  • 2. Data access and control
  • 3. Data security
  • 4. Collection Limitation
  • 5. Data subject’s consent


SLIDE 8

Laws and Regulations

Privacy laws and regulations vary widely throughout the world

US has mostly sector‐specific laws, with relatively minimal protections

Self‐Regulation favored over comprehensive Privacy Laws
Fear that regulation hinders e‐commerce

Europe has long favored strong privacy laws

Often single framework for both public & private sector
Privacy commissions in each country (some countries have national and state commissions)


EU Privacy Law

Data Protection Directive 1995/46/EC

Sets a Benchmark For National Law For Processing Personal Information In Electronic And Manual Files
Follows OECD Fair Information Practices
Facilitates Data‐flow Between Member States And Restricts Export Of Personal Data To „Unsafe“ Non‐EU Countries

“E‐Privacy“ Directive 2002/58/EC (“amends“ 95/46/EC)

Provisions for “public electronic communications services“

Data Retention Directive 2006/24/EC

Orders storage of “traffic data“ for law enforcement


SLIDE 9

Safe Harbor

How to Make US a “Safe“ Country (in terms of the Directive)

US companies self‐certify adherence to requirements

  • Dept. of Commerce maintains list (1429 as of 04/08)

http://www.export.gov/safeharbor/

Signatories must provide
notice of data collected, purposes, and recipients
choice of opt‐out of 3rd‐party transfers, opt‐in for sensitive data
access rights to delete or edit inaccurate information
security for storage of collected data
enforcement mechanisms for individual complaints

Approved July 26, 2000 by EU
reserves right to renegotiate if remedies for EU citizens prove to be inadequate


3. INTERPERSONAL PRIVACY

SLIDE 10

Privacy Invasions

When Do We Feel that Our Privacy Has Been Violated?

Perceived privacy violations due to crossing of “privacy borders“

Privacy Boundaries

  • 1. Natural
  • 2. Social
  • 3. Spatial / temporal
  • 4. Transitory

Gary T. Marx MIT


Privacy Borders (Marx)

Natural

Physical limitations (doors, sealed letters)

Social

Group confidentiality (doctors, colleagues)

Spatial / Temporal

Family vs. work, adolescence vs. midlife

Transitory

Fleeting moments, unreflected utterances


SLIDE 11

Privacy Regulation Theory

Privacy as Accessibility Optimization: Inputs and Outputs

Spectrum: “Openness“ / “Closedness“
Contrasts with privacy as withdrawal (“to be let alone“)
Privacy not monotonic: “More“ is not always “better“

Dynamic Boundary Negotiation Process

Neither static nor rule‐based
Requires fine‐grained coordination of action & disclosure

Focus on public spaces, mediated by spatial environment

Irwin Altman University of Utah


Managing Privacy Boundaries

Use Altman’s Theory for Networked Environments

Very different from real‐world public spaces!

Disclosure Boundary: Private and Public

We sometimes need publicity to limit accessibility

Identity Boundary: Self and Other

Acting according to status, group, affiliation

Leysia Palen, Univ. of Colorado

Disclosure according to recipient’s identity & role
Disclosure as means to differentiate or associate

Temporality Boundary: Past, Present, and Future

Effects of temporal sequence of disclosures

Paul Dourish UC Irvine


SLIDE 12

Today’s Menu

Understanding Privacy: Definitions

  • 1. Public policy
  • 2. Laws and regulations
  • 3. Interpersonal aspects

Technical Approaches: Challenges

  • 1. Location privacy
  • 2. RFID privacy
  • 3. Smart environments

TECHNICAL APPROACHES

SLIDE 13

The Information Society

“More transactions will tend to be recorded, the records will tend to be kept longer, information will tend to be given to more people, more data will tend to be transmitted over public communication channels, fewer people will know what is happening to the data, the data will tend to be more easily accessible, and data can be manipulated, combined, correlated, associated and analysed to yield information which could not have been obtained without the use of computers“

Paul Sieghart (Portrait by Paul Benny)

Paul Sieghart: Privacy and Computers. London, Latimer, 1976, pp. 75‐76


Ubicomp Privacy Implications

Data Collection (“more transactions“)
Scale (everywhere, anytime)
Manner (inconspicuous, invisible)
Motivation (context!)

Data Types (“not without computers“)
Observational instead of factual data

Data Access (“more easily accessible“)

“The Internet of Things“


SLIDE 14

FIP Challenges in Ubicomp

How to inform subjects about data collections?

Unintrusive but noticeable

How to provide access to stored data?

Who has it? How much of this is “my data“?

How to ensure confidentiality and authenticity?

Without alienating user!

How to minimize data collection?

What part of the “context“ is relevant?

How to obtain consent from data subjects?

Missing UIs? Do people understand implications?


Border Crossings in Ubicomp

Smart appliances (natural borders)

“Spy“ on you in your own home

Family intercom (social borders)

Grandma knows when you’re home

Consumer profiles (temporal borders)

Span time & space

“Memory amplifier“ (transitory borders)

Records careless utterances


SLIDE 15

1. LOCATION PRIVACY

Location Privacy

“… the ability to prevent other parties from learning one’s current or past location.“ (Beresford and Stajano, 2003)

Useful Definition?! Think Altman!

Why Share Your Location?
By‐product of positioning technology (e.g., cell towers)
Required to use service (recommendations, toll roads, ...)
Let others (friends, family) know where I am

Why NOT to Share Your Location?

Current & past location reveal activities, interests, identity


slide-16
SLIDE 16

Marc Langheinrich: Privacy in Pervasive Computing May 22, 2008 Tutorial at Pervasive 2008, Sydney, Australia 16

Location Privacy Technology

Many Proposals

Laws/regulations and audits (enterprise privacy)
Anonymization (“k‐anonymity“)
Obfuscation
Rule‐based access control
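The “k‐anonymity“ idea can be illustrated with a short sketch: a set of location records is k‐anonymous when every combination of quasi‐identifiers (zip code, age band, ...) is shared by at least k records, so no one is singled out by those attributes alone. A minimal Python illustration with made‐up data and function names (not from the tutorial):

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True iff every combination of quasi-identifier values is shared
    by at least k records, so nobody stands out by those attributes."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(n >= k for n in groups.values())

reports = [
    {"zip": "8092", "age": "20-29", "place": "tram stop"},
    {"zip": "8092", "age": "20-29", "place": "library"},
    {"zip": "8001", "age": "30-39", "place": "cafe"},
]
print(is_k_anonymous(reports, ["zip", "age"], k=2))  # False: one record is unique
```

Real anonymizers then generalize or suppress values (e.g., widening the zip code) until the check passes.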

Privacy Model?
Assumption: Less location disclosure means more privacy

(Krumm, 2008) Provides Overview of State‐of‐the‐Art

John Krumm, Microsoft Research


Location Obfuscation

Adding noise, perturbation, or dummy traffic to location data

Protects against attackers, but degrades service use
(Krumm, 2007) showed that LOTS of obfuscation is needed
Typically combined with rules to selectively adjust accuracy
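A minimal sketch of the noise idea, assuming simple Gaussian perturbation of a latitude/longitude fix (illustrative only; the function name and parameters are invented, and real obfuscation schemes are more involved):

```python
import math
import random

def obfuscate(lat, lon, sigma_m=200.0):
    """Perturb a GPS fix with Gaussian noise of roughly sigma_m metres.
    One degree of latitude is about 111,320 m; a degree of longitude
    shrinks with the cosine of the latitude."""
    m_per_deg = 111_320.0
    noisy_lat = lat + random.gauss(0.0, sigma_m) / m_per_deg
    noisy_lon = lon + random.gauss(0.0, sigma_m) / (m_per_deg * math.cos(math.radians(lat)))
    return noisy_lat, noisy_lon

random.seed(42)  # deterministic demo
print(obfuscate(47.3769, 8.5417))  # a point a few hundred metres off the original fix
```

Krumm's result above suggests sigma would have to be far larger than this before inference attacks on home or identity actually fail.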

Image Source: Krumm, J., Inference Attacks on Location Tracks, in Fifth International Conference on Pervasive Computing (Pervasive 2007). 2007: Toronto, Ontario, Canada. pp. 127‐143.


slide-17
SLIDE 17

Marc Langheinrich: Privacy in Pervasive Computing May 22, 2008 Tutorial at Pervasive 2008, Sydney, Australia 17

Location Mix Zones

Frequently Change Pseudonyms to Prevent Tracking
Change often trivial to detect

Idea: Designate “Mix Zones“ With No Tracking / LBS Active
Change pseudonyms only within mix zone

(Beresford and Stajano, 2003) offer probabilistic model for unlinkability in mix zones

Alastair Beresford and Frank Stajano, Cambridge Univ.
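The mix‐zone idea can be sketched as a toy client that reports nothing inside a zone and leaves it under a fresh pseudonym (hypothetical class and names; this is not Beresford and Stajano's actual protocol, and it omits their probabilistic unlinkability analysis):

```python
import secrets

class MixZoneClient:
    """Toy client: reports (pseudonym, cell) to location-based services,
    stays silent inside designated mix zones, and leaves each zone under
    a freshly minted pseudonym, so entry and exit tracks cannot be
    linked by name alone."""

    def __init__(self, mix_zones):
        self.mix_zones = set(mix_zones)   # cells where no LBS is active
        self.pseudonym = secrets.token_hex(4)

    def report(self, cell):
        if cell in self.mix_zones:
            self.pseudonym = secrets.token_hex(4)  # unobserved change
            return None                            # no traffic inside the zone
        return (self.pseudonym, cell)

client = MixZoneClient(mix_zones={"station-hall"})
before = client.report("street-a")
client.report("station-hall")  # silent; pseudonym rotates
after = client.report("street-b")
print(before[0] != after[0])   # True: the two tracks no longer share a name
```

An adversary can still link tracks by timing and geometry when only one user crosses the zone, which is exactly what the probabilistic model quantifies.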


2. RFID PRIVACY

SLIDE 18

RFID Privacy Concerns


Why RFID Privacy?

Embarrassment
Wig? Underwear? Medicine?

Criminal Acts
Theft, assault, murder, terror

Image: “RFID‐Man“ cartoon of a person whose tagged items (wig, underwear, passport, wallet, medicine) can be read out remotely. Ari Juels, RSA Laboratories. Original “RFID‐Man“ Artwork (c) 2006 A

SLIDE 19

Why RFID Privacy?

Embarrassment
Wig? Underwear? Medicine?

Criminal Acts

Theft, assault, murder, terror

Indirect Control

Subtle influence through consumer profiles

Direct Control

“Technology Paternalism“, government surveillance

Spiekermann, Pallas: Technology Paternalism – Wider Implications of Ubiquitous Computing. Poiesis and Praxis: International Journal of Technology Assessment and Ethics of Science. Springer, Jan 2006, pp.1–13


RFID Privacy Approaches

Tag Deactivation

Fry, cut, or silence (software)
Prevents further use

Tag Encryption (Lots!)

More expensive tags
Password management!

Readout Interference (“Blocker‐Tag“, “Guardian“)
Reliability? Feasibility? Legal?
Burdens user (conscious use, configuration)

Image: Kill‐Station, METRO Future Store

(Juels, 2006) Provides Overview of State‐Of‐The‐Art

See also (Langheinrich, 2008) or (Spiekermann, 2008)

Ari Juels, RSA Laboratories

SLIDE 20

Shamir Tags: “Keyless“ Encryption

Idea: Encrypted Tag Carries Its Own Key

No need to manage keys!

Prevent Skimming: Key Readout Takes Long Time
Bitwise release, short range (e.g., one bit/sec)
Intermediate results meaningless, since encrypted

Prevent Tracking: Reply With Random Bits
Decryption requires all bits being read

Allow Known Tags to be Directly Identified

Allows owner to use tags without apparent restrictions
Initial bit‐release enough for instant identification from known set

Source: Langheinrich, Marti: Practical Minimalist Cryptography for RFID Privacy. IEEE Systems Journal, Vol. 1, No. 2, 2007

Remo Marti, Ergon Informatik

3. SMART ENVIRONMENTS

SLIDE 21

Smart Environments

Privacy Middleware

Machine‐readable privacy policies control data collection, processing, access

Personal device (e.g., mobile phone) to monitor and configure environment

Optional: Built‐in data obfuscation

Image: Aware Home, Georgia Tech
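The policy‐matching step can be sketched as follows, loosely inspired by P3P‐style declarations (the field names are invented for illustration and are not the actual PawS or P3P vocabulary):

```python
# Toy sketch: an environment declares what it collects, why, and for
# how long; the user's personal device checks that declaration against
# stored preferences before allowing the interaction.

def acceptable(policy, prefs):
    """True iff the environment's declared policy fits the user's preferences."""
    return (policy["purpose"] in prefs["allowed_purposes"]
            and not (policy["collects"] & prefs["denied_data"])
            and policy["retention_days"] <= prefs["max_retention_days"])

climate_service = {
    "collects": {"temperature", "presence"},
    "purpose": "comfort",
    "retention_days": 1,
}
user_prefs = {
    "allowed_purposes": {"comfort", "safety"},
    "denied_data": {"audio", "video"},
    "max_retention_days": 7,
}
print(acceptable(climate_service, user_prefs))  # True: nothing sensitive, short retention
```

Real middleware like PawS additionally has to announce policies to passers‐by and enforce them on the collection side, not just check them on the device.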

Example Projects
PawS/P3P (Langheinrich, 2003)
Confab toolkit (Hong and Landay, 2004)

Jason Hong, CMU; James Landay, Univ. of Washington

Presence Technology

Providing Control and Awareness to Users
Who is seeing what information about me?

CSCW / Telecommuting
(Bellotti and Sellen, 1993) – EuroPARC’s RAVE media space
(Neustaedter, Greenberg, and Boyle, 2006) – Blurring?

Image Source: (Neustaedter, Greenberg, and Boyle, 2006)

Location Disclosure

(Hong and Landay, 2004) – Lemming: Location‐enhanced IM
(Consolvo et al., 2005) – Social relations and loc. disclosure


SLIDE 22

Related Issues

Privacy and Usability

CUPS group @ CMU (Lorrie F. Cranor, CMU)

Hippocratic Databases
Privacy‐compliant processing (Rakesh Agrawal, Microsoft Research)

Statistical Databases
Anonymization in databases (“k‐anonymity“) (Latanya Sweeney, CMU)

Economics of Privacy
When do people share data? (Alessandro Acquisti, CMU)

SUMMARY & OUTLOOK

SLIDE 23

Take Home Message

Privacy is Not Just Secrecy and Seclusion!

Privacy is a process, not a state
Solution requires good understanding of social, legal, and policy issues involved

Pervasive Computing Offers New Challenges
Invisible, comprehensive, sensor‐based, …

Ubicomp (Privacy) Challenges

User interface (notice, choice, consent)
Protocols (anonymity, security, access)
Social compatibility (privacy boundaries)


Some Techno Fallacies

The Objectivity Of Numbers
Data Means Knowledge
More Data Means More Knowledge
If It Is In The Computer, It Must Be Right
If You Have Nothing To Hide, There’s No Danger
Less Data Means More Privacy
Technology Is Neither Good Nor Bad. Nor Is It Neutral (Melvin C. Kranzberg, Georgia Tech, 1917‐1995)

Gary T. Marx, MIT

See, e.g., Gary Marx: Rocky Bottoms and Some Information Age Techno‐Fallacies. International Political Sociology, Vol. 1, No. 1. March 2007, pp. 83‐110.

SLIDE 24

Thank You For Your Attention

Understanding Privacy: Definitions

  • 1. Public policy
  • 2. Laws and regulations
  • 3. Interpersonal aspects

Technical Approaches: Challenges

  • 1. Location privacy
  • 2. RFID privacy
  • 3. Smart environments

General Reading

David Brin: The Transparent Society. Perseus Publishing, 1999

Simson Garfinkel: Database Nation – The Death of Privacy in the 21st Century. O’Reilly, 2001

Lawrence Lessig: Code and Other Laws of Cyberspace. Basic Books, 2006. http://codev2.cc/


SLIDE 25

Privacy Law

Rotenberg: The Privacy Law Sourcebook 2004. EPIC, 2004

Privacy & Human Rights 2006. EPIC

Solove, Rotenberg, Schwartz: Information Privacy Law. 2nd edition, Aspen, 2005

Privacy and Technology

Deborah Estrin (ed.): Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers. National Academies Press, 2001. http://www.nap.edu/openbook.php?isbn=0309075688

Waldo, Lin, Millett (eds.): Engaging Privacy and Information Technology in a Digital Age. National Academies Press, 2007.

Wright, Gutwirth, Friedewald, et al.: Safeguards in a World of Ambient Intelligence. Springer, 2008
