slide-1
SLIDE 1

PRIVACY IN PERVASIVE COMPUTING

Marc Langheinrich

University of Lugano (USI), Switzerland

slide-2
SLIDE 2

Approaches to Ubicomp Privacy

Disappearing Computer Troubadour Project (10/2002 ‐ 05/2003)

Promote Absence of Protection as User Empowerment

“It’s maybe about letting them find their own ways of cheating“

Make it Someone Else’s Problem

“For [my colleague] it is more appropriate to think about [security and privacy] issues. It’s not really the case in my case“

Insist that “Good Security“ will Fix It

“All you need is really good firewalls“

Conclude it is Incompatible with Ubiquitous Computing

“I think you can’t think of privacy... it’s impossible, because if I do it, I have troubles with finding [a] Ubicomp future“

Marc Langheinrich: The DC‐Privacy Troubadour – Assessing Privacy Implications of DC‐Projects. Designing for Privacy Workshop. DC Tales Conference, Santorini, Greece, June 2003.

4

slide-3
SLIDE 3

Today’s Menu

Understanding Privacy

Definitions

  • 1. Public policy
  • 2. Laws and regulations
  • 3. Interpersonal aspects

Technical Approaches

Challenges

  • 1. Location privacy
  • 2. RFID privacy
  • 3. Smart environments

5

slide-4
SLIDE 4

UNDERSTANDING PRIVACY

Privacy in Pervasive Computing

slide-5
SLIDE 5

What Is Privacy?

“The right to be let alone.“

Warren and Brandeis, 1890

(Harvard Law Review)

“Numerous mechanical devices threaten to make good the prediction that ’what is whispered in the closet shall be proclaimed from the housetops’“

Louis D. Brandeis, 1856 ‐ 1941

7

slide-6
SLIDE 6

Technological Revolution, 1888

8

slide-7
SLIDE 7

Information Privacy

“The desire of people to choose freely under what circumstances and to what extent they will expose themselves, their attitude and their behavior to others.“

Alan Westin, 1967

Privacy And Freedom, Atheneum

  • Dr. Alan F. Westin

9

slide-8
SLIDE 8
1. PRIVACY AS PUBLIC POLICY

Privacy in Pervasive Computing

slide-9
SLIDE 9

Why Privacy?

“A free and democratic society requires respect for the autonomy of individuals, and limits on the power of both state and private organizations to intrude on that autonomy… privacy is a key value which underpins human dignity and other key values such as freedom of association and freedom of speech…“

Preamble To Australian Privacy Charter, 1994

“All this secrecy is making life harder, more expensive, dangerous and less serendipitous“

Peter Cochrane, Former Head Of BT Research

“You have no privacy anyway, get over it“

Scott McNealy, CEO Sun Microsystems, 1999

11

slide-10
SLIDE 10

The NTHNTF‐Argument

“If you’ve got nothing to hide, you’ve got nothing to fear”

UK Gov’t Campaign Slogan for CCTV (1994)

Assumption

Privacy is (mostly) about hiding (evil/bad/unethical) secrets

Implications

Privacy protects wrongdoers (terrorists, child molesters, …)
No danger for law‐abiding citizens
Society overall better off without it!

12

slide-11
SLIDE 11

Informational Self‐Determination

“Informationelle Selbstbestimmung“

“If one cannot with sufficient surety be aware of the personal information about him that is known in certain part of his social environment, . . . can be seriously inhibited in his freedom of self‐determined planning and deciding. A society in which the individual citizen would not be able to find out who knows what when about them, would not be reconcilable with the right of self‐determination over personal data. Those who are unsure if differing attitudes and actions are ubiquitously noted and permanently stored, processed, or distributed, will try not to stand out with their behavior. . . . This would not only limit the chances for individual development, but also affect public welfare, since self‐determination is an essential requirement for a democratic society that is built on the participatory powers of its citizens.“

German Federal Constitutional Court (Census Decision ’83)

13

slide-12
SLIDE 12

Informational Self‐Determination

“Informationelle Selbstbestimmung“

“The problem is the possibility of technology taking on a life of its own, so that the actuality and inevitability of technology creates a dictatorship. Not a dictatorship of people over people with the help of technology, but a dictatorship of technology over people.“

Ernst Benda (1983) Federal Constitutional Court Chief Justice

Ernst Benda, *1925 Chief Justice 1971‐1983

15

slide-13
SLIDE 13

Issue: Profiles

Allow Inferences About You

May or may not be true (re. AOLStalker)!

May Categorize You

High spender, music aficionado, credit risk

May Offer Or Deny Services

Rebates, different prices, privileged access

“Social Sorting” (Lyon, 2003)

Opaque decisions “channel” life choices

Image Sources: http://www.jimmyjanesays.com/sketchblog/paperdollmask_large.jpg and http://www.queensjournal.ca/story/2008‐03‐14/supplement/keeping‐tabs‐personal‐data/

slide-14
SLIDE 14

Not Orwell, But Kafka!

17

slide-15
SLIDE 15
2. PRIVACY LAW PRIMER

Privacy in Pervasive Computing

slide-16
SLIDE 16

Privacy Law History

Justices Of The Peace Act (England, 1361)

Sentences for Eavesdropping and Peeping Toms

“The poorest man may in his cottage bid defiance to all the force of the crown. It may be frail; its roof may shake; … – but the king of England cannot enter; all his forces dare not cross the threshold of the ruined tenement“

William Pitt the Elder (1708‐1778)

First Modern Privacy Law in the German State of Hesse, 1970

19

slide-17
SLIDE 17

Fair Information Principles (FIP)

Drawn up by the OECD, 1980

“Organisation for Economic Co‐operation and Development”
Voluntary guidelines for member states
Goal: Ease transborder flow of goods (and information!)

Five Principles (simplified)
Core principles of modern privacy laws world‐wide

  • 1. Openness
  • 2. Data access and control
  • 3. Data security
  • 4. Collection Limitation
  • 5. Data subject’s consent

20

slide-18
SLIDE 18

Laws and Regulations

Privacy laws and regulations vary widely throughout the world

US has mostly sector‐specific laws, with relatively minimal protections

Self‐regulation favored over comprehensive privacy laws
Fear that regulation hinders e‐commerce

Europe has long favored strong privacy laws

Often single framework for both public & private sector
Privacy commissions in each country (some countries have national and state commissions)

21

slide-19
SLIDE 19

EU Privacy Law

EU Data Protection Directive 1995/46/EC

Sets a benchmark for national laws on processing personal information in electronic and manual files

Expands on OECD Fair Information Practices: no automated adverse decisions, minimality, retention, sensitive data, checks, …

Facilitates data flow between Member States and restricts export of personal data to “unsafe” non‐EU countries

“E‐Privacy“ Directive 2002/58/EC (“amends“ 95/46/EC)

Provisions for “public electronic communications services“

Data Retention Directive 2006/24/EC

Orders storage of “traffic data“ for law enforcement

22

slide-20
SLIDE 20

US‐EU: Safe Harbor

How to Make US a “Safe“ Country (in terms of the Directive)

US companies self‐certify adherence to requirements

  • Dept. of Commerce maintains list (1790 as of 04/09)

http://www.export.gov/safeharbor/

Signatories must provide

notice of data collected, purposes, and recipients
choice of opt‐out of 3rd‐party transfers, opt‐in for sensitive data
access rights to delete or edit inaccurate information
security for storage of collected data
enforcement mechanisms for individual complaints

Approved July 26, 2000 by EU (w/ right to renegotiate)

So far, not a single dispute!

23

slide-21
SLIDE 21

APEC Privacy Framework 2004

APEC – Asia‐Pacific Economic Cooperation

21 Member States, e.g., Japan, South Korea, PR China, Hong Kong, Philippines, Australia, New Zealand, Macau, U.S., Canada

APEC „agreements“ non‐binding, only public commitment

Defines Nine „APEC Privacy Principles“

Typically less strict than EU and even OECD principles, e.g., no purpose specification, no prior notice, use of “harm principle”

No details or checks on national implementation
No attempt at compliance with EU Data Protection Directive 95/46/EC
No consideration of existing privacy laws in the region

24

See also: (Kennedy et al., 2009), (Greenleaf, 2009)

slide-22
SLIDE 22
3. INTERPERSONAL PRIVACY

Privacy in Pervasive Computing

slide-23
SLIDE 23

Privacy Invasions

When Do We Feel that Our Privacy Has Been Violated?

Perceived privacy violations due to crossing of “privacy borders”

Privacy Boundaries

  • 1. Natural
  • 2. Social
  • 3. Spatial / temporal
  • 4. Transitory

Gary T. Marx MIT

27

slide-24
SLIDE 24

Privacy Borders (Marx)

Natural

Physical limitations (doors, sealed letters)

Social

Group confidentiality (doctors, colleagues)

Spatial / Temporal

Family vs. work, adolescence vs. midlife

Transitory

Fleeting moments, unreflected utterances

28

slide-25
SLIDE 25

Privacy Regulation Theory

Privacy as Accessibility Optimization: Inputs and Outputs

Spectrum: “Openness” / “Closedness”
Contrasts with privacy as withdrawal (“to be let alone”)
Privacy not monotonic: “More” is not always “better”

Dynamic Boundary Negotiation Process

Neither static nor rule‐based
Requires fine‐grained coordination of action & disclosure

Focus on public spaces, mediated by the spatial environment

Irwin Altman University of Utah

29

slide-26
SLIDE 26

Managing Privacy Boundaries

Use Altman’s Theory for Networked Environments

Very different from real‐world public spaces!

Disclosure Boundary: Private and Public

We sometimes use publicity to limit accessibility

Identity Boundary: Self and Other

Acting according to status, group, affiliation
Disclosure according to recipient’s identity & role
Disclosure as means to differentiate or associate

Temporality Boundary: Past, Present, and Future

Effects of temporal sequence of disclosures

Leysia Palen

Univ. of Colorado

Paul Dourish UC Irvine

30

L. Palen, P. Dourish: “Unpacking ‘privacy’ for a networked world.” Proceedings of CHI 2003, pp. 129‐136.
slide-27
SLIDE 27

Today’s Menu

Understanding Privacy

Definitions

  • 1. Public policy
  • 2. Laws and regulations
  • 3. Interpersonal aspects

Technical Approaches

Challenges

  • 1. Location privacy
  • 2. RFID privacy
  • 3. Smart environments

31

slide-28
SLIDE 28

TECHNICAL APPROACHES

Privacy in Pervasive Computing

slide-29
SLIDE 29

Ubicomp Privacy Implications

Data Collection (“more transactions“)

Scale (everywhere, anytime)
Manner (inconspicuous, invisible)
Motivation (context!)

Data Types (“not without computers“)

Observational instead of factual data

Data Access (“more easily accessible“)

“The Internet of Things“

34

slide-30
SLIDE 30

FIP Challenges in Ubicomp

How to inform subjects about data collections?

Unintrusive but noticeable

How to provide access to stored data?

Who has it? How much of this is “my data“?

How to ensure confidentiality and authenticity?

Without alienating user!

How to minimize data collection?

What part of the “context“ is relevant?

How to obtain consent from data subjects?

Missing UIs? Do people understand implications?

35

slide-31
SLIDE 31

Border Crossings in Ubicomp

Smart appliances (natural borders)

“Spy“ on you in your own home

Family intercom (social borders)

Grandma knows when you’re home

Consumer profiles (temporal borders)

Span time & space

“Memory amplifier“ (transitory borders)

Records careless utterances

36

slide-32
SLIDE 32
1. LOCATION PRIVACY

Privacy in Pervasive Computing

slide-33
SLIDE 33

Location Privacy

“… the ability to prevent other parties from learning one’s current or past location.” (Beresford and Stajano, 2003)

Why Share Your Location?

By‐product of positioning technology (e.g., cell towers)
Required to use service (recommendations, toll roads, ...)
Let others (friends, family) know where I am

Why NOT to Share Your Location?

Location profiles reveal/imply activities, interests, identity

Useful Definition?! Think Altman!

38

slide-34
SLIDE 34

Location Privacy Technology

Many Proposals

Laws/regulations and audits (enterprise privacy)
Anonymization (“k‐anonymity”; see the sketch below)
Obfuscation
Rule‐based access control
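
To make the “k‐anonymity” idea above concrete, here is a minimal, hypothetical spatial‐cloaking sketch (function name and parameters are my own illustration, not taken from any of the cited systems): the reported area is enlarged until at least k users fall inside it, so the requester cannot be singled out within the reported region.

```python
# Minimal k-anonymous spatial cloaking sketch (illustrative only):
# grow a square box around the requester until it covers at least k users,
# then report the box instead of the exact position.

def cloak(requester, others, k, step=0.001):
    """Return a (min_lat, min_lon, max_lat, max_lon) box covering >= k users."""
    if len(others) + 1 < k:
        raise ValueError("not enough users to reach k-anonymity")
    lat, lon = requester
    half = step
    while True:
        box = (lat - half, lon - half, lat + half, lon + half)
        inside = [p for p in [requester] + others
                  if box[0] <= p[0] <= box[2] and box[1] <= p[1] <= box[3]]
        if len(inside) >= k:
            return box
        half += step  # not enough users yet: enlarge the box

# Example: the service only ever receives a region shared by >= 5 users.
others = [(47.377, 8.541), (47.378, 8.543), (47.379, 8.540), (47.375, 8.545)]
print(cloak((47.3769, 8.5417), others, k=5))
```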

Privacy Model?

Assumption: Less location disclosure means more privacy

(Krumm, 2008) Provides Overview of State‐of‐the‐Art

John Krumm

Microsoft Research

39

slide-35
SLIDE 35

Location Obfuscation

Adding noise, perturbation, or dummy traffic to location data

Protects against attackers, but degrades service use
(Krumm, 2007) showed that LOTS of obfuscation is needed
Typically combined with rules to selectively adjust accuracy
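
A minimal sketch of the noise idea (illustrative only, not Krumm’s evaluation code): each reported position is perturbed with Gaussian noise, and the standard deviation trades tracking resistance against service quality.

```python
import random

def obfuscate(lat, lon, sigma_deg=0.01):
    """Report a position perturbed by Gaussian noise with std. dev. sigma_deg
    (in degrees); 0.01 deg of latitude is roughly 1 km of added uncertainty."""
    return lat + random.gauss(0, sigma_deg), lon + random.gauss(0, sigma_deg)

# Example: the location-based service only ever sees the noisy coordinates.
print(obfuscate(47.3769, 8.5417, sigma_deg=0.02))
```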

Image Source: Krumm, J., Inference Attacks on Location Tracks, in Fifth International Conference on Pervasive Computing (Pervasive 2007). Toronto, Ontario, Canada, 2007, pp. 127‐143.

40

slide-36
SLIDE 36

Location Mix Zones

Frequently Change Pseudonyms to Prevent Tracking

Change often trivial to detect

Idea: Designate “Mix Zones” With No Tracking / LBS Active

Change pseudonyms only within the mix zone

(Beresford and Stajano, 2003) offer a probabilistic model for unlinkability in mix zones
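
A minimal sketch of this behavior under assumed coordinates and a hypothetical data model (not the authors’ implementation): the device stays silent inside a designated mix zone and leaves it under a fresh pseudonym, so the entering and exiting tracks cannot be linked by identifier alone.

```python
import secrets

# Hypothetical mix-zone corners: ((min_lat, min_lon), (max_lat, max_lon)).
MIX_ZONES = [((47.376, 8.540), (47.378, 8.543))]

def in_mix_zone(pos):
    return any(lo[0] <= pos[0] <= hi[0] and lo[1] <= pos[1] <= hi[1]
               for lo, hi in MIX_ZONES)

class Device:
    def __init__(self):
        self.pseudonym = secrets.token_hex(4)
        self.inside = False

    def report(self, pos):
        """Return (pseudonym, position) for the LBS, or None while mixing."""
        if in_mix_zone(pos):
            self.inside = True
            return None                        # stay silent inside the zone
        if self.inside:                        # leaving the zone: new pseudonym
            self.pseudonym = secrets.token_hex(4)
            self.inside = False
        return (self.pseudonym, pos)

# Crossing the zone unlinks the "before" and "after" pseudonyms.
d = Device()
for p in [(47.375, 8.539), (47.377, 8.541), (47.379, 8.544)]:
    print(d.report(p))
```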

Alastair Beresford Cambridge Univ. Frank Stajano Cambridge Univ.

41

slide-37
SLIDE 37
2. RFID PRIVACY

Privacy in Pervasive Computing

slide-38
SLIDE 38

RFID Privacy Concerns

43

slide-39
SLIDE 39

Why RFID Privacy?

Embarrassment

Wig? Underwear? Medicine?

Criminal Acts

Theft, assault, murder, terror

Tagged items readable on a person (from the “RFID‐Man” figure):

Wig: Model #2342, Material: Polyester
Wallet: Cash 370 Euro, Student ID #2845/ETH
Tiger Tanga: Maker Aldi (Suisse), last washed 5 days ago
Viagra: Maker Pfizer, size Maxi (60 pills)
Passport: Name John Doe, Nationality USA, Visa for Israel

Original “RFID‐Man“ Artwork (c) 2006 Ari Juels, RSA Laboratories

44

slide-40
SLIDE 40

Why RFID Privacy?

Embarrassment

Wig? Underwear? Medicine?

Criminal Acts

Theft, assault, murder, terror

Indirect Control

Subtle influence through consumer profiles

Direct Control

“Technology Paternalism“, government surveillance

Spiekermann, Pallas: Technology Paternalism – Wider Implications of Ubiquitous Computing. Poiesis and Praxis: International Journal of Technology Assessment and Ethics of Science. Springer, Jan 2006, pp.1–13

45

slide-41
SLIDE 41

RFID Privacy Approaches

Tag Deactivation

Fry, cut, or silence (software)
Prevents further use

Tag Encryption (Lots!)

More expensive tags
Password management!

Readout Interference (“Blocker‐Tag“, “Guardian“)

Reliability? Feasibility? Legal?
Burdens user (conscious use, configuration)

(Juels, 2006) Provides Overview of State‐Of‐The‐Art

See also (Langheinrich, 2008) or (Spiekermann, 2008)

Ari Juels

RSA Laboratories

Kill‐Station

METRO Future Store

46

slide-42
SLIDE 42

Shamir Tags: “Keyless“ Encryption

Idea: Encrypted Tag Carries Its Own Key

No need to manage keys!

Prevent Skimming: Key Readout Takes Long Time

Bitwise release, short range (e.g., one bit/sec)
Intermediate results meaningless, since encrypted

Prevent Tracking: Reply With Random Bits

Decryption requires all bits to be read

Allow Known Tags to be Directly Identified

Allows owner to use tags without apparent restrictions
Initial bit‐release enough for instant identification from a known set
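
The construction builds on Shamir secret sharing. Below is a minimal sketch of that underlying step with illustrative parameters (field size, share count, and names are assumptions, not the paper’s exact encoding): the ID is split into shares over a prime field, and only a complete readout reconstructs it; the slow bitwise release of the concatenated shares is what makes casual skimming impractical.

```python
import secrets

P = 2**127 - 1   # Mersenne prime; a field comfortably larger than a 96-bit ID

def make_shares(secret, t, n):
    """Split `secret` into n points of a random degree-(t-1) polynomial over GF(P)."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

epc = secrets.randbits(96)              # stand-in for a 96-bit EPC code
shares = make_shares(epc, t=3, n=3)     # all shares are stored on the tag itself
assert reconstruct(shares) == epc       # only a complete (slow) readout yields the ID
```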

Source: Langheinrich, Marti: Practical Minimalist Cryptography for RFID Privacy. IEEE Systems Journal, Vol. 1, No. 2, 2007

Remo Marti

Ergon Informatik

(This Speaker)

47

slide-43
SLIDE 43

[Figure: Shamir tag construction and bitwise disclosure. A secret s (the 96‐bit EPC code) is split into Shamir shares h_i of 106 bits each (10‐bit x‐value, 96‐bit y‐value); concatenated, they form a 318‐bit Shamir tag. Bits are disclosed one at a time: a 16‐bit initial reply already allows instant identification of known items, while unknown tags are only identified eventually, once enough bits have been released.]

48

slide-44
SLIDE 44
3. SMART ENVIRONMENTS

Privacy in Pervasive Computing

slide-45
SLIDE 45

Smart Environments

Privacy Middleware

Machine‐readable privacy policies control data collection, processing, and access

Personal device (e.g., mobile phone) to monitor and configure the environment

Optional: Built‐in data obfuscation

Example Projects

PawS/P3P (Langheinrich, 2003)
Confab toolkit (Hong and Landay, 2004)
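
As an illustration of the policy‐matching idea behind such middleware (a hypothetical, simplified schema, not the PawS or P3P wire format): the environment announces a machine‐readable policy, and the user’s device consents only if the declared purpose, retention, and recipients fall within the user’s preferences.

```python
# Hypothetical, simplified policy check in the spirit of PawS/P3P (illustrative).
service_policy = {
    "data": "location",
    "purposes": {"navigation"},
    "retention_days": 1,
    "recipients": {"service"},
}

user_preferences = {
    "location": {"purposes": {"navigation", "research"},
                 "max_retention_days": 7,
                 "recipients": {"service"}},
}

def consents(policy, prefs):
    """Return True if the announced policy fits the user's stated preferences."""
    p = prefs.get(policy["data"])
    return (p is not None
            and policy["purposes"] <= p["purposes"]
            and policy["retention_days"] <= p["max_retention_days"]
            and policy["recipients"] <= p["recipients"])

print(consents(service_policy, user_preferences))   # True: collection may proceed
```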

James Landay

Univ. of Washington

Jason Hong

CMU

Aware Home

Georgia Tech 50

slide-46
SLIDE 46

Presence Technology

Providing Control and Awareness to Users

Who is seeing what information about me?

CSCW / Telecommuting

(Bellotti and Sellen, 1993) – EuroPARC’s RAVE media space
(Neustaedter, Greenberg, and Boyle, 2006) – Blurring?

Location Disclosure

(Hong and Landay, 2004) – Lemming: Location‐enhanced IM
(Consolvo et al., 2005) – Social relations and loc. disclosure

Image Source: (Neustaedter, Greenberg, and Boyle, 2006) 51

slide-47
SLIDE 47

Related Issues

Privacy and Usability

CUPS group @ CMU

Hippocratic Databases

Privacy‐compliant processing

Statistical Databases

Anonymization in databases

(“k‐anonymity“)

Economics of Privacy

When do people share data?

Rakesh Agrawal

Microsoft Research

Lorrie F. Cranor

CMU

Latanya Sweeney

CMU

Alessandro Acquisti

CMU 52

slide-48
SLIDE 48

SUMMARY & OUTLOOK

Privacy in Pervasive Computing

slide-49
SLIDE 49

Take Home Message

Privacy is Not Just Secrecy and Seclusion!

Privacy is a process, not a state
Solutions require a good understanding of the social, legal, and policy issues involved

Pervasive Computing Offers New Challenges

Invisible, comprehensive, sensor‐based, …

Ubicomp (Privacy) Challenges

User interface (notice, choice, consent)
Protocols (anonymity, security, access)
Social compatibility (privacy boundaries)

54

slide-50
SLIDE 50

Some Techno Fallacies

The Objectivity Of Numbers
Data Means Knowledge
More Data Means More Knowledge
If It Is In The Computer, It Must Be Right
If You Have Nothing To Hide, There’s No Danger
Less Data Means More Privacy
Technology Is Neither Good Nor Bad. Nor Is It Neutral

Melvin C. Kranzberg

See, e.g., Gary Marx: Rocky Bottoms and Some Information Age Techno‐Fallacies. Intl. Political Sociology, Vol. 1, No. 1. March 2007, pp. 83‐110.

Irwin Altman, University of Utah
Melvin C. Kranzberg, Georgia Tech (1917‐1995)
Gary T. Marx, MIT

55

slide-51
SLIDE 51

Thank You For Your Attention

Understanding Privacy

Definitions

  • 1. Public policy
  • 2. Laws and regulations
  • 3. Interpersonal aspects

Technical Approaches

Challenges

  • 1. Location privacy
  • 2. RFID privacy
  • 3. Smart environments

56

slide-52
SLIDE 52

General Reading

David Brin: The Transparent Society. Perseus Publishing, 1999

Simson Garfinkel: Database Nation – The Death of Privacy in the 21st Century. O’Reilly, 2001

Lawrence Lessig: Code and Other Laws of Cyberspace. Basic Books, 2006. http://codev2.cc/

57

slide-53
SLIDE 53

Privacy Law

Rotenberg: The Privacy Law Sourcebook 2004. EPIC, 2004

Privacy & Human Rights 2006. EPIC

Solove, Schwartz: Information Privacy Law. 3rd edition, Aspen, 2009

58

slide-54
SLIDE 54

Privacy and Technology

Deborah Estrin (ed.): Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers. National Academies Press, 2001. http://www.nap.edu/openbook.php?isbn=0309075688

Waldo, Lin, Millett (eds.): Engaging Privacy and Information Technology in a Digital Age. National Academies Press, 2007.

Wright, Gutwirth, Friedewald, et al.: Safeguards in a World of Ambient Intelligence. Springer, 2008

59

slide-55
SLIDE 55