SLIDE 1

LOCATION PRIVACY

Marc Langheinrich

University of Lugano (USI), Switzerland

SLIDE 2

Zurich (2.5h) Milano (1h) Genoa (2.5h)

SLIDE 3

Securing a Mobile Phone

SLIDE 4

Securing a Mobile Phone

SLIDE 5

Securing a Mobile Phone

SLIDE 6

Securing a Mobile Phone

SLIDE 7

Can We Have it Both Ways?

  • Safe
  • Secure
  • Privacy-friendly
  • Usable
  • Useful
  • Used
SLIDE 8

WHAT IS PRIVACY?

SLIDE 9

Facets of Privacy

SLIDE 10

Hard To Define

“Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.”

Robert C. Post, Three Concepts of Privacy, 89 Georgetown Law Journal 2087 (2001).

Original Slide from Lorrie Cranor: “8-533 / 8-733 / 19-608 / 95-818: Privacy Policy, Law, and Technology”, Fall 2008, CMU

  • Prof. Robert C. Post

Yale Law School

SLIDE 11

Hard To Analyze

“... The notion of privacy is fraught with multiple meanings, interpretations, and value judgments. … Nearly every thread of analysis leads to other questions and issues that also cry out for additional analysis—one might even regard the subject as fractal, where each level of analysis requires another equally complex level of analysis to explore the issues that the previous level raises.”

James Waldo et al., Engaging Privacy and Information Technology in a Digital Age. National Academies Press, 2008

SLIDE 12

A Privacy Definition

  • “The right to be let alone.”

– Warren and Brandeis, 1890 (Harvard Law Review)

  • “Numerous mechanical devices threaten to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the housetops’”

Image source: http://historyofprivacy.net/RPIntro3-2009.htm

SLIDE 13

George Eastman 1854-1932

Technological Revolution, 1888

Image Source: Wikipedia; Encyclopedia Britannica (Student Edition)

SLIDE 14

TomTom iPhone

The Location Revolution, 2009

Rakon GPS, Infineon XPOSYS GPS, Trackstick 2, Hitachi Clarion, Nokia N97

SLIDE 15

A Privacy Definition

  • “The right to be let alone.”

– Warren and Brandeis, 1890 (Harvard Law Review)

  • “Numerous mechanical devices threaten to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the housetops’”

Image source: http://historyofprivacy.net/RPIntro3-2009.htm

SLIDE 16

SOLITUDE

Facets of Privacy

SLIDE 17

Information Privacy

  • “The desire of people to choose freely under what circumstances and to what extent they will expose themselves, their attitude and their behavior to others.”

– Alan Westin, 1967 Privacy And Freedom, Atheneum

  • Dr. Alan F. Westin
SLIDE 18

CONTROL

Facets of Privacy

SLIDE 19

Privacy Regulation Theory

  • Privacy as Accessibility Optimization: Inputs and Outputs

– Not monotonic: “More” is not always “better”
– Spectrum: “Openness” / “Closedness”
– Privacy levels: isolation > desired > crowding

  • Dynamic Boundary Negotiation Process

– Neither static nor rule-based
– Privacy as a social interaction process
– Cultural, territorial, verbal mechanisms

Irwin Altman University of Utah

See, e.g., L. Palen, P. Dourish: “Unpacking "privacy" for a networked world.” Proceedings of CHI 2003. pp.129-136.

SLIDE 20

INTIMACY

Facets of Privacy

SLIDE 21

Privacy – More Than Secrecy!

Privacy

Secrecy Solitude Control Intimacy Dignity Freedom Anonymity Safety

SLIDE 22

WHY LOCATION PRIVACY?

SLIDE 23

“Location” Privacy?

What's so special about “location” that it is worth inventing a special category for it?

SLIDE 24

Location Privacy

  • “… the ability to prevent other parties from learning one's current or past location.”

(Beresford and Stajano, 2003)

  • “It's not about where you are... It's where you have been!”

– Gary Gale, Head of UK Engineering for Yahoo! Geo Technologies

Useful Definition?! Think Altman!

Alastair Beresford, Cambridge Univ.; Frank Stajano, Cambridge Univ.; Gary Gale, Yahoo! UK

SLIDE 25

Motivating Disclosure

  • Why Share Your Location?

– By-product of positioning technology (e.g., cell towers, WiFi, ...)
– Required to use a service (local recommendations, automated payment for toll roads, ...)
– Social benefits (let friends and family know where I am, finding new friends, ...)

  • Why NOT to Share Your Location?

– Location profiles reveal/imply activities, interests, identity

SLIDE 26

Location Implications

  • Places I Go

– Where I Live / Work
– Who I Am (Name)
– Hobbies / Interests / Memberships

  • People I Meet

– My Social Network

  • Profiling, e.g.,

– ZIP-Code: implies income, ethnicity, family size

SLIDE 27

Implications: Profiles

  • Allow Inferences About You

– May or may not be true!

  • May Categorize You

– High spender, music aficionado, credit risk

  • May Offer Or Deny Services

– Rebates, different prices, privileged access

  • “Social Sorting” (Lyon, 2003)

– Opaque decisions “channel” life choices

Image Sources: http://www.jimmyjanesays.com/sketchblog/paperdollmask_large.jpg http://www.queensjournal.ca/story/2008-03-14/supplement/keeping-tabs-personal-data/

SLIDE 28

www.nytimes.com/1992/09/12/technology/orwellian-dream-come-true-a-badge-that-pinpoints-you.html

SLIDE 29

Proxy-Based Location Privacy

Active Badge System (1992)

Mike Spreitzer and Marvin Theimer. Providing location information in a ubiquitous computing environment. In Proc. of the 14th ACM Symp. on Operating Systems Principles (SOSP ’93), pp. 270–283. ACM Press, 1993.

[Diagram: Bob's User Agent sends Location Updates to the Location Query Service; a query “Where's Bob?” through the Query Interface sees only the pseudonym 7829, which only Bob's agent can resolve back to “Bob”.]
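The proxy architecture above can be sketched in a few lines (class and method names are made up for illustration): each user runs a personal agent that reports locations to the central service under a random pseudonym, so the service can answer queries without ever learning real names.

```python
import secrets

class LocationService:
    def __init__(self):
        self._locations = {}               # pseudonym -> last reported location

    def update(self, pseudonym, location):
        self._locations[pseudonym] = location

    def query(self, pseudonym):
        return self._locations.get(pseudonym)

class UserAgent:
    def __init__(self, name, service):
        self.name = name                   # known only to the agent
        self.service = service
        self.pseudonym = secrets.token_hex(4)

    def report(self, location):
        self.service.update(self.pseudonym, location)

    def locate(self):
        # Only the agent holds the name <-> pseudonym mapping.
        return self.service.query(self.pseudonym)

service = LocationService()
bob = UserAgent("Bob", service)
bob.report("Room 2.17")
print(bob.locate())                        # Room 2.17
print("Bob" in service._locations)         # False: the service never sees names
```

Disclosure decisions thus stay with the user-controlled agent, not the infrastructure, which is exactly the trust split Spreitzer and Theimer argue for.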

SLIDE 30

Location Triangle

Who Where When

SLIDE 31

What To Protect Against

  • Protect against unwanted/accidental disclosure (friend finder services / Latitude)

– Immediate disclosure vs. later “lookups”

  • Protect against monitoring (nosy employer)

– Monitoring breaks, work efficiency

  • Protect against commercial profiling

– Exerting subtle influence over decisions

  • Against law enforcement

– If you've got nothing to hide, you've got nothing to fear?

SLIDE 32

Do People Care?

Danezis, G., Lewis, S., Anderson, R.: How Much Is Location Privacy Worth? Fourth Workshop on the Economics of Information Security, Harvard University, 2005.

SLIDE 33

End-User Attitudes Towards LBS

  • Clear value proposition
  • Simple and appropriate control and feedback

  • Plausible deniability
  • Limited retention of data
  • Decentralized control
  • Special exceptions for emergencies

Jason Hong: An Architecture for Privacy-Sensitive Ubiquitous Computing. PhD Thesis, Univ. of California, Berkeley, 2005. Available at www.cs.cmu.edu/~jasonh/publications/jihdiss.pdf

Jason Hong

CMU

SLIDE 34

LOCATION PRIVACY TECHNOLOGY

SLIDE 35

Location Privacy Technology

  • Transparency Tools

– Privacy Policies
– Rule-based access control

  • Opacity Tools

– Anonymization (“k-anonymity”)
– Obfuscation

SLIDE 36

TRANSPARENCY TOOLS

SLIDE 37

GEOPRIV

  • “A suite of protocols that allow applications to represent and transmit geographic and civic location information about resources and entities, and to allow users to express policies on how these representations are exposed and used”

  • GEOPRIV Model

– Defines how services should use location
– Includes privacy controls (Rule Holder)
– Location is published to a Location Server
– Location is used by a Location Recipient

  • Defines XML Formats

– Location Objects (GML)
– Preference Rules

http://tools.ietf.org/wg/geopriv/
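A drastically simplified sketch of these roles (plain Python stand-ins, not GEOPRIV's actual XML formats or protocol API): the Location Server discloses a Target's location only as permitted by the rules the Rule Maker has placed with the Rule Holder.

```python
# Rules and the coarsening step are illustrative assumptions, not GEOPRIV syntax.
def make_rule(recipient, max_precision):
    return {"recipient": recipient, "max_precision": max_precision}

def coarsen(lat, lon, digits):
    # Reduce coordinate precision before disclosure.
    return (round(lat, digits), round(lon, digits))

class LocationServer:
    def __init__(self):
        self.location = None
        self.rules = []                    # the Rule Holder's policy

    def publish(self, lat, lon):           # from the Location Generator
        self.location = (lat, lon)

    def request(self, recipient):          # from a Location Recipient
        for rule in self.rules:
            if rule["recipient"] == recipient:
                return coarsen(*self.location, rule["max_precision"])
        return None                        # no matching rule -> no disclosure

server = LocationServer()
server.rules.append(make_rule("restaurant-finder", 2))   # ~1 km precision
server.publish(-43.5723, 153.21760)
print(server.request("restaurant-finder"))   # (-43.57, 153.22)
print(server.request("advertiser"))          # None
```

The key property this models: disclosure policy is evaluated at the server, per recipient, before any location leaves it.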

SLIDE 38

GEOPRIV Model

Dawson, Martin; James Winterbottom, Martin Thomson (2006-11-13). IP Location. McGraw-Hill. ISBN 0-07-226377-6.

[Diagram: a Device (the Target) feeds a Location Generator, which publishes to the Location Server; the Rule Maker supplies rules to the Rule Holder, which the server consults before disclosing to Location Recipients.]

SLIDE 39

GEOPRIV Example

[Restaurant Finder]

Dawson, Martin; James Winterbottom, Martin Thomson (2006-11-13). IP Location. McGraw-Hill. ISBN 0-07-226377-6.

[Diagram: Bob's GPS-enabled phone (Device / Location Generator) publishes “Bob is at 43.5723 S, 153.21760 E” to the Location Server, which applies the Rule Holder's rules before the Restaurant Finder (Location Recipient) sees the location.]

SLIDE 40

Location Privacy User Interfaces (UIs)

Lederer, Hong, Dey, Landay, Personal Privacy through Understanding and Action: Five Pitfalls for Designers. Personal and Ubiquitous Computing, Vol. 8, no. 6, Nov. 2004, pp. 440-454

SLIDE 41

Hong, J. I. and Landay, J. A. 2004. An architecture for privacy- sensitive ubiquitous computing. In Proc. 2nd intl Conf. on Mobile Systems, Applications, and Services (MobiSys '04). ACM, pp. 177-189

Example Confab/Lemming

  • Configuration during use
  • Built-in Plausible Deniability
SLIDE 42

Support for Continuous Services

SLIDE 43

Example UI: Google Latitude

  • Reciprocal sharing with individual contacts
  • Individual adjustments (hide, or only city level)
  • Global (temporal) adjustments

– Manual override
– Disable

SLIDE 44

OPACITY TOOLS

SLIDE 45

Location Anonymity

[Naïve Approach]

  • Use random IDs that change periodically

– Trivial to trace
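A toy illustration of why this is trivial to trace (IDs and coordinates are made up): when a user swaps pseudonyms mid-trace, the attacker simply links the new ID to whichever old ID reported the nearest position one time step earlier.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# pseudonym -> reported position; user "id-42" becomes "id-99" at t1
reports_t0 = {"id-42": (0.0, 0.0), "id-17": (5.0, 5.0)}
reports_t1 = {"id-99": (0.1, 0.0), "id-17": (5.1, 5.0)}

def link(new_id, old_reports, new_reports):
    # Nearest-neighbor linking: the old pseudonym closest to the new report.
    pos = new_reports[new_id]
    return min(old_reports, key=lambda old: dist(old_reports[old], pos))

print(link("id-99", reports_t0, reports_t1))   # id-42: the trace continues
```

Because people move continuously, consecutive reports are close together, and a periodic ID change alone breaks nothing.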

SLIDE 46

Plan B: Strong Pseudonyms

[Won't work either]

SLIDE 47

Why Pseudonyms Don't Work

  • Observation Identification (OI) Attack

– Correlate a single identifiable observation with a location pseudonym
– ATM use @ location -> Name for pseudonym

SLIDE 48

Observation Identification Attack

SLIDE 49

Observation Identification Attack

SLIDE 50

Observation Identification Attack

SLIDE 51

Why Pseudonyms Don't Work

  • Observation Identification (OI) Attack

– Correlate a single identifiable observation with a location pseudonym
– ATM use @ location -> Name for pseudonym

  • Restricted Space Identification (RSI) Attack

– Using a known mapping from place to name
– Home location -> Home address -> Name (Phonebook)

SLIDE 52

Pseudonymous User Trace

Img src: [Beresford, Stajano 2003]

SLIDE 53

Location Mix Zones

[Countering RSI Attacks]

  • Address Restricted Space Identification Attacks

– How to change pseudonyms?

  • Idea: Designate “Mix Zones” With No Tracking / LBS Active

– Change pseudonyms only within the mix zone (Beresford and Stajano, 2003)
– Offer a probabilistic model for unlinkability in mix zones

Alastair Beresford, Cambridge Univ.; Frank Stajano, Cambridge Univ.
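A back-of-the-envelope model of what a mix zone buys (this deliberately ignores the timing and movement correlations that Beresford and Stajano's probabilistic analysis accounts for): if k indistinguishable users swap pseudonyms inside the zone at the same time, an attacker guessing the full entry-to-exit matching succeeds with probability 1/k!.

```python
import math

def guess_probability(k):
    # k users in the zone -> k! possible entry-to-exit matchings,
    # all equally likely under the indistinguishability assumption.
    return 1 / math.factorial(k)

for k in (1, 2, 3, 5):
    print(k, guess_probability(k))
# k=1 gives probability 1.0: a mix zone with a single user provides no mixing.
```

This is why mix zones only help in busy areas; the anonymity they provide degrades sharply when few users are co-present.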

SLIDE 54

K-Anonymity

[Countering OI Attacks]

  • Concept from statistical DBs
  • Challenge: How do you publicly release a database without compromising individual privacy?
  • Problem: Anonymized data is still subject to “observation attacks” (i.e., linking), which allow identification

SLIDE 55

K-Anonymity – The Problem

Hospital Patient Data:

DOB      Sex     Zipcode  Disease
1/21/76  Female  53715    Heart Disease
4/13/86  Male    53715    Hepatitis
2/28/76  Female  53703    Bronchitis
1/21/76  Female  53703    Broken Arm
4/13/86  Male    53706    Swine Flu
2/28/76  Male    53706    Common Flu

Voter Registration Data:

Name     DOB      Sex     Zipcode
Alice    1/21/76  Female  53715
Bob      1/10/81  Male    55410
Charlie  10/1/44  Male    90210
Dave     2/21/84  Male    02174
Ellen    4/19/72  Female  02237

Samarati P and Sweeney L.: Protecting privacy when disclosing information: k-anonymity and its enforcement through generalization and suppression, Tech Report SRI-CSL-98-04, 1998

Pierangela Samarati, Univ. of Milan; Latanya Sweeney, CMU

The triple (DOB, gender, zip code) suffices to uniquely identify at least 87% of US citizens in publicly available databases (1990 U.S. Census summary data).

SLIDE 56

K-Anonymity

  • Approach: Generalize attributes so that each row in the table cannot be distinguished from at least k-1 other rows
  • Example for k=2: shorten ZIP code to its first 3 digits

Hospital Patient Data:

DOB      Sex     Zipcode  Disease
1/21/76  Female  537*     Heart Disease
4/13/86  Male    537*     Hepatitis
2/28/76  Female  537*     Bronchitis
1/21/76  Female  537*     Broken Arm
4/13/86  Male    537*     Swine Flu
2/28/76  Male    537*     Common Flu

Voter Registration Data:

Name     DOB      Sex     Zipcode
Alice    1/21/76  Female  53715
Bob      1/10/81  Male    55410
Charlie  10/1/44  Male    90210
Dave     2/21/84  Male    02174
Ellen    4/19/72  Female  02237
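The generalization step can be sketched in code. This is a toy version of the idea, not Samarati and Sweeney's algorithm: for brevity it treats only (DOB, ZIP) as the quasi-identifier, whereas the real method generalizes all quasi-identifying attributes, including sex and DOB itself.

```python
from collections import Counter

rows = [
    ("1/21/76", "Female", "53715", "Heart Disease"),
    ("4/13/86", "Male",   "53715", "Hepatitis"),
    ("2/28/76", "Female", "53703", "Bronchitis"),
    ("1/21/76", "Female", "53703", "Broken Arm"),
    ("4/13/86", "Male",   "53706", "Swine Flu"),
    ("2/28/76", "Male",   "53706", "Common Flu"),
]

def generalize(rows, k):
    # Shorten the ZIP code one digit at a time until every (DOB, ZIP) group
    # contains at least k rows.
    for keep in range(5, -1, -1):
        out = [(d, s, z[:keep] + "*" * (5 - keep), x) for d, s, z, x in rows]
        groups = Counter((d, z) for d, _, z, _ in out)
        if min(groups.values()) >= k:
            return out
    return out

for row in generalize(rows, 2):
    print(row)          # every ZIP is generalized to 537**
```

On this data the loop stops after truncating to three digits, matching the slide's k=2 example: no (DOB, ZIP) combination is unique anymore.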

SLIDE 57

Location K-Anonymity

  • The Anonymizer Service (AS) knows the location of all users
  • Subdivides the area until it contains fewer than k users

– Uses the previous quadrant as “cloaking region” in the LBS query

Marco Gruteser, Rutgers Univ.; Dirk Grunwald, Univ. of Colorado

[Diagram: users report locations to the Anonymizer Service (AS), which forwards cloaked queries to multiple LBS providers]

Gruteser, M. and Grunwald, D. Anonymous Usage of Location-Based Services Through Spatial and Temporal Cloaking. In Proc.of MobiSys 2003. ACM, pp 31-42
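The quadrant-subdivision idea can be sketched as follows (an illustrative simplification, not Gruteser and Grunwald's exact algorithm; names and coordinates are made up): recurse into the quadrant containing the querying user, and stop as soon as the next split would leave fewer than k users.

```python
def cloak(users, me, region, k):
    """Return a region containing `me` and at least k users."""
    x0, y0, x1, y1 = region
    mx, my = (x0 + x1) / 2, (y0 + y1) / 2
    # Pick the quadrant of `region` that contains `me`.
    qx0, qx1 = (x0, mx) if me[0] < mx else (mx, x1)
    qy0, qy1 = (y0, my) if me[1] < my else (my, y1)
    quad = (qx0, qy0, qx1, qy1)
    inside = [u for u in users if qx0 <= u[0] < qx1 and qy0 <= u[1] < qy1]
    if len(inside) < k:
        return region            # sub-quadrant too sparse: keep parent region
    return cloak(inside, me, quad, k)

users = [(1, 1), (2, 2), (3, 1), (9, 9)]
print(cloak(users, (1, 1), (0, 0, 16, 16), 3))   # (0, 0, 4, 4)
```

The LBS only ever sees the returned rectangle, never the exact position; for the lone user at (9, 9) the whole area would be returned, illustrating the QoS cost discussed on the next slide.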

SLIDE 58

Location K-Anonymity – Issues

  • Global or individual k?
  • Quality of Service (QoS) degradation?
  • Random cloaking regions allow inference of the true location if repeated queries occur
  • Postprocessing required on client (e.g., routing)
  • Does not hide true location of user!

– Only protects against observation identification attacks

SLIDE 59

Greatly Varying Obfuscation Areas

Example: k=100

[Figure: cloaking regions for k=100 in an industrial area on a weekend, on a promenade on a weekend, and on a weekend train]

SLIDE 60

Location Obfuscation

  • Adding noise, perturbation, dummy traffic to location data

– Protects against attackers, but degrades service use
– Krumm (2007) showed that LOTS of obfuscation is needed
– Typically combined with rules to selectively adjust accuracy

Image Source: Krumm, J., Inference Attacks on Location Tracks, in Fifth International Conference on Pervasive Computing (Pervasive 2007). Toronto, Ontario, Canada. pp. 127-143.
SLIDE 61

Track Obfuscation

  • Location tracks are more difficult to fake! Requires:

– Believable speeds (existing speed limits)
– Realistic start/end points, trip times (duration, days)
– Suboptimal routes (human driver vs. route planner)
– Expected GPS noise (higher in urban environments)

Krumm, J., Realistic Driving Tracks for Location Privacy. In 7th International Conference on Pervasive Computing (Pervasive 2009), Nara, Japan, Springer.
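The first of those constraints (believable speeds) can be checked mechanically. A sketch with a haversine distance and an assumed speed ceiling; Krumm's work fits many more properties (routes, trip times, GPS noise):

```python
import math

def haversine_m(p, q):
    # Great-circle distance in meters between two (lat, lon) points.
    (lat1, lon1), (lat2, lon2) = p, q
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def plausible(track, dt_s=10, vmax_ms=36):   # ~130 km/h ceiling (assumed)
    # A track sampled every dt_s seconds is believable only if no segment
    # implies a speed above vmax_ms.
    return all(haversine_m(a, b) / dt_s <= vmax_ms
               for a, b in zip(track, track[1:]))

ok = [(46.000, 8.950), (46.001, 8.950)]    # ~111 m in 10 s -> ~11 m/s
bad = [(46.000, 8.950), (46.100, 8.950)]   # ~11 km in 10 s -> impossible
print(plausible(ok), plausible(bad))       # True False
```

An attacker can run exactly this kind of check in reverse, which is why naive dummy tracks are easy to filter out.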

SLIDE 62

SUMMING UP

Img src: www.flickr.com/photos/nomeacuerdo/431060441/

SLIDE 63

Take Home Message

  • Privacy is Not Just Secrecy/Seclusion!

– Privacy is a process, not a state

  • Basic Challenges of Location Privacy Tech

– Disassociating “Who?”, “When?”, “Where?”
– Observation Identification Attack
– Restricted Space Identification Attack

  • Technical Approaches

– Transparency: Policy and User Interfaces
– Opacity: K-Anonymity and Obfuscation
– Usability! Economic Viability!

SLIDE 64

Further Issues

  • Legal Issues

– 9-1-1, GPS, Mobile Phone Tracking Ruling (US)
– Data Protection, E-Privacy, Retention (EU)

  • Location And Activity Data Mining

– citysense.com (MIT), cenceme.org (Dartmouth)
– FP7: GeoPKDD.eu, MODAP Coordinated Action

  • Location Sharing Practices (Ethnography)

– Reno (Iachello et al. '05), Whereabouts Clock (Sellen et al. '06), Connecto (Barkhuus et al. '08)

SLIDE 65

Techno Fallacies, Beware!

  • The Objectivity Of Numbers
  • Data Means Knowledge
  • More Data Means More Knowledge
  • If It Is In The Computer, It Must Be Right
  • If You Have Nothing To Hide, There's No Danger

  • Less Data Means More Privacy

Technology Is Neither Good Nor Bad. Nor Is It Neutral

Melvin C. Kranzberg

See, e.g., Gary Marx: Rocky Bottoms and Some Information Age Techno-Fallacies. Intl. Political Sociology, Vol. 1, No. 1. March 2007, pp. 83-110.

Irwin Altman, University of Utah; Melvin C. Kranzberg, Georgia Tech (1917-1995); Gary T. Marx, MIT

SLIDE 66

General Reading

  • David Brin: The Transparent Society. Perseus Publishing, 1999
  • Simson Garfinkel: Database Nation – The Death of Privacy in the 21st Century. O'Reilly, 2001

  • Lawrence Lessig: Code and Other Laws of Cyberspace. Basic Books, 2006. http://codev2.cc/

SLIDE 67

Privacy and Technology

  • Deborah Estrin (ed.): Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers. National Academies Press, 2001. http://www.nap.edu/openbook.php?isbn=0309075688
  • Waldo, Lin, Millett (eds.): Engaging Privacy and Information Technology in a Digital Age. National Academies Press, 2007.
  • Wright, Gutwirth, Friedewald, et al.: Safeguards in a World of Ambient Intelligence. Springer, 2008.
SLIDE 68

SHAMELESS PLUG

SLIDE 69

To Read

  • John Krumm (ed.): Ubiquitous Computing Fundamentals. Taylor & Francis, 2009

  • With Contributions From:

– Roy Want
– Jakob Bardram and Adrian Friday
– Marc Langheinrich
– A.J. Bernheim Brush
– Alex S. Taylor
– Aaron Quigley
– Alexander Varshavsky and Shwetak Patel
– Anind K. Dey
– John Krumm

SLIDE 70