SLIDE 1

Privacy Expectations and Preferences in an IoT World

Pardis Emami-Naeini, Sruti Bhagavatula, Martin Degeling, Hana Habib, Lujo Bauer, Lorrie Faith Cranor, Norman Sadeh

PERSONALIZED PRIVACY ASSISTANT PROJECT


SLIDE 3

Do you know:

§ What are they collecting?
§ Who are they sharing your data with?
§ For how long are they keeping your data?

SLIDE 4

Now imagine the future

§ What are they collecting?
§ Who are they sharing my data with?
§ For how long are they keeping my data?

SLIDE 5

Design questions

§ Informing people about data collections

  • What should we notify people about?

§ Giving some choices to control privacy

  • What factors make people comfortable?
  • What factors make people allow or deny a data collection?

SLIDE 6

Design questions

§ Making the system automated

  • How well can we predict people's preferences?

SLIDE 7

Vignette study

§ Captures a wide range of scenarios
§ "Stories about individuals, situations and structures which can make reference to important points in the study of perceptions, beliefs and attitudes" (Hughes 1998)

SLIDE 8

Scenarios varied by 8 factors

  • Type of data collected
  • Location of data collection
  • Device collecting data
  • Retention time
  • Purpose of data collection
  • Who benefits from data collection
  • Whether or not data is shared
  • Whether more info could be inferred
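
Slide 12 notes that crossing all factor levels yields 126,720 combinations, filtered down to 380 sensible scenarios. Below is a minimal sketch of that generate-and-filter step; the level lists and the sensibility rule are hypothetical placeholders, not the study's actual levels or filters.

```python
import itertools

# Hypothetical level lists for the 8 vignette factors
# (illustrative only; the study's actual levels differ).
factors = {
    "data_type": ["video", "biometrics", "presence"],
    "location": ["home", "work", "department store"],
    "device": ["camera", "iris scanner"],
    "retention": ["one week", "forever"],
    "purpose": ["safety", "identification", "unspecified"],
    "beneficiary": ["user", "other"],
    "shared": ["shared", "not shared"],
    "inference": ["inferred", "not inferred"],
}

def enumerate_scenarios(factors, is_sensible=lambda s: True):
    """Cross all factor levels, keeping only sensible combinations."""
    names = list(factors)
    for combo in itertools.product(*factors.values()):
        scenario = dict(zip(names, combo))
        if is_sensible(scenario):
            yield scenario

# Example filter rule: an iris scanner cannot record video.
def is_sensible(s):
    return not (s["device"] == "iris scanner" and s["data_type"] == "video")

all_combos = list(enumerate_scenarios(factors))
kept = list(enumerate_scenarios(factors, is_sensible))
print(len(all_combos), len(kept))  # → 864 720
```

With the study's real levels, the same cross-and-filter step would shrink 126,720 raw combinations to the 380 scenarios actually shown to participants.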

SLIDE 9

Example scenario

§ You are at [work]. This building has [cameras] that are recording [video of the entire building]. The video is [shared with law enforcement] to [improve public safety] and they [will not delete it].

SLIDE 10

Example scenario

§ You are at a [department store]. This store has an [iris scanner] that scans customers' irises automatically as they enter the store in order to [remotely identify returning customers]. Your iris scan will be kept for [one week].
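
The bracketed slots in these vignettes are where factor levels get substituted. A minimal sketch of that substitution, using a hypothetical template and slot names (not the study's actual wording):

```python
# Hypothetical vignette template; each {slot} corresponds to one
# of the study's factors. Wording here is illustrative only.
TEMPLATE = (
    "You are at a {location}. This place has {device} that is "
    "collecting {data_type} in order to {purpose}. "
    "The data will be kept for {retention}."
)

def render_vignette(**slots):
    """Fill each slot of the template with one sampled factor level."""
    return TEMPLATE.format(**slots)

vignette = render_vignette(
    location="department store",
    device="an iris scanner",
    data_type="iris scans of customers",
    purpose="remotely identify returning customers",
    retention="one week",
)
print(vignette)
```

Rendering every kept factor combination through such a template produces the full pool of scenario texts to sample from.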

SLIDE 11

Studied 380 IoT scenarios

§ 126,720 possible scenario combinations
§ Nonsensical combinations removed, leaving 380 scenarios

SLIDE 12

Our participants

§ 1007 Mechanical Turk participants
§ From the United States
§ Avg. age: 35.3
§ ~15 minutes to complete

SLIDE 13

Questions per scenario

§ I would want my mobile phone to notify me [every time / only the first time / every once in a while] this data collection occurs.

  • Five-point scale from “strongly agree” to “strongly disagree”

SLIDE 14

Questions per scenario

§ How would you feel about the data collection in the situation described above if you were given no additional information about the scenario?

  • Five-point scale from “very comfortable” to “very uncomfortable”

§ If you had the choice, would you allow or deny this data collection?

  • Choices: allow, deny

SLIDE 15

Model selection

§ GLMM (generalized linear mixed model) with a random intercept
§ Backward elimination
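
The slide names the modeling choices but not the elimination criterion. Below is a generic sketch of the backward-elimination loop with a pluggable score function; the toy score is a hypothetical stand-in for refitting the GLMM and comparing model quality (e.g. by AIC), which in practice needs a mixed-model library such as R's lme4.

```python
def backward_eliminate(factors, score, min_factors=1):
    """Greedy backward elimination: repeatedly drop the factor whose
    removal most improves the score (lower is better), stopping when
    no single removal helps."""
    current = list(factors)
    best = score(current)
    while len(current) > min_factors:
        trials = []
        for drop in current:
            subset = [f for f in current if f != drop]
            trials.append((score(subset), subset))
        new_best, new_subset = min(trials, key=lambda t: t[0])
        if new_best >= best:   # no removal improves the model: stop
            break
        best, current = new_best, new_subset
    return current, best

# Toy score: pretend only three factors carry signal; each kept
# noise factor adds a small complexity penalty (AIC stand-in).
SIGNAL = {"data_type", "purpose", "retention"}
def toy_score(subset):
    missing = len(SIGNAL - set(subset))   # lost fit
    extra = len(set(subset) - SIGNAL)     # complexity penalty
    return 10.0 * missing + 1.0 * extra

selected, score_val = backward_eliminate(
    ["data_type", "location", "device", "retention", "purpose", "shared"],
    toy_score,
)
print(sorted(selected), score_val)  # → ['data_type', 'purpose', 'retention'] 0.0
```

In the real analysis, `score` would refit the random-intercept GLMM on the reduced fixed-effect set at each step rather than use a toy penalty.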

SLIDE 16

Model: Every time notification

§ Most impactful explanatory factors:

  • Biometrics for an unspecified purpose (coef: 0.88, 61%)
  • Presence for a non-beneficial purpose (coef: -0.49, 27%)

§ Least impactful explanatory factor:

  • Data collected at a department store (coef: -0.69, 42%)

SLIDE 17

Model: Comfort level

§ Most impactful explanatory factors:

  • Video collection happening today (coef: 1.39, 69%)
  • Biometrics (coef: -1.45, 28%)

§ Least impactful explanatory factor:

  • Data being kept forever (coef: 0.10, 48%)

SLIDE 18

Model: Desire to Allow/Deny

§ Most impactful explanatory factors:

  • Video collected at department store (coef: -0.9, 66%)
  • Presence collected at work (coef: 2.11, 36%)

§ Least impactful explanatory factor:

  • Data being shared (coef: 0.52, 45%)

SLIDE 19

Prediction accuracy

§ Comfort level:

  • ~81%

§ Desire to allow or deny:

  • ~79%
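
Accuracy here is presumably the fraction of participant responses the model predicts correctly; a minimal sketch on hypothetical toy data:

```python
def accuracy(predicted, actual):
    """Fraction of responses predicted correctly."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Toy allow/deny predictions vs. observed responses (illustrative).
pred = ["allow", "deny", "deny", "allow", "deny"]
obs  = ["allow", "deny", "allow", "allow", "deny"]
print(accuracy(pred, obs))  # → 0.8
```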

SLIDE 20

Preferences in a nutshell

§ Anonymous data types

“I’d be fine with data that doesn’t identify me.”

§ Public vs. private

“[I would be] comfortable with public spaces, absolutely not comfortable in my home.”

SLIDE 21

Preferences in a nutshell

§ Ranked 1st = type of data + X

  • Notification: X = user-perceived benefit
  • Comfort: X = happening today
  • Allow/deny: X = location

SLIDE 22

Our results inform design

§ Design personalized privacy systems
§ In progress: experience sampling

More info: www.privacyassistant.org