Design for Security | Serena Chen | @Sereeena | O'Reilly Velocity 2018



SLIDE 1

Design for Security

Serena Chen | @Sereeena | O’Reilly Velocity 2018

SLIDE 2

SLIDE 3

Usability Security

SLIDE 4

Good user experience design and good security cannot exist without each other

SLIDE 5

Everyone deserves to be secure without being experts

SLIDE 6

We need to stop expecting people to become security experts

SLIDE 7

–Everyone not watching Mr Robot right now

“I don’t care about security.”

SLIDE 8

–McGraw, G., Felten, E., and MacMichael, R. Securing Java: Getting Down to Business with Mobile Code. Wiley Computer Pub., 1999

“Given a choice between dancing pigs and security, the user will pick dancing pigs every time.”

SLIDE 9

–Serena Chen, not allowed pets in her apartment

“Given a choice between dancing pigs and security, the user will pick dancing pigs every time.”

CATS CATS

SLIDES 10–15

SLIDE 16

Shaming people is lazy

SLIDE 17

Obligatory xkcd: https://xkcd.com/149/

SLIDE 18

–Everyone not watching Mr Robot right now

“I don’t care about security.”

SLIDE 19

–Serena Chen, lone nerd screaming into the void

“I care!!!”

SLIDES 20–23

SLIDE 24

Design thinking is another tool in the problem solving tool belt

SLIDE 25

For your consideration:

1. 2. 3. 4.

SLIDE 26

For your consideration:

1. Paths of Least Resistance

2. 3. 4.

SLIDE 27

Paths of Least Resistance

SLIDES 28–30

SLIDE 31

To stop internet, press firmly

SLIDE 32

SLIDE 33

Consider the "secure by default" principle
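One way to read "secure by default" in practice: make the safe behaviour the zero-configuration path, so opting out of security takes deliberate, visible effort. A minimal sketch of the idea; the function and its parameter names are hypothetical, not from the talk:

```python
# Hypothetical API sketch: the secure option is the default, and
# disabling it requires a loud, deliberate keyword argument.

def fetch(url: str, *, verify_tls: bool = True, timeout: float = 10.0) -> str:
    """Fetch a URL. TLS verification and a finite timeout are on by default."""
    if not verify_tls:
        # Opting out is possible, but it is the path of MOST resistance:
        # the caller has to name the risk they are accepting.
        print("WARNING: TLS verification disabled for", url)
    return f"fetched {url} (verify_tls={verify_tls})"

# The lazy call is also the safe call:
print(fetch("https://example.com"))
```

The point is that the easiest call site is also the secure one; the insecure path still exists, but it is never the accident.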

SLIDES 34–35

SLIDE 36

Normalise security

SLIDE 37

SLIDE 38

Group similar tasks

SLIDE 39

People aren't lazy, they're efficient

SLIDE 40

Align your goals with the end user’s goals

SLIDE 41

SLIDE 42

“I KNOW HOW TO INTERNET”

SLIDE 43

“I KNOW HOW TO INTERNET”

—Serena Chen, 
 a Real Human Adult™

SLIDE 44

SLIDE 45

Path of (Perceived) Least Resistance

SLIDE 46

–S. Breznitz, Cry Wolf: The Psychology of False Alarms. Lawrence Erlbaum Associates, NJ, 1984

"Each false alarm reduces the credibility of a warning system."

SLIDE 47

Anderson et al. How polymorphic warnings reduce habituation in the brain: Insights from an fMRI study. In Proceedings of CHI, 2015
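The Anderson et al. result suggests a concrete tactic: vary a warning's presentation so repeated exposure doesn't train users to click through on autopilot. A toy sketch of the idea; the variants here are invented for illustration, not taken from the paper:

```python
import itertools

# Sketch of a "polymorphic warning": the same message is rendered with a
# different look on each showing, to slow down habituation.
VARIANTS = [
    lambda msg: f"!! {msg.upper()} !!",
    lambda msg: f"*** {msg} ***",
    lambda msg: f">>> Warning: {msg} <<<",
]
_cycle = itertools.cycle(VARIANTS)

def render_warning(msg: str) -> str:
    """Render the same message with a different visual variant each call."""
    return next(_cycle)(msg)

print(render_warning("untrusted certificate"))
print(render_warning("untrusted certificate"))  # same message, different look
```

A real implementation would vary colour, layout, and interaction, not just text decoration, but the principle is the same: don't let the warning become wallpaper.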

SLIDE 48

Shadow IT is a massive vulnerability

SLIDES 49–51

SLIDE 52

Illustration by Megan Pendergrass

SLIDE 53

Fixing bad paths

  • Use security tools for security concerns, not management concerns
  • If you block enough non-threats, people will get really good at subverting your security

SLIDE 54

Building good paths

  • Don’t make me think!
  • Make the secure path the easiest path
  • e.g. BeyondCorp model at Google
SLIDE 55

"We designed our tools so that the user-facing components are clear and easy to use. […] For the vast majority of users, BeyondCorp is completely invisible."

–V. M. Escobedo, F. Zyzniewski, B. (A. E.) Beyer, M. Saltonstall, "BeyondCorp: The User Experience", ;login:, 2017

SLIDE 56

SLIDE 57

Align your goals with the end user’s goals

SLIDE 58

For your consideration:

1. Paths of Least Resistance

2. 3. 4.

SLIDE 59

For your consideration:

1. Paths of Least Resistance
2. Intent

3. 4.

SLIDE 60

Intent

SLIDE 61

Tension between usability and security happens when we cannot accurately determine intent.

SLIDE 62

“make it easy”
“lock it down”

SLIDE 63

It is not our job to make everything easy

SLIDE 64

It is not our job to make everything locked down

SLIDE 65

Our job is to make a specific action

  • that a specific user wants to take
  • at that specific time
  • in that specific place

…easy. Everything else we can lock down.
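Read as an access-control rule, this is default-deny with one narrowly allowed intent: a specific user, action, time, and place. A hypothetical sketch; the names and the policy itself are invented for illustration:

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Request:
    user: str
    action: str
    place: str   # e.g. a network zone
    at: time     # wall-clock time of the request

def allowed(req: Request) -> bool:
    """Allow the one specific thing we know the user intends; deny the rest."""
    return (
        req.user == "serena"
        and req.action == "deploy"
        and req.place == "office-network"
        and time(9, 0) <= req.at <= time(17, 0)
    )

print(allowed(Request("serena", "deploy", "office-network", time(10, 30))))  # True
print(allowed(Request("serena", "deploy", "cafe-wifi", time(10, 30))))       # False
```

The allowed path can then be made frictionless precisely because everything outside it is denied by default.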

SLIDE 66

Knowing intent = usability and security without compromise

SLIDES 67–70

SLIDE 71

For your consideration:

1. Paths of Least Resistance
2. Intent

3. 4.

SLIDE 72

For your consideration:

1. Paths of Least Resistance
2. Intent
3. (Mis)communication

4.

SLIDE 73

(Mis)communication

SLIDE 74

Wherever there is a miscommunication, there exists a human security vulnerability.

SLIDE 75

What are you unintentionally miscommunicating?

SLIDE 76

SLIDE 77

Wherever there is a miscommunication, there exists a human security vulnerability.

SLIDES 78–81

SLIDE 82

(I didn’t actually do this)

SLIDE 83

https://security.googleblog.com/2018/02/a-secure-web-is-here-to-stay.html

SLIDE 84

Do your end users know what you’re trying to communicate?

SLIDE 85

What is their mental model of what’s happening, compared to yours?

SLIDE 86

For your consideration:

1. Intent
2. Path of Least Resistance
3. (Mis)communication

4.

SLIDE 87

For your consideration:

1. Intent
2. Path of Least Resistance
3. (Mis)communication
4. Mental model matching
SLIDE 88

Mental models

SLIDE 89

It’s the user’s expectations that define whether a system is secure or not.

SLIDES 90–91

SLIDE 92

–Ka-Ping Yee, “User Interaction Design for Secure Systems”, Proc. 4th Int’l Conf. Information and Communications Security, Springer-Verlag, 2002

“A system is secure from a given user’s perspective if the set of actions that each actor can do are bounded by what the user believes it can do.”

SLIDE 93

Find their model, match to that
+
Influence their model, match to system

SLIDE 94

Find their model

  • Go to customer sessions!
  • Observe end users
  • Infer intent through context
SLIDE 95

Influence their model

  • When we make, we teach
  • Whenever someone interacts with us / a thing we made, they learn.
  • Path of least resistance becomes the default “way to do things”.

SLIDE 96

How are we already influencing users’ models?

SLIDE 97

https://krausefx.com/blog/ios-privacy-stealpassword-easily-get-the-users-apple-id-password-just-by-asking

iOS Phish

SLIDE 98

What are we teaching?

SLIDE 99

“I KNOW HOW TO INTERNET”

—Serena Chen, 
 a Real Human Adult™

SLIDE 100

SLIDE 101

Understand end user mental models

SLIDE 102

SLIDE 103

What are your users’ mental models?

SLIDE 104

Review

SLIDE 105

SLIDE 106

Takeaways

  • Cross-pollination is rare. This is a missed opportunity!
  • Our jobs are about outcomes based on our specific goals
  • Align the user’s goals to your security goals
SLIDE 107

Takeaways

  • Aim to know their intent
  • Collaborate with design to craft secure paths of least resistance
  • Understand their mental model vs yours
  • Communicate to that model
SLIDE 108

One final anecdote…

SLIDES 109–111

SLIDE 112

Thanks!

Fight me @Sereeena