User education, mental models, and risk factors, by Michelle Mazurek (PowerPoint presentation)


SLIDE 1

User education, mental models, and risk factors

Michelle Mazurek With material from Lorrie Cranor

SLIDE 2

Today

  • User education
  • Risk factors: demographics and behavior
  • Folk models of security
  • Activity

SLIDE 3

USER EDUCATION

Case study: PhishGuru and Anti-Phishing Phil

SLIDE 4

Challenges in user education

  • Users are not motivated
  • Security is a secondary task
  • Risk of increasing false positives

– All you do is make them more paranoid

SLIDE 5

Is user education possible?

  • Security education “puts the burden on the wrong shoulders.”1
  • “Security user education is a myth.”2
  • “User education is a complete waste of time. It is about as much use as nailing jelly to a wall…. They are not interested.”3

  • 1. J. Nielsen, 2004. User education is not the answer to security problems.
  • 2. S. Gorling, 2006. The myth of user education.
  • 3. M. Overton, http://news.cnet.com/2100-7350_3-6125213-2.html

SLIDE 6

Evaluating existing training (2007)

  • Lab study: 28 non-experts
  • Evaluate 10 sites, 15 min break, 10 more sites

– Control break: read email, play solitaire
– Experiment break: read training materials

  • Experimental group: Better after training

– But more false positives

  • P. Kumaraguru, S. Sheng, A. Acquisti, L. Cranor, and J. Hong.

Teaching Johnny Not to Fall for Phish. ACM TOIT, May 2010.

SLIDE 7

Maybe we can nail jelly after all…

http://graeme.woaf.net/otherbits/jelly.html

SLIDE 8

PhishGuru and learning science

  • Send email that looks like phishing
  • If participant falls for it, redirect to intervention
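
The embedded-training flow above can be sketched as a tiny click handler. This is an assumed illustration, not PhishGuru's actual code; the URL path and page names are hypothetical:

```python
# Sketch of an embedded-training flow (assumed logic, not the actual
# PhishGuru implementation; paths and page names are hypothetical).

def handle_click(url_path: str) -> str:
    """Decide which page to serve when a participant clicks a link."""
    if url_path.startswith("/simulated-phish/"):
        # The participant fell for the simulated phish:
        # serve the training intervention instead of a real page.
        return "phishguru-intervention"
    return "requested-page"
```

The key design point is that training is delivered at the "teachable moment," right when the participant falls for the fake phish.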

SLIDE 9

SLIDE 10

Applies learning-by-doing and immediate feedback principles

SLIDE 11

Applies story-based agent principle

SLIDE 12

Applies contiguity principle. Presents procedural knowledge.

SLIDE 13

Applies personalization principle. Presents conceptual knowledge.

SLIDE 14

SLIDE 15

Evaluating PhishGuru

  • Two lab studies, field study
  • Second lab study: roleplay and respond to email

– Part 1: 16 emails, training, 16 more
– Part 2: 16 emails (7 days later)

  • Training: four conditions

– Email from friend, no relevant content
– Email from friend mentions phishing
– PhishGuru cartoon in email
– PhishGuru embedded (when click on link)

SLIDE 16

Results: Identifying phishing emails

[Chart: mean correctness identifying phishing emails at Before, Immediate, and Delay, for the Non-embedded, Embedded, Control, and Suspicion conditions]

SLIDE 17

Embedded helps!

[Same chart as previous slide, annotated to show learning (Before → Immediate) and retention (Immediate → Delay) for the Embedded condition]

SLIDE 18

Results: Legitimate links

[Chart: mean correctness on legitimate links stays high (≈0.86–1.00) at Before, Immediate, and Delay across conditions]

SLIDE 19

Field study at CMU

  • Investigate retention at 1, 2, and 4 weeks
  • Opt-in to all students, faculty, staff (N=515)
  • 28-day period:

– 7 simulated phishing emails
– 3 legitimate ISO messages
– Exit survey

  • Unique hash in link per participant

– Collect demographic data

  • P. Kumaraguru, J. Cranshaw, A. Acquisti, L. Cranor, J. Hong, M. A. Blair, and T. Pham.

School of Phish: A Real-World Evaluation of Anti-Phishing Training. SOUPS 2009.
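
The per-participant link scheme described above can be sketched as follows. This is an illustrative assumption, not the study's actual code; the secret key and base URL are hypothetical:

```python
import hashlib

# Hypothetical secret and base URL; the slides only say each link
# carried a unique per-participant hash.
SECRET = "study-secret"
BASE = "https://iso.example.edu/notice"

def tracking_link(participant_id: str) -> str:
    """Build a per-participant link so each click can be attributed
    to exactly one participant (and joined with demographic data)."""
    token = hashlib.sha256((SECRET + participant_id).encode()).hexdigest()[:16]
    return f"{BASE}?u={token}"
```

Hashing (rather than embedding the raw ID) keeps the link opaque while still letting the researchers map each click back to one participant.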

SLIDE 20

Email schedule

Day | Control       | One training   | Two training
1   | Test and real | Train and real | Train and real
2   | Test          | Test           | Test
7   | Test and real | Test and real  | Test and real
14  | Test          | Test           | Train
16  | Test          | Test           | Test
21  | Test          | Test           | Test
28  | Test and real | Test and real  | Test and real
35  | Exit survey   | Exit survey    | Exit survey

SLIDE 21

Sample phishing email

URL is not hidden. Plain-text email without graphics.

SLIDE 22

Phishing emails

From                | Subject
ISO                 | Bandwidth quota offer
Networking Services | Register for CMU’s annual networking event
Webmaster           | Change Andrew password
Enrollment Services | Congratulations – Plaid Ca$h
Sophie Jones        | Please register for the conference
Community Service   | Volunteer at Community Service Links
Help Desk           | Your Andrew password alert

SLIDE 23

Results

  • Training → less clicking on phishing
  • Retained training for 28 days
  • Two trainings better than one
  • No increase in false positives
  • 80% recommended CMU continue

Condition | N   | % clicked day 1 | % clicked day 28
Control   | 172 | 52.3            | 44.2
Trained   | 343 | 48.4            | 24.5
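
As a quick sanity check on the table, the relative drop in click rate from day 1 to day 28 can be computed directly: roughly 49% for trained participants (48.4 → 24.5) versus roughly 15% for the control group (52.3 → 44.2):

```python
def relative_drop(day1_pct: float, day28_pct: float) -> float:
    """Fraction by which the click rate fell between day 1 and day 28."""
    return (day1_pct - day28_pct) / day1_pct

control_drop = relative_drop(52.3, 44.2)  # ≈ 0.15
trained_drop = relative_drop(48.4, 24.5)  # ≈ 0.49
```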

SLIDE 24

Anti-Phishing Phil


SLIDE 25

Evaluation

  • 10 URLs before, 10 after, randomized

– In between, up to 15 min training

  • Training conditions: two standard tutorials, or the game
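
The before/after design can be sketched as a simple randomized split of the 20 test URLs. This is an illustration of the setup, not the study's actual code:

```python
import random

def split_urls(urls, seed=None):
    """Shuffle the 20 test URLs and split them into a pre-training
    set of 10 and a post-training set of 10 (randomized order)."""
    rng = random.Random(seed)
    shuffled = list(urls)
    rng.shuffle(shuffled)
    return shuffled[:10], shuffled[10:]
```

Randomizing the split guards against one set of URLs being systematically easier than the other.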

SLIDE 26

Results

  • All training made people more suspicious
  • No sig. difference in false negatives
  • Only game did not increase false positives
  • Lots of positive comments

– Online study with similar results

SLIDE 27

SLIDE 28

Why does it work?

  • Fun to play
  • People like to win things (even just points)
  • Fast – 10 minutes
  • Teaches actionable steps

SLIDE 29

RISK FACTORS

CASE STUDY: MALWARE

Lalonde Lévesque et al., CCS 2013

SLIDE 30

A “clinical study” of malware

  • Distributed laptops, 50 participants, 4 months

– Sold to participants below retail

  • Installed with a bunch of anti-virus
  • Configured to collect data about:

– Applications installed/updated
– Browsing, downloads, browser plugins, wi-fi

  • Monthly sessions to check for infection

– Fill out survey
– Collect additional infection data

SLIDE 31

Results: Infections

  • 38% exposed within 4 months

– Consistent over all months
– 17 via portable storage
– 1 outlier with 28 unique detections

  • 18 threats on 10 machines

– 7 unwanted, 9 adware, 1 malware, 1 maybe
– 1 fake AV scanner

SLIDE 32

Risk factors

  • Risk of exposure (detection), not infection
  • Self-reported expertise → more exposure

– Nothing for gender, age, field

  • Behavioral factors for more exposure:

– Installing more applications, visiting more sites
– Sites: mp3, peer-to-peer, gambling, sports, etc.

SLIDE 33

MENTAL MODELS

Case Study: Viruses, hackers, advice

Wash, SOUPS 2010

SLIDE 34

Folk models of security

  • Qualitative interviews, snowball sample
  • 23 in first round
  • 10 more to specifically test conclusions

SLIDE 35

Viruses and malware

  • Viruses are bad; not much else
  • Viruses are buggy software

– Must be intentionally placed on computer

  • Viruses are mischievous, annoying

– Caught by visiting bad parts of internet

  • Viruses support crime

– Identity theft, not damage

SLIDE 36

Hackers

  • Graffiti artists

– Target anyone, cause mischief, impress friends

  • Opportunistic criminals

– Steal identity/financials, targets of opportunity

  • Organized criminals

– Target important/rich people

  • Criminal contractors

– Hybrid of first and third

SLIDE 37

Effect on security advice

[Matrix: rows are the twelve pieces of advice below; columns are the four virus models (Viruses are Bad, Buggy Software, Mischief, Support Crime) and the four hacker models (Graffiti, Burglar, Big Fish, Contractor); each cell rates how important the advice seems under that model]

1. Use anti-virus software
2. Keep anti-virus updated
3. Regularly scan computer with anti-virus
4. Use security software (firewall, etc.)
5. Don’t click on attachments
6. Be careful downloading from websites
7. Be careful which websites you visit
8. Disable scripting in web and email
9. Use good passwords
10. Make regular backups
11. Keep patches up to date
12. Turn off computer when not in use

!! = very important xx = unnecessary ?? = maybe helpful, not too important

SLIDE 38

EDUCATION DESIGN

Avoiding scams on social media

SLIDE 39

In groups: Train users to avoid social media scams

  • Get free stuff, shocking videos, hoax news
  • Use learning science principles

– Learning by doing, immediate feedback, agent/story, contiguity (words + pictures), personalization (I/we/you), conceptual + procedural
– See required reading #2 for more

  • Consider mental models
  • Explain context/delivery, sketch the intervention