User education, mental models, and risk factors
Michelle Mazurek
With material from Lorrie Cranor
Today
- User education
- Risk factors: demographics and behavior
- Folk models of security
- Activity
USER EDUCATION
Case study: PhishGuru and Anti-Phishing Phil
Challenges in user education
- Users are not motivated
- Security is a secondary task
- Risk of increasing false positives
– All you do is make them more paranoid
Is user education possible?
- Security education “puts the burden on the wrong shoulders.”1
- “Security user education is a myth.”2
- “User education is a complete waste of time. It is about as much use as nailing jelly to a wall…. They are not interested.”3
- 1. J. Nielsen, 2004. User education is not the answer to security problems.
- 2. S. Gorling, 2006. The myth of user education.
- 3. M. Overton, http://news.cnet.com/2100-7350_3-6125213-2.html
Evaluating existing training (2007)
- Lab study: 28 non-experts
- Evaluate 10 sites, 15 min break, 10 more sites
– Control break: read email, play solitaire
– Experiment break: read training materials
- Experimental group: Better after training
– But more false positives
- P. Kumaraguru, S. Sheng, A. Acquisti, L. Cranor, and J. Hong.
Teaching Johnny Not to Fall for Phish. ACM TOIT, May 2010.
Maybe we can nail jelly after all…
http://graeme.woaf.net/otherbits/jelly.html
PhishGuru and learning science
- Send email that looks like phishing
- If participant falls for it, redirect to intervention
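The embedded-training flow (simulated phish → click → immediate intervention) can be sketched minimally. The training URL and in-memory log below are illustrative assumptions, not the study's implementation:

```python
# Minimal sketch of PhishGuru-style embedded training, assuming a
# hypothetical training URL and in-memory log (not the study's code).
from datetime import datetime, timezone

TRAINING_URL = "https://example.edu/phishguru/training"  # assumed
click_log = []  # stand-in for a real datastore

def handle_simulated_phish_click(participant_id: str) -> str:
    """The link in the simulated phishing email points here: a click
    means the participant fell for it, so log the event and redirect
    them to the training comic at the teachable moment."""
    click_log.append({
        "participant": participant_id,
        "clicked_at": datetime.now(timezone.utc).isoformat(),
    })
    return TRAINING_URL  # served as an HTTP 302 Location in practice
```

The point of the design is that the intervention arrives exactly when the user has just demonstrated the risky behavior, rather than as unsolicited reading.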
- Applies learning-by-doing and immediate-feedback principles
- Applies story-based agent principle
- Applies contiguity principle; presents procedural knowledge
- Applies personalization principle; presents conceptual knowledge
Evaluating PhishGuru
- Two lab studies, field study
- Second lab study: roleplay and respond to email
– Part 1: 16 emails, training, 16 more
– Part 2: 16 emails (7 days later)
- Training: four conditions
– Email from friend, no relevant content
– Email from friend mentions phishing
– PhishGuru cartoon in email
– PhishGuru embedded (when click on link)
Results: Identifying phishing emails
[Chart: mean correctness before training, immediately after, and after a delay, for the non-embedded, embedded, control, and suspicion conditions]
Embedded helps!
[Chart: same data, highlighting learning (before → immediate) and retention (immediate → delay); the embedded condition improves most and retains the gain]
Results: Legitimate links

[Chart: mean correctness on legitimate links stayed high (0.86–1.00) before training, immediately after, and after the delay]
Field study at CMU
- Investigate retention at 1, 2, and 4 weeks
- Opt-in to all students, faculty, staff (N=515)
- 28-day period:
– 7 simulated phishing
– 3 legitimate ISO messages
– Exit survey
- Unique hash in link per participant
– Collect demographic data
- P. Kumaraguru, J. Cranshaw, A. Acquisti, L. Cranor, J. Hong, M. A. Blair, and T. Pham.
School of Phish: A Real-World Evaluation of Anti-Phishing Training. SOUPS 2009.
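One way to implement the per-participant unique hash in each emailed link is an HMAC over the participant and message IDs; the secret key and URL format below are assumptions for illustration, not the study's actual scheme:

```python
# Sketch of a per-participant unique hash embedded in each emailed link,
# so clicks can be attributed to individuals. Key and URL are assumed.
import hashlib
import hmac

SECRET_KEY = b"study-secret-key"  # hypothetical server-side secret

def tracking_link(participant_id: str, email_id: str) -> str:
    token = hmac.new(
        SECRET_KEY,
        f"{participant_id}:{email_id}".encode(),
        hashlib.sha256,
    ).hexdigest()[:16]
    return f"https://example.edu/study/click?t={token}"
```

The token is deterministic per (participant, email) pair, so the server can map a click back to a person and message without exposing the raw identity in the URL.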
Email schedule
Day | Control       | One training   | Two training
0   | Test and real | Train and real | Train and real
2   | Test          | Test           | Test
7   | Test and real | Test and real  | Test and real
14  | Test          | Test           | Train
16  | Test          | Test           | Test
21  | Test          | Test           | Test
28  | Test and real | Test and real  | Test and real
35  | Exit survey   | Exit survey    | Exit survey
Sample phishing email
URL is not hidden; plain-text email without graphics
Phishing emails
From                | Subject
ISO                 | Bandwidth quota offer
Networking Services | Register for CMU’s annual networking event
Webmaster           | Change Andrew password
Enrollment Services | Congratulations – Plaid Ca$h
Sophie Jones        | Please register for the conference
Community Service   | Volunteer at Community Service Links
Help Desk           | Your Andrew password alert
Results
- Training → less clicking on phishing
- Retained training for 28 days
- Two trainings better than one
- No increase in false positives
- 80% recommended CMU continue
Condition | N   | % clicked day 1 | % clicked day 28
Control   | 172 | 52.3            | 44.2
Trained   | 343 | 48.4            | 24.5
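A quick arithmetic check on the headline numbers: the trained group's click rate roughly halves from day 1 to day 28, while the control group's declines only modestly.

```python
# Relative drop in click rate from day 1 to day 28, computed from the
# field-study numbers above.
control_day1, control_day28 = 52.3, 44.2
trained_day1, trained_day28 = 48.4, 24.5

control_drop = (control_day1 - control_day28) / control_day1  # ~15%
trained_drop = (trained_day1 - trained_day28) / trained_day1  # ~49%
```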
Anti-Phishing Phil
Evaluation
- 10 URLs before, 10 after, randomized
– In between, up to 15 min training
- Three conditions: two standard tutorials and the game
Results
- All training made people more suspicious
- No sig. difference in false negatives
- Only game did not increase false positives
- Lots of positive comments
– Online study with similar results
Why does it work?
- Fun to play
- People like to win things (even just points)
- Fast – 10 minutes
- Teaches actionable steps
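Phil's actionable steps center on reading the URL itself. A few such checks can be sketched as heuristics; these particular rules are illustrative, not the game's exact rule set:

```python
# Illustrative URL-reading heuristics in the spirit of what Phil
# teaches; not the game's actual rules.
from urllib.parse import urlparse

def suspicious_url(url: str) -> list[str]:
    """Return human-readable reasons a URL looks phishy (empty = none)."""
    reasons = []
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if "@" in parsed.netloc:
        reasons.append("'@' in the URL hides the real destination")
    if host.replace(".", "").isdigit():
        reasons.append("raw IP address instead of a domain name")
    if host.count(".") >= 3:
        reasons.append("many subdomains: read the rightmost domain")
    return reasons
```

The same idea drives the gameplay: the player learns to extract the true hostname and judge it, rather than trusting link text.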
RISK FACTORS
Case study: malware
Lalonde Lévesque et al., CCS 2013
A “clinical study” of malware
- Distributed laptops, 50 participants, 4 months
– Sold to participants below retail
- Installed with a bunch of anti-virus
- Configured to collect data about:
– Applications installed/updated
– Browsing, downloads, browser plugins, wi-fi
- Monthly sessions to check for infection
– Fill out survey
– Collect additional infection data
Results: Infections
- 38% exposed within 4 months
– Consistent over all months
– 17 via portable storage
– 1 outlier with 28 unique detections
- 18 threats on 10 machines
– 7 unwanted, 9 adware, 1 malware, 1 maybe
– 1 fake AV scanner
Risk factors
- Risk of exposure (detection), not infection
- Self-reported expertise → more exposure
– Nothing for gender, age, field
- Behavioral factors for more exposure:
– Installing more applications, visiting more sites
– Sites: mp3, peer-peer, gambling, sports, etc.
MENTAL MODELS
Case Study: Viruses, hackers, advice
Wash, SOUPS 2010
Folk models of security
- Qualitative interviews, snowball sample
- 23 in first round
- 10 more to specifically test conclusions
Viruses and malware
- Viruses are bad; not much else
- Viruses are buggy software
– Must be intentionally placed on computer
- Viruses are mischievous, annoying
– Caught by visiting bad parts of internet
- Viruses support crime
– Identity theft, not damage
Hackers
- Graffiti artists
– Target anyone, cause mischief, impress friends
- Opportunistic criminals
– Steal identity/financials, targets of opportunity
- Organized criminals
– Target important/rich people
- Criminal contractors
– Hybrid of first and third
Effect on security advice
[Table: each piece of advice below rated under each virus model (Viruses are Bad, Buggy Software, Mischief, Support Crime) and each hacker model (Graffiti, Burglar, Big Fish, Contractor)]

1. Use anti-virus software
2. Keep anti-virus updated
3. Regularly scan computer with anti-virus
4. Use security software (firewall, etc.)
5. Don’t click on attachments
6. Be careful downloading from websites
7. Be careful which websites you visit
8. Disable scripting in web and email
9. Use good passwords
10. Make regular backups
11. Keep patches up to date
12. Turn off computer when not in use
!! = very important xx = unnecessary ?? = maybe helpful, not too important
EDUCATION DESIGN
Avoiding scams on social media
In groups: Train users to avoid social media scams
- Get free stuff, shocking videos, hoax news
- Use learning science principles
– Learning by doing, immediate feedback, agent/story, contiguity (words + pictures), personalization (I/we/you), conceptual + procedural
– See required reading #2 for more
- Consider mental models
- Explain context/delivery, sketch the intervention