Effective Security for the Post-compliance Era Security Awareness - - PowerPoint PPT Presentation



SLIDE 1

Effective Security for the Post-compliance Era Security Awareness and Training for Security Specialists

  • M. ANGELA SASSE, RUHR UNIVERSITY BOCHUM & UCL
SLIDE 2

MENSCHLICH – WELTOFFEN – LEISTUNGSSTARK (“human – cosmopolitan – high-performing”, the Ruhr University Bochum motto)

The problem

SLIDE 3

ENISA Report, December 2018. Co-authored with Adam Joinson (Bath University) & Thomas Schlienger (Tree Solutions)

SLIDE 4

Typical characterisation of users:

“the lack of information security awareness, ignorance, negligence, apathy, mischief, and resistance are the root of users’ mistakes”

Safa, N. S., Von Solms, R. & Furnell, S.: Information security policy compliance model in organizations. Computers & Security, Volume 56, February 2016, Pages 70–82.

SLIDE 5

“There was little evidence that there are specific links between types of people (e.g. gender, personality) and security behaviours.”

SLIDE 6

Self-efficacy

“self-efficacy was a reliable, moderately strong predictor of cyber-security intention and behaviour. This suggests that interventions that seek to improve users’ ability to respond appropriately to cyber-threats (and belief that those responses will be effective) is more likely to yield positive results than campaigns based around stressing the threat.” Which is why FUD is so counter-productive …

SLIDE 7

Stop trying to fix people – fix your security

“by systematically approaching and analysing the current cybersecurity stance of the organisation, and carrying out an in-depth analysis of the causes of any problem(s), ENISA proposes that practitioners can take significant steps towards helping employees to act in a more secure way. This may involve skills-based training and support but may also require the restructuring of security practises and policies to better align with people’s workplace goals and/or capabilities.”

“We cannot change the human condition – but we can change the conditions under which humans work.” – James Reason

SLIDE 8

Problem 1: Impossible security

“Never give an order that can’t be obeyed.” – General Douglas MacArthur (1880–1964)

SLIDE 9

SLIDE 10

  • “152 steps to Cyber Security” (Reeder et al. 2017): security experts cannot agree on the 3 most important behaviours
  • Stobert & Biddle 2016: security experts don’t follow their own advice on password management … and take risks they decry in users
  • Don’t get me started on CAPTCHAs and self-phishing … (more later)

SLIDE 11

Problem 2: Productivity-destroying security

“Sticky plaster security” gets in the way and zaps:

  • Time
  • Attention
  • Goodwill

Thanks to Robert Watson, Cambridge University Computer Lab

SLIDE 12

SLIDE 13

"About 50 of you so far marked this message coming from [subcontractor] as phishing”

Email from HR department in a company with around 600 employees that had hired a well-known, expensive supplier to conduct employee surveys.

NCSC guidance on measures to reduce phishing https://www.ncsc.gov.uk/phishing

SLIDE 14

Problem 3: Negative security

  • Security experts try to motivate behaviour change through fear

Another example: Devil’s in your details https://www.youtube.com/watch?v=Ugl8bmZF9Pc

SLIDE 15

FUD is damaging – not just to users

“FUD provides a steady stream of factoids (e.g. raw number of malware samples, activity on underground markets, or the number of users who will hand over their password for a bar of chocolate), the effect of which is to persuade us that things are bad and constantly getting worse. […] reliance on factoids leads government and industry to spend wastefully and researchers to focus on the wrong questions.”

  • D. Florencio, C. Herley & A. Shostack (2014): FUD: A Plea for Intolerance. Communications of the ACM, June 2014.
SLIDE 16
Problem 4: Paternalistic security

  • Many studies attempting to ‘nudge’ users towards security
  • Based on behavioural economics, but overlooks key aspects of that theory – the nudgee has to want the goal and agree on the way of getting there …

SLIDE 17
Problem 5: Annoying security

  • Disruptive, creating stress, anger and other negative emotions

SLIDE 18

XKCD https://xkcd.com/1837/

SLIDE 19

Problem 6: “Security awareness”

SLIDE 20

So – how do we fix security?

  • 1. Make security possible – usability is essential, not a luxury.
  • 2. Security education cannot fix security people don’t want.
  • 3. Transforming user behaviour requires serious long-term work.
  • 4. All the good guys need to work together, not against each other.

SLIDE 21

Usable Security and Privacy: Human Factors

Back to the future – the ‘Kerckhoffs Test’

  • 1. The system must be substantially, if not mathematically, undecipherable;
  • 2. The system must not require secrecy and can be stolen by the enemy without causing trouble;
  • 3. It must be easy to communicate and remember the keys without requiring written notes; it must also be easy to change or modify the keys with different participants;
  • 4. The system ought to be compatible with telegraph communication;
  • 5. The system must be portable, and its use must not require more than one person;
  • 6. Finally, regarding the circumstances in which such a system is applied, it must be easy to use and must neither require stress of mind nor the knowledge of a long series of rules.

Auguste Kerckhoffs: ‘La cryptographie militaire’, Journal des sciences militaires, vol. IX, pp. 5–38, Jan. 1883; pp. 161–191, Feb. 1883.

SLIDE 22

Positive Security that people want

  • From ‘freedom from’ (threats) to ‘freedom to’ (do the things people want)
  • Putting user goals and values first
  • Positive language and genuine enablement
  • Employing basic usability principles and methods
  • Commitment to understanding performance and employing metrics
  • Inclusive design
SLIDE 23

Everyday Security

[…] to housing. In order to provide this service, the local authority may have data sharing agreements with private landlords, with social housing institutions, with third sector charities and with other departments across the local authority. Each one of these institutions may have details on a particular tenant. Legislation and regulation related to the sharing of this information is complex and the technological infrastructure may also be complex. In order to ensure that a citizen has access to the appropriate housing services and is able to make an informed decision about their housing needs, the local authority worker may need to access a number of different data sources in different institutions.

Not only must there be data sharing agreements that align with the legal and regulatory requirements and a technological infrastructure that enables the access to the data, there must also be a willingness to discuss the cases of particular tenants across institutions where the complexity of the case warrants it. For this discussion to be successful, both parties must be willing to share, agree on how the data should be protected and be clear on the goal of sharing that data. Contextualising the use of digital protection and transmission methods in this way is a part of everyday security in the workplace.

Source: Practising Creative Securities, Book 3 in the Everyday Security series (series editor Lizzie Coles-Kemp; editor Peter Hall; design Giles Lane | proboscis.org.uk). Published by Royal Holloway University of London, © RHUL & individual contributors 2018. ISBN 978-1-905846-83-2. Illustrations by Makayla Lewis. Funded through EPSRC grant EP/N02561X/1.

Poor performance of IT systems combined with the complexity of work processes and policies makes for a difficult working environment and often causes barriers to information sharing and information protection. Individuals often employ everyday security practices to respond to this problem by building networks of collaboration and additional information sharing between colleagues, so that common goals are reinforced and effective information sharing paths are found.

SLIDE 24

Creative Security Engagement Methods

https://www.trespass-project.eu/

SLIDE 25

  • C. P. R. Heath, P. A. Hall & L. Coles-Kemp: Holding on to dissensus: Participatory interactions in security design. Strategic Design Research Journal, 11(2): 65–78, May–August 2018.

SLIDE 26

SLIDE 27

SLIDE 28

“Culture eats strategy for breakfast.” – Peter Drucker

“Productivity eats security for breakfast, lunch and dinner.” – Angela + her research team

“Trust and collaboration ... are necessary for effective cybersecurity.” – Coles-Kemp, Ashenden & O’Hara: Why Should I? Cybersecurity, the Security of the State and the Insecurity of the Citizen. Politics and Governance, 2018, Volume 6, Issue 2, Pages 41–48.

SLIDE 29

Compliance is not enough

Learning from medicine:

  • Adherence (compliance): the patient follows the plan set out by the professional. Less than half of patients comply.
  • Concordance: patient and professional agree the most appropriate treatment plan – clarification of goals, joint responsibility. More than half of patients stick to the agreed plan.

SLIDE 30

Levels of Security Maturity

  • Level 5: Champions security to others and challenges breaches in their environment.
  • Level 4: Has internalised the intent of the policy and adopts good security practices even when not specifically required to.
  • Level 3: Understands that a policy exists and follows it by rote.
  • Level 2: Follows security policy only when forced to do so by external controls.
  • Level 1: Is not engaged with security in any capacity.

SLIDE 31
Security Dialogues

  • Ashenden & Lawrence (2016)
  • 3 × 3-day workshops, with 18 security professionals in total
  • First learnt why developers acted the way they did
  • Then: skills to engage and support staff
  • conflict resolution skills from counselling
  • social market theories of exchange and influence
  • how to design behaviour change interventions

SLIDE 32

SLIDE 33

Security training for security experts

  • 1. Change of attitude – “we’re here to help you do what you want to do, securely”.
  • 2. Must pass Human Factors 101 and Economics 101.
  • 3. Learn how to communicate, negotiate and become trustworthy.
  • 4. Know how to test the effectiveness of security measures and strive for continuous improvement.
  • 5. Use of fear tactics or user-bashing results in an automatic fail.