

SLIDE 1

Reflecting on Visualization for Cyber Security

Carrie Gates • carrie.gates@ca.com Sophie Engle • sjengle@cs.usfca.edu

SLIDE 2

Sophie J. Engle • sjengle@cs.usfca.edu Department of Computer Science

INTRODUCTION

Evaluating Cybersecurity Visualizations • Seattle, Washington • June 4, 2013 Reflecting on Visualization for Cyber Security


SLIDE 3

Introduction

  • Short position paper
  • Result of brainstorming session
    – Identify future research directions
    – Suggest approaches for future research
  • Designed to encourage discussion


SLIDE 4

Brainstorming

  • Why has visualization not been more successful in cyber security?
  • How can visualization be used effectively for cyber security?
  • How do you evaluate visualization for cyber security?


SLIDE 5

Motivation

  • Success is important
    – Extensive resources required to develop, evaluate, and iterate visualizations
  • Success is evasive
    – Avoid common pitfalls
    – Choose a suitable visualization goal
  • Success is fuzzy
    – Accuracy and efficiency are hard to evaluate


SLIDE 6

COMMON PITFALLS

What Should We Avoid?


SLIDE 7

XKCD: Convincing


http://xkcd.com/833/

SLIDE 8

Using visualization for the wrong reasons.


SLIDE 9

Using visualization for the sake of visualization.


SLIDE 10

Visualization Goals

  • Statistical Graphics
    – Accuracy, Informative
  • Informative Art / Visualization Art
    – Aesthetics
  • Infographics
    – Aesthetics, Informative
  • Information Visualization
    – Accuracy, Informative, Aesthetics


SLIDE 11

Pretty Pictures ≠ InfoVis

  • Avoid by specifying a question or goal first
  • Do NOT get distracted by fancy encodings
  • Do NOT get distracted by novel techniques
  • Start with existing and well-tested techniques
  • Try state-of-the-art or novel approaches when other techniques fail to perform well


SLIDE 12

Visualization is not a magic bullet.


SLIDE 13

Goldilocks Principle


http://w8r.com/the-colorful-story-book/the-three-bears

SLIDE 14

Goldilocks Principle

  • Too Simple Problems
    – Do not need visualization
  • Too Complex Problems
    – Better renamed "too undefined"
    – Part of the solution, but not THE solution
  • Problem must be "just right"
    – Need good data and good problems


http://w8r.com/the-colorful-story-book/the-three-bears

SLIDE 15

USE CASES

What Could We Try?


SLIDE 16

Use Cases

  • Visualization for a Specific Goal
  • Visualization for Exploration
  • Visualization as a Stepping Stone
  • Visualization for Evaluation
  • Visualization as Evidence


SLIDE 17

Visualization for a Specific Goal

  • Must be accurate and informative
  • Must support data analysis
    – Anomaly detection flags an event as anomalous, but it is unknown whether it is malicious
    – Use visualization to help resolve this grey area
  • Evaluated on a case-by-case basis
  • All other cases are subcases of this one


SLIDE 18

Visualization for Exploration

  • Sometimes not having a well-formed question is the problem!
  • Use visualization to explore data, provide context, and help form questions
  • More difficult to evaluate; may lose usefulness after the question is formed


SLIDE 19

Visualization as a Stepping Stone

  • Use visualization as a stepping stone in analysis
    – Guide root cause analysis in a complex environment
  • Neither the starting point nor the ending point
    – Does not provide the question
    – Does not provide the answer
  • Provides context; more exploratory in nature


SLIDE 20

Visualization for Evaluation

  • Aid evaluation of security mechanisms
    – Mechanisms must support complex policies
    – Multiple mechanisms protecting resources
    – Difficult to configure and maintain
  • Does not replace mechanisms, only improves usage of those mechanisms


SLIDE 21

Visualization as Evidence

  • Justification for response to a cyber threat
    – A security analyst may need to justify changes to infrastructure to decision makers
  • Illustrate evidence of an attack
    – Presenting forensic evidence to a jury
  • More focused on storytelling than analysis


SLIDE 22

EVALUATION

How Do We Know What Works?


SLIDE 23

Evaluation

  • Evaluation focused on visualization
    – The focus in the visualization community (85%)
    – Focus on pushing the boundaries of visualization
  • Evaluation focused on the data analysis process
    – Focus on the application of visualization
    – Less research on this type of evaluation
    – Important for cyber security visualization


SLIDE 24

User Performance Evaluation

  • Large study
    – Cannot require expert knowledge
    – Simple and measurable tasks
    – Possible for realistic cyber security tasks?
  • Small study
    – Requires domain experts
    – More complex but still measurable tasks
    – Applicability of results to other environments?


SLIDE 25

User Experience Evaluation

  • Recruitment still an issue
    – Release visualization for anyone to use
    – Track adoption rate
    – Solicit feedback from users
  • Usually requires expert users
    – Must use the tool in its environment for a specific task
    – Usage often needs to be measured over time


SLIDE 26

Process Evaluation

  • Focused less on techniques, more on tools
    – Techniques are broadly applicable
    – Tools must be evaluated within the context in which they are used
  • Focus on understanding the environment
    – Independent of any visualization tools
  • Focus on the visual data analysis process
    – Dependent on the visualization tools in use


SLIDE 27

Environment Evaluation

  • Perform evaluation as a precursor to building a visualization tool
    – Helps identify the problem and visualization goal
  • Evaluate how existing tools are used
    – Identify how to improve or supplement tools
  • Data collected via field or lab observation, surveys, or interviews


SLIDE 28

Analysis Process Evaluation

  • How well the tool supports data exploration and knowledge discovery
  • How well the tool allows the analyst to generate hypotheses and make decisions
  • Often conducted via case studies
    – Target set of actual users
    – Realistic needs
    – Realistic evaluation


SLIDE 29

CONCLUSION

Reflecting on Visualization for Cyber Security


SLIDE 30

XKCD: The Important Field


http://xkcd.com/970/

SLIDE 31

Conclusion

  • Short position paper reflecting on cyber security visualization
  • Brainstorming on what to avoid, what to try, and how to evaluate future research
  • Highlights the importance of the cyber security problem and visualization goal
  • Designed to be part of a discussion


SLIDE 32

THE END

Questions, Comments, or Discussion?
