HFES Public Outreach Webinar Series



SLIDE 1

HFES Public Outreach Webinar Series

SLIDE 2

HFES Public Outreach Webinar Series

  • Organized by the Outreach Division
  • Complimentary for all attendees
  • Purpose: To promote the human factors/ergonomics field to members and nonmembers
  • Four webinars planned for 2018: health care (held on March 19), cybersecurity, sit/stand workstations, and robotics/exoskeletons
  • Complements the HFES Webinar Series for members. See upcoming and past webinars at http://bit.ly/HFESWebinars

SLIDE 3

HFES Webinar FAQs

1. There are no CEUs for this webinar.
2. This webinar is being recorded. HFES will post links to the recording and presentation slides on the HFES Web site within 3-5 business days. Watch your e-mail for a message containing the links.
3. Listen over your speakers or via the telephone. If you are listening over your speakers, make sure your speaker volume is turned up in your operating system and your speakers are turned on.
4. All attendees are muted. Only the presenters can be heard.
5. At any time during the webinar, you can submit questions using the Q&A panel. The moderator will read the questions following the last presentation.
6. Trouble navigating in Zoom? Type a question into Chat. HFES staff will attempt to help.
7. HFES cannot resolve technical issues related to the webinar service. If you have trouble connecting or hearing the audio, click the "Support" link at www.zoom.us.

SLIDE 4

About the Presenters

Gary C. Kessler, PhD, is a professor of cybersecurity and chair of the Security Studies & International Affairs Department at Embry-Riddle Aeronautical University in Daytona Beach, Florida. His research interests are in digital forensics and cybersecurity, particularly related to aviation and maritime applications. Gary holds degrees in mathematics, computer science, and computing technology in education, and is a Certified Information Systems Security Professional (CISSP).

Robert R. Hoffman, PhD, is a recognized world leader in cognitive systems engineering and human-centered computing. He is a Fellow of the Association for Psychological Science, a Fellow of the Human Factors and Ergonomics Society, a Senior Member of the Association for the Advancement of Artificial Intelligence, a Senior Member of the Institute of Electrical and Electronics Engineers, and a Fulbright Scholar.

Since receiving his PhD from Colorado State University under Ben Clegg and Christopher Wickens in 2014, Robert Gutzwiller has accumulated extensive research experience in human systems engineering while working for the U.S. Navy. As of Fall 2018, he will be joining the faculty at Arizona State University in Mesa. Robert's research focuses on attention modeling, human-automation interaction, and defensive cyberspace operations.

Andrew R. Dattel, PhD, is an assistant professor in the School of Graduate Studies and director of the Cognitive Engineering Research in Transportation Systems Lab at Embry-Riddle Aeronautical University.

SLIDE 5

Human Factors and the Computer Interface:

Does Easier to Use Improve Understanding?

Gary C. Kessler, Ph.D., CISSP, CCE Embry-Riddle Aeronautical University

July 2018

SLIDE 6

Human Factors and Computers

  • Human factors concepts have been integral to computers since the 1950s
  • Languages designed to be "human readable"
  • WIMP interfaces and GUIs started to appear experimentally in the 1970s and commercially in the 1980s

(c) Gary C. Kessler, 2018

SLIDE 7

COBOL and "Manager Readability"

  • COBOL is the mother of cross-platform software
  • Designed to be "human readable"
  • Does readable make it understandable?

SLIDE 8

The WIMP Interface

  • Windows, icons, menus, pointer (alternatively: mouse, pull-down menus)
  • Developed at Xerox PARC for the Xerox Alto (1973)
  • First seen commercially in the Xerox Star (1981)
  • Popularized by the Apple Macintosh (1984)
  • Improved the human-computer interface (HCI) by creating real-world analogies (e.g., desktop, files, folders, trash)
  • Designed for non-technical users

SLIDE 9

[image-only slide]

SLIDE 10

Do Users Understand Computers?

  • All major OSes today have a GUI
  • Apple Mac OS X, Linux X-Windows, Microsoft Windows
  • GUIs allowed more people to use computers even if they didn't have a priori knowledge of how computers work
  • Is this why so many people have problems understanding even the most rudimentary error message?

SLIDE 11

Leonard: Something's wrong, I'm not getting any gas. Anybody know anything about internal combustion engines?
Sheldon: Of course.
Raj: Very basic.
Howard: 19th-century technology.
Leonard: Does anybody know how to fix an internal combustion engine?
Sheldon: No.
Howard: No, not a clue.

The Big Bang Theory, Series 04, Episode 19, "The Zarnecki Incursion"

SLIDE 12

PCs Made Us All Security Managers

  • Since the advent of PCs, home computers, BBSes, home networks, Internet access for the masses, cable modem/DSL access...
  • ...and mobile devices that are portable Internet terminals and are replacing desktop systems...
  • ...every user has become an information security and cyberdefense manager
  • For which they have no training, insufficient tools, and inadequate knowledge

SLIDE 13

Security Tools and Interfaces

  • Users have been told to install anti-virus and anti-malware software on their systems, and firewalls on their networks
  • Once installed, can they be reliably configured?

SLIDE 14

Do GUIs Help With Understanding?


SLIDE 15

[image-only slide]

SLIDE 16

Does Visualization Add Meaning?

  • Do users understand visual output from security software and security Web sites?
  • Do pretty pictures have credibility?
  • Do pictures tell a (useful) story?
  • Do pictures accurately tell the story that the viewer thinks they tell?
  • Do they accurately tell the story that the designer thinks they tell?

SLIDE 17

[image-only slide]

SLIDE 18

[image: CyberWar Map, National Security Archive]

SLIDE 19

Do Users Understand Emerging Technologies?

  • Our favorite emerging buzzwords:
  • Artificial Intelligence
  • Machine Learning
  • Data Science
  • Virtual Reality
  • Augmented Reality
  • Blockchain
  • Does better access to data help us understand it, or exacerbate what we don't understand?

SLIDE 20

Do Users Understand the Implications?

All headlines since 09/2017:

  • How AI will underpin cyber security in the next few years
  • AI a threat to cyber security, warns report
  • AI and Machine Learning in Cyber Security
  • What Zen Teaches About Insights
  • Artificial intelligence and cybersecurity: The real deal
  • How Will Artificial Intelligence And Machine Learning Impact Cyber Security?

SLIDE 21

For my birthday I got a humidifier and a de-humidifier... I put them in the same room and let them fight it out.

Steven Wright

SLIDE 22

Speaker Contact Information

Gary C. Kessler, Ph.D., CCE, CISSP
Embry-Riddle Aeronautical University
Daytona Beach, Florida, USA 32174
mobile: +1 802-238-8913
office: +1 386-226-7947
e-mail: gary.kessler@erau.edu, gck@garykessler.net
Skype: gary.c.kessler
https://www.garykessler.net

SLIDE 23

Human Factors and Cyberwork: Opportunities and Challenges

Robert Hoffman Institute for Human and Machine Cognition

HFES Webinar July 2018

SLIDE 24

Human Factors in Cyberwork

  • A seat at the table?
  • Milliseconds versus months
  • Adversaries, hacks, and deceptions
  • Macrocognitive work system
  • Designer-Centered Design is driving the tech
  • Work analysis and design
  • Task analysis at a fine grain
  • Usefulness, usability, understandability
  • Workforce/training issues (lotsa novices!)
  • Expertise, proficiency, scaling

Definition: Keeping computer networks operational and trustworthy

SLIDE 25

Human Factors in Cyberwork

Moving-target issues:

  • Scale and tempo
  • Adaptive adversaries
  • Need for human-centered (or work-centered) support tools for network mapping, netflow monitoring, network analysis, and malware detection

SLIDE 26

Human Factors in Cyberwork

Dissolving traditional distinctions:

  • Need for accelerated technology transition
  • Specification of requirements versus elicitation of "desirements"
  • Integration of training, experimentation, and operations
  • Usability, learnability, observability

SLIDE 27

Human Factors in Cyberwork

Teaming issues:

  • Cyberwork is teamwork
  • Numerous roles (intelligence support, host analysis, network analysis; Linux, Windows, etc.)
  • Roles are highly interdependent
  • Shared understanding, shared information
  • Resource/manpower issues
  • Training issues

SLIDE 28

Primary Tasks

  • Network maintenance/analysis
  • Vulnerability analysis/assurance
  • Mission support
  • Intrusion detection
  • Incident response (the "hasty" mission)
  • Readiness assessment (preparing for the future)
  • Deliberate mission (pre-positioning for a possible incident)

SLIDE 29

Challenges

(Hint: Leave your textbooks at home.)

  • Factorial manipulation of variables and controlled circumstances is unwieldy and under-informative in cyberspace experimentation.
  • Traditional experiment designs are prohibitively resource- and time-intensive (incident type x tools x experience level x scenarios, etc.).
  • Isolating the unpredictability and complexity of the world results in sterile, unrealistic conditions. Cyberspace work is necessarily subject to vagaries and unanticipated circumstances (e.g., the customer does not have a map of their own network; computers do not initialize efficiently; and so forth).
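The combinatorial burden behind the "incident type x tools x experience level x scenarios" point can be made concrete. The factor names and level counts below are invented for illustration only:

```python
from itertools import product

# Hypothetical factor levels for a fully crossed cyber-defense experiment.
incident_types = ["phishing", "ransomware", "insider", "ddos"]   # 4 levels
tools = ["tool_a", "tool_b", "tool_c"]                           # 3 levels
experience = ["novice", "journeyman", "expert"]                  # 3 levels
scenarios = ["s1", "s2", "s3", "s4", "s5"]                       # 5 levels

# Every combination is one experimental cell -- before any replication.
conditions = list(product(incident_types, tools, experience, scenarios))
print(len(conditions))  # 4 * 3 * 3 * 5 = 180 cells
```

Even this modest design yields 180 cells; with a few expert participants per cell, the demand for scarce analyst time grows past what most programs can afford, which is exactly the resource problem the slide describes.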

SLIDE 30

Logistics for Human Factors Researchers

  • Need to achieve at least a Senior Apprentice level of proficiency.
  • Need an opportunity to "live in the domain."
  • Security issues.
  • Partnering opportunities: the private sector, cyber "Centers of Excellence," cyber "hack-a-thons."

SLIDE 31

Goodies

  • Trent, S., Hoffman, R.R., Merritt, D., & Smith, S.J. (in press). Modelling the cognitive work of Cyber Protection Teams. Cyber Defense Review.
  • Hoffman, R.R. (in press). Campaign of experimentation for cyber operations. Cyber Defense Review.
  • Trent, S., Hoffman, R.R., & Lathrop, S. (2016, May). Applied research in cyberdefense operations: Difficult but critical. Cyber Defense Review. [http://www.cyberdefensereview.org]
  • Carvalho, M., Hoffman, R.R. (with seven others) (2013). MTC2: A command and control framework for moving target defense and cyber resilience. In Proceedings of the 6th International Symposium on Resilient Control Systems (ISRCS) (pp. 175-180). New York: IEEE.
  • Bunch, L., Bradshaw, J.M., Hoffman, R.R., & Johnson, M. (2015, May/June). Principles for human-centered interaction design, Part 2: Can humans and machines think together? IEEE Intelligent Systems, 68-75.
  • Bunch, L., Bradshaw, J.M., & Vignati, M. (2013). The Netflow Observatory: An interactive 3-D event visualization. In Proceedings of VizSec 2013: Visualization for Cyber Security. New York: IEEE. [http://vizsec.org/vizsec2013/]

SLIDE 32

Modelling the Cognitive Work of Cyber Protection Teams

Colonel Stoney Trent, U.S. Army War College
Robert Hoffman, Institute for Human and Machine Cognition
Lieutenant Colonel David Merritt, U.S. Cyber Command
Captain Sarah Smith, U.S. Cyber Command

In Press, Cyber Defense Review

SLIDE 33

[Concept map: the Cyber Protection Team (CPT) work cycle across four phases — Planning and Logistics, Monitoring and Collection, Analysis & Synthesis, and Closure — linking orders, approvals, and resources; information exchange with the supported organization; sensor deployment; host and network data collection (passive and active); network and host analysis; threat and terrain characterization; and closure activities such as reports and briefings, findings and recommendations, and follow-on training.]

SLIDE 34

Overlays of Individual Tasks. Example: Sensor Deployment

SLIDE 35

[The same CPT workflow concept map as on Slide 33, with the portions evaluated in all experimental tasks highlighted — e.g., Planning and Coordination.]

SLIDE 36

Uses of the Workflow Models

  • Identify aspects of CPT performance that can be readily observed and measured.
  • Document which roles are involved with particular tasks.
  • Serve as a "checklist" to track performance and CPT qualifications.
  • Support training, by conveying the work at multiple levels of detail.
  • Highlight CPT activities and functions that can only be conducted by human decision makers, underscoring the importance of training to high levels of proficiency and expertise.

SLIDE 37
  • Reveal work design issues, such as bottlenecks and capability gaps.
  • Provide focus for discussion of work methods, desired tool functionalities, and best (vs. sub-optimal) practices.
  • Represent and compare mission differences and service differences, down to the level of individual CPTs.
  • Identify aspects of the work that demand additional or better technology support.
  • Track changes in CONOPS.
  • Inform the design of campaigns of experimentation.
SLIDE 38

Thank you!

rhoffman@ihmc.us

SLIDE 39

Human Role in Cyberspace Defense

Studying cyberspace operations is tough, constantly evolving, but doable.

  • Dr. Robert S. Gutzwiller, now at Arizona State University, Mesa, AZ

Thoughts and opinions are my own and do not reflect the official policy or position of any US Government agency or affiliates.

SLIDE 40

[image-only slide]

SLIDE 41

https://creativecommons.org/licenses/by-sa/2.0/ Photo Credit: Heather Katsoulis https://www.flickr.com/photos/hlkljgk/911016819

SLIDE 42

Hollywood Cyber

SLIDE 43

Cyber Defender Reality

  • The exploding number of devices meant that more people were monitoring more data

SLIDE 44

Why cyber defense analysts?

  • Tons of data, versus the targeted scouting and infiltration of specific targets (as in attacking)
  • Time: still slow enough because of the interfaces involved
  • A central node monitoring lots of other systems; consequences are huge, and safety can be involved
  • A job that is both "boring yet responsible," but one that can also be overwhelming
  • Tools are not user-centered

Any improvement in this system is a net gain.
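The monitoring burden described above (many people watching data from many systems at a central node) is, at bottom, a detection problem over event streams. As a toy illustration only — the event format and the k-sigma cutoff are invented, not a real detection rule — a sketch of flagging unusually noisy hosts:

```python
from collections import Counter
from statistics import mean, stdev

def flag_noisy_hosts(events, k=2.0):
    """Flag hosts whose event count exceeds mean + k standard deviations.

    `events` is an iterable of (host, event_type) pairs; the k-sigma
    cutoff is a toy heuristic for illustration, not a real detection rule.
    """
    counts = Counter(host for host, _ in events)
    values = list(counts.values())
    if len(values) < 2:  # stdev needs at least two hosts
        return []
    cutoff = mean(values) + k * stdev(values)
    return [host for host, c in counts.items() if c > cutoff]
```

Even a trivial filter like this hints at the human factors issue: someone must still choose the threshold, interpret the flags, and triage the false alarms, which is precisely the work the tools rarely support well.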

SLIDE 45

My limited view of the larger system

  • Information Security
– Confidentiality: only the right people have access
– Availability: the resource can be provided
– Integrity: the resource can be trusted / has not been manipulated or corrupted
  • Cyber Security – the part of INFOSEC covering IT systems & networks
– Requires monitoring, especially for disruptions, potential malicious code, unusual traffic, incidents
– Multiple tasks ongoing, reporting required, teaming required, HIGH STAKES
  • Network Security – the subset FOCUSED ON THE NETWORK (sometimes small, sometimes massive)

SLIDE 46

Cyber Security

Confidentiality: is your system open when it should be closed? Protecting and keeping secrets, revealing them only to those who should know them.

– Examples: Office of Personnel Management breach; Target breach; Sony hack
– User security settings / privacy settings: are you sharing data the way you want to share data?

Availability: is your system/information reachable and functioning? Can you provide your resources, communicate, and troubleshoot?

– Examples: Distributed Denial of Service (DDoS) attacks; data deletion attacks; ransomware

Integrity: can you trust your systems/information? Compromise means "software and critical data within your networks and systems are compromised with malicious or unauthorized code or bugs" (Gault, 2015).

– Examples: viruses and malware, unauthorized access/privileges

  • Network Security – malicious code, unusual traffic, incidents, reporting

Gault, Mike. (2015, December 20). The CIA Secret to Cybersecurity That No One Seems to Get. Retrieved from https://www.wired.com/2015/12/the-cia-secret-to-cybersecurity-that-no-one-seems-to-get/
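One concrete mechanism commonly used to check the integrity property is comparing a file's cryptographic hash against a previously recorded known-good value. A minimal sketch — the function names are illustrative, not from the slides:

```python
import hashlib

def sha256_of_file(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def integrity_ok(path, known_good_digest):
    """True if the file still matches a previously recorded digest."""
    return sha256_of_file(path) == known_good_digest
```

If even one byte of the file changes, the digest changes, so a mismatch signals possible manipulation or corruption — the integrity failure mode described above.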

SLIDE 47

My (narrower) view of cyber analysts

Any human on the loop for NETSEC, who is part of assessing data from the network, sharing information, or making decisions based on that data/activity for defense.

Horn & D’Amico 2011

Horn, Chris, and Anita D'Amico. 2011. "Visual Analysis of Goal-Directed Network Defense Decisions." VizSec '11.

SLIDE 48

D'Amico & Whitley (2008), The Real Work of Computer Network Defense Analysts

SLIDE 49

What topics from HF/E are useful to these jobs?

  • Gutzwiller, Fugate, Sawyer & Hancock (2015)
– Initial survey of topics in human factors that should improve network defense
  • Improvements made to training methodology
  • Studying the attention allocation of analysts, including the influence of vigilance
  • Improving the human-machine teaming of any automated systems or interfaces (e.g., intrusion detection systems)
  • Understanding what goals, tasks, and information are needed to perform the job and gain situation awareness (and viewing this from distributed SA perspectives as well)

Gutzwiller, Robert S., Sunny Fugate, Benjamin D. Sawyer, and P. A. Hancock. 2015. “The Human Factors of Cyber Network Defense.” Proceedings of the Human Factors and Ergonomics Society Annual Meeting 59 (1): 322–26. doi:10.1177/1541931215591067.

SLIDE 50

What has been done?

  • Programming / engineering alone has not accomplished enough
– Billions of dollars spent on software, but most not developed or tested with the user (Staheli et al. 2014)
– Surveys find hundreds of tools on analysts' workstations... a lot go unused
  • Many analysts mention Microsoft Excel as the best tool they have!
  • Experiments are few, but increasing (how do we know how effective or ineffective solutions are?)
– Low N; some methodological concerns; limits to ecological validity
  • Design is taking a serious role but needs adoption, and professional HF/E staff (McKenna et al. 2015)

Mckenna, Sean, Diane Staheli, and Miriah Meyer. 2015. "Unlocking User-Centered Design Methods for Building Cyber Security Visualizations." 2015 IEEE Symposium on Visualization for Cyber Security (VizSec), October. IEEE, 1–8. doi:10.1109/VIZSEC.2015.7312771.

SLIDE 51

https://xkcd.com/1349/ https://creativecommons.org/licenses/by-nc/2.5/

Why is work in this area tough?

SLIDE 52

Why is work in this area tough?

  • Dynamic / evolving: policy changes, organizations change, authorities change, tactics, threat picture, systems (IoT)
  • Limited access to analysts, observations, prior work
  • Environment and finding resources
  • Emerging area of study
SLIDE 53

Enormous potential, and room for improvement

  • Everything from communication and teaming to improvements in training, data display and synthesis, and design
– CTAs or similar types of data: they are USEFUL... but also limited
– Many studies have a very small N
– More studies suffer from methodological issues
  • Any good approximations of cyber / network security tasks?
– CIAT (developed by the Air Force Research Lab, publicly available)
– DEXTAR (Defense Exercises for Team Awareness Research) @ ASU
– A forthcoming testbed developed as part of my own work

SLIDE 54

What to do?

  • Reach out. Science-focused gov't agencies are anxious to apply your research to cyberspace
– Work with partners in the DoD S&T enterprise will be very challenging but will open doors (my opinion)
  • Study / observe the analysts
– Comes with the challenge of ACCESS at times, but there has been a proliferation of NETSEC in business and academia
– You can find information / IT security operations centers for cyber on your campus and at various companies
  • Conduct more experiments. Simple reaction time and accuracy may be enough!
  • The "end" user (you and me, to Google) is critical, but it is not the only cyber weakness
– Phishing/privacy settings etc. are not the only HF/E games in town
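The "simple reaction time and accuracy may be enough" suggestion is cheap to act on: a bare-bones summary of trial data already yields measures that can be compared across interfaces or tool conditions. A minimal sketch, with an invented trial format:

```python
def summarize_trials(trials):
    """Summarize a detection-task experiment: accuracy and mean RT.

    `trials` is a list of (rt_seconds, correct) tuples -- an invented
    minimal format. Mean RT is computed over correct trials only.
    """
    if not trials:
        return {"accuracy": 0.0, "mean_rt": None}
    correct_rts = [rt for rt, ok in trials if ok]
    accuracy = len(correct_rts) / len(trials)
    mean_rt = sum(correct_rts) / len(correct_rts) if correct_rts else None
    return {"accuracy": accuracy, "mean_rt": mean_rt}
```

Running the same detection task under two tool conditions and comparing these two numbers is already a usable experiment, without any elaborate instrumentation.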

SLIDE 55

Broad Resources*

  • NICE (National Initiative for Cybersecurity Education) Cybersecurity Workforce Framework (NIST)
  • https://www.sans.org/
  • https://www.dhs.gov/topic/cybersecurity
  • https://www.defense.gov/News/Special-Reports/0415_Cyber-Strategy/

*Note: I did not create and do not maintain any of these.

SLIDE 56

Cognitive Task Analyses / Operator Assessments

IN ADDITION TO THE WORK OF ROBERT HOFFMAN:

  • Craig, R., Tryfonas, T., & May, J. (2014). A viable systems approach towards cyber situational awareness. IEEE International Conference on Systems, Man, and Cybernetics, 1405–1411. http://doi.org/10.1109/SMC.2014.6974112
  • D'Amico, A. D., & Kocka, M. (2005). Information assurance visualizations for specific stages of situational awareness and intended uses: Lessons learned. IEEE Workshop on Visualization for Computer Security (VizSEC), 107–112. http://doi.org/10.1109/VIZSEC.2005.1532072
  • D'Amico, A. D., & Whitley, K. (2008). The real work of computer network defense analysts: The analysis roles and processes that transform network data into security situation awareness. In J. Goodall, G. Conti, & K. Ma (Eds.), Proceedings of the Workshop on Visualization for Computer Security. Springer Berlin Heidelberg.
  • D'Amico, A. D., Whitley, K., Tesone, D., O'Brien, B., & Roth, E. (2005). Achieving cyber defense situational awareness: A cognitive task analysis of information assurance analysts. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 49(3), 229–233. http://doi.org/10.1177/154193120504900304
  • Erbacher, R. F., Frincke, D. A., Wong, P. C., Moody, S., & Fink, G. (2010). Cognitive task analysis of network analysts and managers for network situational awareness. IS&T/SPIE Electronic Imaging, 75300H–12. http://doi.org/10.1117/12.845488
  • Goodall, J. R., Lutters, W., & Komlodi, A. (2004). I know my network: Collaboration and expertise in intrusion detection. Proceedings of the ACM Conference on Computer Supported Cooperative Work, 342–345. http://dl.acm.org/citation.cfm?id=1031663
  • Gutzwiller, R. S., Hunt, S. M., & Lange, D. S. (2016). A task analysis toward characterizing cyber-cognitive situation awareness (CCSA) in cyber defense analysts. IEEE CogSIMA, 14–20.

SLIDE 57

Cognitive Task Analyses / Operator Assessments (cont)

  • Horn, C., & D'Amico, A. D. (2011). Visual analysis of goal-directed network defense decisions. Proceedings of the 8th International Symposium on …. http://dl.acm.org/citation.cfm?id=2016909
  • Mahoney, S., Roth, E., Steinke, K., Pfautz, J., Wu, C., & Farry, M. (2010). A cognitive task analysis for cyber situational awareness. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 54, 279–283.
  • Paul, C. (2014). Human-centered study of a network operations center: Experience report and lessons learned. Proceedings of the ACM Workshop on Security Information Workers, 39–42. http://dl.acm.org/citation.cfm?id=2663899
  • Paul, C., & Whitley, K. (2013). A taxonomy of cyber awareness questions for the user-centered design of cyber situation awareness. In L. Marinos & I. Askoxylakis (Eds.), Lecture Notes in Computer Science: HAS/HCII 2013 (pp. 145–154). Springer-Verlag Berlin Heidelberg.
  • Pfleeger, S., & Caputo, D. (2012). Leveraging behavioral science to mitigate cyber security risk. Computers & Security, 31(4), 597–611. http://www.sciencedirect.com/science/article/pii/S0167404811001659
  • Thompson, R., Rantanen, E., & Yurcik, W. (2006). Network intrusion detection cognitive task analysis: Textual and visual tool usage and recommendations. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 50, 669–673. http://doi.org/10.1037/e577592012-011
  • Tyworth, M., Giacobe, N. A., & Mancuso, V. F. (2012). Cyber situation awareness as distributed socio-cognitive work. Cyber Sensing - Proceedings of SPIE, 8404. http://doi.org/10.1117/12.919338
  • Tyworth, M., Giacobe, N. A., Mancuso, V. F., & Dancy, C. (2012). The distributed nature of cyber situation awareness. IEEE International Inter-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), 174–178. http://doi.org/10.1109/CogSIMA.2012.6188375

SLIDE 58

Contact Information

  • Robert.Gutzwiller@asu.edu
  • Website:

https://www.researchgate.net/profile/Robert_Gutzwiller

  • Google Scholar:

https://scholar.google.com/citations?user=PPKouk4AAAAJ&hl