
SLIDE 1

Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND NO. 2011-XXXXP

Intelligence, Surveillance, Reconnaissance, and Analysis Systems are Mostly Failing Users

(Fortunately, it’s a Fixable Problem)

A Talk for the Laboratory for Analytic Sciences, North Carolina State University 17 August 2016

Laura A. McNamara, PhD Sandia National Laboratories Albuquerque, NM 87015

SLIDE 2

Topical Outline

  • Human Analytics at Sandia
  • What’s a Failed System?
  • Design Practices for Usable, Useful, and Adoptable Systems
  • Visual Cognition and Human-Information Interaction

SLIDE 3

Human Analytics at Sandia National Laboratories

SLIDE 4

Human Analytics (“Analytics for Humans”)

  • Mika Armenta, Cognitive Psychology (intern)
  • Leif Berg, Human-Computer Interaction
  • Karin Butler, Cognitive Psychology
  • Kerstan Cole, Human Factors
  • Kristin Divis, Cognitive Psychology
  • Michael Haass, Signal Processing/Physics
  • John Ganter, Geography and Human Factors
  • Laura Klein, Physics and Radar Engineering
  • Laura Matzen, Cognitive Neuroscience
  • Laura McNamara, Organizational Anthropology
  • J. Daniel Morrow, Robotics/Computer Science
  • Susan Stevens-Adams, Cognitive Psychology
  • David Stracuzzi, Computer Science/Mathematics

SLIDE 5
  • Mixed-method design studies for human-information interaction system design
  • Operational interface design and evaluation for remote sensing systems in near-real-time environments
  • Visual search performance in analytic workflows, from imagery analysis to cybersecurity
  • Modeling domain-specific top-down visual attention to understand expertise
  • Field research to develop interaction design models that support human perceptual and cognitive work in high-density, high-throughput work environments
  • Novel methods for capturing and studying visual workflows (see the sketch after this list):
  • Trajectory analysis applied to gaze data
  • Gaze-informed information foraging models in dynamic, user-driven workflows
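One building block behind these gaze methods, shown purely for illustration: before any trajectory or foraging analysis, raw gaze samples must be grouped into fixations. Below is a minimal Python sketch of dispersion-threshold fixation identification (I-DT, Salvucci & Goldberg 2000); it is not Sandia's actual pipeline, and the threshold values are hypothetical placeholders.

```python
# Minimal I-DT (dispersion-threshold) fixation detection sketch.
# Not the pipeline described in this talk; thresholds are illustrative.
import numpy as np

def detect_fixations(x, y, t, max_dispersion=1.0, min_duration=0.1):
    """Group raw gaze samples into fixations.

    x, y : gaze coordinates (e.g., degrees of visual angle), NumPy arrays
    t    : sample timestamps in seconds
    Returns a list of (start_time, end_time, centroid_x, centroid_y).
    """
    fixations = []
    i, n = 0, len(t)
    while i < n:
        j = i + 1
        # Grow the window [i, j] while its spatial dispersion stays small.
        while j < n:
            wx, wy = x[i:j + 1], y[i:j + 1]
            if (wx.max() - wx.min()) + (wy.max() - wy.min()) > max_dispersion:
                break
            j += 1
        # Window [i, j-1] is the longest low-dispersion run starting at i.
        if t[j - 1] - t[i] >= min_duration:
            fixations.append((t[i], t[j - 1], x[i:j].mean(), y[i:j].mean()))
            i = j          # resume after the fixation
        else:
            i += 1         # too brief to count: slide forward one sample
    return fixations
```

The resulting fixation sequences are the raw material for the trajectory and foraging analyses listed above.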

Human-System Interaction R&D…

SLIDE 6

SLIDE 7

…in an electronics/RF engineering organization.

Image from Sandia National Laboratories, www.sandia.gov/radar/

SLIDE 8

SLIDE 9

Some Observations on Systems that Work and Systems that Don’t

SLIDE 10

We’ve spent an enormous amount of money on technology to capture, relay, store, disseminate, and analyze data. We have failed to invest equally in making these systems usable, useful, and adoptable for their intended operators, analysts, users, teams, groups, organizations, institutions…

SLIDE 11

What makes a system a failed system?

SLIDE 12

A Failed System is Present-at-Hand


Take, for example, a hammer: it is ready-to-hand; we use it without theorizing. In fact, if we were to look at it as present-at-hand, we might easily make a mistake. Only when it breaks or something goes wrong might we see the hammer as present-at-hand, just lying there. – Martin Heidegger, Being and Time (emphasis added)

SLIDE 13

A failed system is one that pulls us away from the problems we want to engage, requiring us to focus, for the moment, on the system itself. Failed systems are selfish technologies. They require that we attend to their needs before we can pursue meaningful work. They frequently interrupt us with their demands.

SLIDE 14

For example, ground control stations

Image from General Atomics, http://www.ga-asi.com/ground-control-stations-library

SLIDE 15

Ready-to-hand means…

SLIDE 16

Good Technology is Ready-to-Hand

  • Usable. Good technology leverages existing principles and standards for human-system interaction, so that systems are learnable, discoverable, and memorable.
  • Useful. Good technology enables people to perform work that is individually meaningful and organizationally impactful.
  • Adoptable. Good technology can be integrated into existing infrastructure, workflows, communication practices, and organizational standards.

SLIDE 17

Excel is Ready-to-Hand

SLIDE 18


Citrix: Ready-to-Hand in the right contexts

SLIDE 19

How do we get to “ready-to-hand”?

SLIDE 20

Actually, it’s pretty much a solved problem

  • Usable. Look to the Nielsen heuristics and platform-specific design guidance (Microsoft, Apple).
  • Useful. Cognitive Work Analysis, Cognitive Task Analysis.
  • Adoptable. Cognitive Work Analysis, enterprise analysis.

Document Context of Work → Ideate for Design → Implement as Prototype → Evaluate and Learn → Re-Design

  1. Ditch waterfall.
  2. Engage the user community as designers.
  3. Prototype and evaluate frequently.
  4. Wash and repeat until there’s nothing left to wring out.
SLIDE 21

Studying Work Environments for Better Design

  • Hmm. What are they actually doing? What are they using – and why?
  • What are the key tasks I should focus on?
  • How do I represent what I’ve learned about the work environment?
SLIDE 22

“Talking to users” isn’t enough


  • Cognitive Work Analysis to decompose analytic domains
  • Systematic inventory of the work environment
  • Grouping work inventory items in terms of the tasks they’re used for
  • Documenting the functional purpose of all those little tasks people do every day
  • Documenting why those functions are so important – what do they produce, and why does it matter?
  • How do we know the domain is meeting its intended purpose?
  • Cognitive Task Analysis to elicit individual strategies for key tasks within that environment
  • Focus on ‘keystone tasks’ – the ones that really define what the work environment is all about
  • Instrument these to get lots of subjective and objective data
  • CTA studies are great for informing design and can provide a nice baseline for evaluation activities later

SLIDE 23

Cognitive Work Analysis

A great framework for systematically decomposing a work environment.

SLIDE 24

Task Extraction for CTA

SLIDE 25

If it’s a solved problem, why are there so many problems?


It depends on the context of technology research, development, and transition.

SLIDE 26


So there’s Acquisitions…

SLIDE 27

But there’s also a lot of in-house research and development. The issues are different; I think there’s more opportunity for success, but it’s not easy. Let’s begin with a story.

SLIDE 28
The Bring-Me-A-Pattern Game

  • CUSTOMER: “Tell me about patterns that I can use in my analysis.”
  • DEVELOPER (excited): “We developed a novel approach for flagging and tracking irregular elements in data, and it worked!”
  • CUSTOMER: “Umm, those patterns aren’t useful.”

“We’re just glad someone likes our results. The customer wasn’t as thrilled as we would’ve liked.”

SLIDE 29

An example from our work: Forward-deployed imagery analysts have ~90 seconds to classify and report threat signatures


  • System renders radar returns as a set of complementary images
  • Analysts perform “visual inspection – plus”:
  • Present imagery
  • History in imagery (normal vs. abnormal activity patterns)
  • Decide whether to act or move on

SLIDE 30

What is really happening here?


  • Sensor data analytics brings together Visual Inspection and Information Foraging
  • Perceive and recognize
  • Iterative search, retrieve, review
  • Characterize, decide, act
  • Multi-source context is key in figuring out Where & When
  • Foundational semantics are spatial and temporal
  • Gestalt grouping of abstract imagery features into recognizable objects – across multiple images, within/across different image products
  • Data-to-Information Value Stream
  • From sensor to analyst
  • Sensor operator decisions can enable and constrain longer-term analysis processes and products
  • Understanding the context of collection is absolutely key in designing algorithms for pattern detection in large datasets

SLIDE 31

SLIDE 32

If we can get systems people will use, we can do a lot of really neat human-information interaction research.

SLIDE 33


SLIDE 34

Distinguishing Strategies in Visual Foraging: Tracktable applied to gaze data
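Tracktable is Sandia's open-source trajectory-analysis library; treating a fixation sequence as a short trajectory lets the same machinery distinguish scanning strategies. Below is a hedged NumPy sketch of the kind of trajectory descriptors such tools compute; it does not use Tracktable's actual API, and the feature set is illustrative.

```python
# Illustrative trajectory descriptors for a fixation sequence.
# Plain NumPy, not Tracktable's API; the features are examples only.
import numpy as np

def trajectory_descriptors(fix_x, fix_y):
    """Summarize a fixation sequence (>= 3 points) as a feature vector."""
    pts = np.column_stack([fix_x, fix_y])
    steps = np.diff(pts, axis=0)              # saccade vectors between fixations
    lengths = np.linalg.norm(steps, axis=1)   # saccade amplitudes
    headings = np.arctan2(steps[:, 1], steps[:, 0])
    turns = np.diff(headings)
    turns = (turns + np.pi) % (2 * np.pi) - np.pi   # wrap into [-pi, pi)
    path = lengths.sum()                      # total scanpath length
    net = np.linalg.norm(pts[-1] - pts[0])    # start-to-end displacement
    return {
        "path_length": path,
        "straightness": net / path if path > 0 else 0.0,  # 1.0 = beeline scan
        "mean_saccade": lengths.mean(),
        "mean_abs_turn": np.abs(turns).mean(),  # high = erratic re-scanning
    }
```

Descriptors like straightness and mean turning angle are one plausible way to separate, say, systematic raster scanning from erratic search.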

SLIDE 35

Integrating gaze (eye tracking) events for an even richer dataset

Stimulus Creation & Data Collection
  • Imagery/Ground truth
  • Eye tracking
  • Software events

Data Wrangling
  • Nuisance
  • Quality control
  • Preprocessing

Feature Extraction
  • Image content
  • Visual salience
  • Trajectory descriptors

Time Series Data

Regression Analysis
  • Training impacts
  • Experience impacts
  • Relative difficulty

N-Gram Modeling (see the sketch after this outline)
  • On viewed content
  • On scan patterns

Latent Dirichlet Allocation
  • Image content (ground truth) as topics
  • Generate scan pattern

Graphical Models
  • Cluster data
  • Structure model by observation sequence or by image properties

Feature Relevance
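As a concrete illustration of the “N-Gram Modeling: on scan patterns” step, suppose each fixation has already been mapped to a labeled area of interest (AOI); a scanpath then becomes a symbol sequence whose transition n-grams can be counted and compared across analysts or training conditions. The AOI labels and example sequence below are hypothetical.

```python
# Hedged sketch: n-gram counts over an AOI-labeled scanpath.
# AOI names and the example sequence are made up for illustration.
from collections import Counter

def ngram_counts(scanpath, n=2):
    """Count n-grams (AOI transitions) in a scanpath symbol sequence."""
    return Counter(
        tuple(scanpath[i:i + n]) for i in range(len(scanpath) - n + 1)
    )

# One analyst's hypothetical scanpath over a radar image product:
scanpath = ["target", "context", "target", "history", "target", "context"]
print(ngram_counts(scanpath, n=2).most_common(3))
# e.g., [(('target', 'context'), 2), (('context', 'target'), 1), ...]
```

The same AOI sequences can feed the Latent Dirichlet Allocation step, with ground-truth image content playing the role of topics.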

SLIDE 36

It’s a lot of work, but also quite fun.


SLIDE 37

Concluding Thoughts

  • If you are serious about creating tools that people can put to use, you’ll have to invest time and effort in documenting the context of use.
  • Don’t rely on “talking to users” or “watching users” as the basis for a design. There are systematic, well-documented ways of engaging a context, and you can learn to apply them in your research (or team with someone who already does this).
  • Gathering data on your user community doesn’t stop with a “finished” technology. Good design is iterative; it’s about forming a creative partnership of equals rather than throwing an awesome algorithm “over the fence.”
  • If we can create things people will actually use, and then gather data on how people are using them, we can establish a virtuous cycle of technology improvement.
  • We are ALWAYS happy to help anyone who wants to make their tools better.

Laura McNamara, Sandia National Laboratories, lamcnam@sandia.gov