H517 Visualization Design, Analysis, & Evaluation. Week 12: Visualization Tasks & Evaluation



SLIDE 1

H517 Visualization Design, Analysis, & Evaluation

Khairi Reda | redak@iu.edu School of Informatics & Computing, IUPUI

Week 12: Visualization Tasks & Evaluation

SLIDE 2

[Pipeline diagram] Raw data → (pre-processing) → Processed data → (visual encoding) → Plots / Visualization → (interaction) → user. The vis designer builds the pipeline; the user explores through interaction.

SLIDE 3

Visualization design process

[Cycle diagram] Study → Design → Build → Evaluate

SLIDE 4

Thesis 1: The purpose of visualization is insight, not pretty pictures.
Thesis 2: Visualization enables people to uncover insights by helping them execute a series of tasks on a dataset.

Why study users?

  • To answer questions about data
SLIDE 5

Visualization tasks

  • Understand and forecast trade patterns
  • Explore the relationship between the expression of certain genes and cancer prognosis
  • Analyze the distribution of sales
  • Compare the incidence of asthma in different US states
  • Identify the state with the highest number of asthma-related hospitalizations

[Spectrum] fuzzy (sense making; next week) → crisp (mid / low-level tasks)

SLIDE 6

Sedig et al.

[Spectrum diagram, fuzzy → crisp]
  • Complex cognitive activities: sense making, reasoning, forecasting, problem solving
  • Mid-level tasks: browsing, categorizing, filtering, identifying, characterizing (distributions), brushing and linking, comparing, zooming
  • Interactions / events: clicking, hovering, details-on-demand, swiping, dragging

Higher levels emerge from the lower ones (emergence).

SLIDE 7

Visualization tasks

[Timeline diagram] In a session with an interactive visualization, low-level events (click, click, drag, hover) update the vis state over time; mid-level tasks (see details-on-demand, navigate, zoom out, select, filter, browsing, looking for outliers, identifying clusters, characterizing trends) build on those events and feed into sense making.

SLIDE 8

  • The wording of the task is often influenced by the user’s domain and background
  • Despite apparent differences, many tasks can be similar
  • “Contrast the prognosis of patients who were admitted to hospital to patients receiving home care/rest” [epidemiologists studying flu]
  • “See if the results for tissue samples treated with LL-37 match up with the ones without peptide” [biologists studying immune system response]
  • Both tasks are essentially about “comparing averages between two groups”
  • Task abstraction allows us to identify common visualization designs despite apparent domain differences

Domain task abstraction

SLIDE 9

{action, target} pairs

  • compare, trends
  • locate, nodes in network
  • browse, distribution

Based on a slide by Miriah Meyer

Domain task abstraction
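One way to make {action, target} pairs concrete is a tiny data structure. This is a hypothetical sketch (the `AbstractTask` class and the example instances are illustrative, not from any vis library): two domain tasks worded very differently abstract to the same pair, which is exactly why abstraction suggests shared visualization designs.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AbstractTask:
    """A domain-independent visualization task."""
    action: str   # e.g., "compare", "locate", "browse"
    target: str   # e.g., "trends", "nodes in network", "distribution"

# The flu and LL-37 tasks from the previous slide, after abstraction:
flu_task = AbstractTask("compare", "averages between two groups")
bio_task = AbstractTask("compare", "averages between two groups")

# Different domains, same abstract task -> likely similar designs.
assert flu_task == bio_task
```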

SLIDE 10

{action, target} pairs

Domain task abstraction

SLIDE 11

Example of domain task abstraction

SLIDE 12
SLIDE 13

Inferring Grevy’s social interactions

Mayank Lahiri

SLIDE 14

Domain Tasks

  • Find communities in zebra society, and identify influential individuals who play a role in shaping the social structure
  • Understand how the social structure of Grevy’s zebra evolves over time
  • Understand how Grevy’s zebra society responds to environmental variables

SLIDE 15

Domain Tasks

  • Find communities in zebra society, and identify influential individuals who play a role in shaping the social structure
  • Understand how the social structure of Grevy’s zebra evolves over time
  • Understand how Grevy’s zebra society responds to environmental variables

Action: Find = Explore (unknown target, unknown location). Target: communities (loosely defined: groups of zebras that hang out together)

Action: Identify = Locate (known target, unknown location). Target: influential individuals (typically male stallions, or lactating females)

SLIDE 16

Domain Tasks

  • Find communities in zebra society, and identify influential individuals who play a role in shaping the social structure
  • Understand how the social structure of Grevy’s zebra evolves over time
  • Understand how Grevy’s zebra society responds to environmental variables

Action: Understand = Compare (mostly). Target: all communities over time

SLIDE 17

[Diagram] Sightings of individuals A, B, C, Q, R, X, Y grouped at three time steps (T1, T2, T3)

Find Communities

SLIDE 18

[Diagram] Sightings of individuals A, B, C, Q, R, X, Y grouped at three time steps (T1, T2, T3)

Find Communities

  • Individuals are reluctant to switch community - switching cost
  • Individuals are mostly seen with their community - visiting cost
  • Individuals are rarely absent from their community - absence cost
SLIDE 19

Find Communities

[Diagram] Individuals A, B, C, Q, R, X, Y (rows) assigned to communities over time; a line crossing between communities marks a group switch

SLIDE 20

Time-changing groups

[Diagram] Individuals grouped into communities that change over time

SLIDE 21

Domain Tasks

  • Find communities in zebra society, and identify influential individuals who play a role in shaping the social structure
  • Understand how the social structure of Grevy’s zebra evolves over time
  • Understand how Grevy’s zebra society responds to environmental variables

Action: Link. Target: communities and their geo location

SLIDE 22

Social structure (communities) + group movement over space and time

Social structure + geography

SLIDE 23

Social structure + geography

SLIDE 24
SLIDE 25

Evaluation

SLIDE 26

Visualization design process

[Cycle diagram] Study → Design → Build → Evaluate

SLIDE 27

Why evaluate?

  • Evaluation / validation is “about whether you have built the right product”
  • Does it provide new insights about the data?
  • Is the visualization memorable and/or engaging?
  • Does it enable users to perform their intended analysis tasks?
  • Does the visualization enable accurate perception of values, distributions, and/or trends in the data?
  • Is it “easy” to use? Are there any usability issues in the interface?

[Annotation] The questions range from cognitive effect down to graphical perception / UI mechanics

SLIDE 28

Visualization design process

[Cycle diagram] Study → Design → Build → Evaluate

SLIDE 29

Four nested levels of vis design

Munzner, 2014

SLIDE 30

Four nested levels of vis design

Munzner, 2014

Study domain, interview users, identify needs

SLIDE 31

Four nested levels of vis design

Munzner, 2014

Identify tasks and data. Translate from domain-dependent to abstract tasks and data types

SLIDE 32

Four nested levels of vis design

Munzner, 2014

Sketch/design visual encoding and interaction techniques

SLIDE 33

Four nested levels of vis design

Munzner, 2014

Implement visualization using code

SLIDE 34

Threats to validity

Munzner, 2014

SLIDE 35

Threats to validity

Munzner, 2014

You misunderstood their needs

SLIDE 36

Threats to validity

Munzner, 2014

You’re showing them the wrong thing

SLIDE 37

Threats to validity

Munzner, 2014

The way you show it doesn’t work

SLIDE 38

Threats to validity

Munzner, 2014

Your code is too slow

SLIDE 39

Guard against threats

Munzner, 2014

You misunderstood their needs

You’re showing them the wrong thing

The way you show it doesn’t work

Your code is too slow

SLIDE 40

Evaluating visualizations is still tricky!

SLIDE 41

Extra

SLIDE 42

Evaluation methods

  • Evaluate algorithm speed / memory usage
  • Controlled [lab] studies with any user
  • Qualitative studies
  • Insight-based evaluation
  • Evaluating the data analysis process
  • Field deployment
SLIDE 43

Controlled [lab] studies

  • Goal is typically to evaluate graphical perception
  • Allows for comparison between different techniques or representations
  • Generally provides accurate results, but they may not generalize beyond lab conditions or tested tasks
  • Typically quantitative in nature
  • Finely-scoped tasks
  • Measure user accuracy, performance time, and/or subjective preference
  • Focus is typically on the analysis outcome
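Analyzing such a study usually comes down to per-technique summaries of the measures above. A small sketch with fabricated example trials (the trial data and the `summarize` helper are purely illustrative, not from any real experiment):

```python
from statistics import mean

# Fabricated trial records: each row is one participant response.
trials = [
    {"technique": "bar", "correct": True,  "time_s": 4.2},
    {"technique": "bar", "correct": True,  "time_s": 5.1},
    {"technique": "pie", "correct": False, "time_s": 6.3},
    {"technique": "pie", "correct": True,  "time_s": 7.0},
]

def summarize(trials, technique):
    """Return (accuracy, mean completion time) for one technique."""
    subset = [t for t in trials if t["technique"] == technique]
    accuracy = mean(1.0 if t["correct"] else 0.0 for t in subset)
    avg_time = mean(t["time_s"] for t in subset)
    return accuracy, avg_time
```

In a real study these point estimates would be accompanied by confidence intervals or significance tests before comparing techniques.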
SLIDE 44

Qualitative studies

  • Usually open-ended usage scenarios
  • Smaller number of participants compared to quantitative lab studies
  • More in-depth analysis of how participants use, interact with, and reason about the visualization
  • Focus is on the analysis process
  • Analyze videos, audio, or comments from users
  • Can ask participants to fill out surveys, or provide subjective feedback on the visualization
  • Usually involves domain experts and the target audience of the visualization
SLIDE 45

Insight-based evaluation

  • The goal of visualization is generally to generate new insight
  • Evaluation should therefore include insight generation
  • Think-aloud protocols: have the users say what they are thinking
  • Transcribe and code:
    • Observation
    • Hypothesis
    • Question
    • Exploratory goal
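Once a transcript is coded into the four categories above, the analysis is often just tallying and comparing counts. An illustrative sketch (the category names follow the slide; the segments and the `count_codes` helper are hypothetical):

```python
from collections import Counter

# The four insight codes listed on the slide.
CATEGORIES = {"observation", "hypothesis", "question", "exploratory_goal"}

def count_codes(coded_segments):
    """coded_segments: (utterance, category) pairs assigned by a human
    coder; returns a Counter of how often each category occurred."""
    counts = Counter()
    for utterance, category in coded_segments:
        if category not in CATEGORIES:
            raise ValueError(f"unknown code: {category}")
        counts[category] += 1
    return counts

# Hypothetical coded think-aloud fragments:
segments = [
    ("Sales spike every December.", "observation"),
    ("Maybe holidays drive the spike?", "hypothesis"),
    ("What happens in regions without holidays?", "question"),
]
```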
SLIDE 46

Evaluating the data analysis / exploration process

  • The focus is on the visual analysis process of users, as opposed to the outcome of the analysis
  • Want to understand the analytic dialogue between the user and the data
  • Want to capture and analyze multiple aspects:
    • Interactions with the visualization (interaction logs, videos)
    • Reasoning process (think-aloud protocol: have the participant say what they are thinking)
    • Eye gaze behavior
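The interaction-log aspect can be sketched as a timestamped event record, the raw material for process-focused analysis. All names here are assumed for illustration; real systems typically log richer state alongside each event.

```python
import time

class InteractionLog:
    """Minimal timestamped log of user interactions with a visualization."""

    def __init__(self):
        self.events = []  # (timestamp, event_type, detail)

    def record(self, event_type, detail=None):
        self.events.append((time.time(), event_type, detail))

    def event_counts(self):
        """Tally events by type, a first step in analyzing the process."""
        counts = {}
        for _, event_type, _ in self.events:
            counts[event_type] = counts.get(event_type, 0) + 1
        return counts

# Hypothetical session fragment:
log = InteractionLog()
log.record("hover", {"mark": "bar-3"})
log.record("filter", {"field": "year", "value": 2014})
log.record("hover", {"mark": "bar-7"})
```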