SLIDE 1

Cognition in Visual Processing

707.031: Evaluation Methodology Winter 2015/16

Eduardo Veas

SLIDE 2

Research Projects @ KTI (email Eduardo)

  • Connected world
    – build a connected coffee machine
    – build sensing and intelligence into appliances
  • Augmented Data
    – how can we augment the real world with data?
    – investigate different display devices
    – investigate different visual techniques
  • Augmented Knowledge Spaces
    – use space to organize and interact with technology
    – use natural mobility to interact with augmentations

SLIDE 3

Project Topics


  • Glove Study: Granit, Arbenore, Santokh, Millot
  • AR Signs Study: Eduardo, Santokh, Rene, Millot
  • Collection Study: Cecilia,
  • VisRec Study: Belgin, Millot, Santokh
  • AR navigation study:
SLIDE 4

Model Human Processor

Source: Card et al., 1983

SLIDE 5

Perception vs. Cognition


Perception

  • Eye, optic nerve, visual cortex
  • Basic perception
  • First processing (edges, planes)
  • Not conscious
  • Reflexes

Cognition

  • Recognizing objects
  • Relations between objects
  • Drawing conclusions
  • Problem solving
  • Learning
SLIDE 6

Shannon’s Information Theorem for Vis
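The slide gives only the title. As one hedged reading (my illustration, not from the deck): the Shannon entropy of a data attribute bounds how many bits a visual encoding must convey per mark. A minimal sketch:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A nominal attribute with three categories: any channel used to show it
# must let the viewer recover at least H(X) bits per mark.
data = ["A", "A", "B", "B", "C", "C", "C", "C"]
print(entropy(data))  # 1.5
```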

SLIDE 7

Visual Hypotheses

  • Grouping
    – What perceptual factors help me group or categorize visual stimuli?
  • Separability vs. Integrality

SLIDE 8

Gestalt Laws

  • Understand pattern perception
  • Wertheimer, Koffka, Köhler: 1912 Gestalt School of Psychology
  • The reasons they gave were wrong, but the OBSERVATIONS were CORRECT

  • Proximity
  • Similarity
  • Connectedness
  • Continuity
  • Symmetry
  • Closure
  • Relative Size
  • Common Fate
SLIDE 9

Gestalt Laws

The whole is greater than the sum of the parts!

SLIDE 10

Proximity

SLIDE 11

Proximity Example

SLIDE 12

Similarity

SLIDE 13

Connectedness

SLIDE 14

Continuity

SLIDE 15

Symmetry

SLIDE 16

Closure

SLIDE 17

Closure Example

SLIDE 18

Relative Size

Same size

SLIDE 19

Figure & Ground

SLIDE 20

Common Fate

SLIDE 21

Institut für Maschinelles Sehen und Darstellen

Eduardo Veas Evaluation-Validation

Data Attributes (scales, levels)

  • Qualitative
    – Nominal / categorical: categories with no ordering (gender, nationality, blood type, etc.)
    – Ordinal: ordered categories (grades, ranks, etc.)
  • Quantitative
    – Interval: distances between values matter, but not ratios; arbitrary zero (temperature in Celsius or Fahrenheit, time of day, etc.)
    – Ratio: meaningful zero (weight, age, length, temperature in Kelvin, etc.)
    – Absolute: counts (number of students, number of lines of code, etc.)
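As an illustration (my summary, not from the slides): the level of measurement determines which summary statistics are meaningful, which is a quick way to check a scale assignment.

```python
# Illustrative mapping: which summary statistics are meaningful at each
# level of measurement.
MEANINGFUL_STATS = {
    "nominal":  {"mode"},
    "ordinal":  {"mode", "median"},
    "interval": {"mode", "median", "mean"},          # differences meaningful, ratios not
    "ratio":    {"mode", "median", "mean", "ratio"}, # meaningful zero
}

def allowed(scale: str, stat: str) -> bool:
    """Return True if `stat` is meaningful for data on `scale`."""
    return stat in MEANINGFUL_STATS[scale]

print(allowed("ordinal", "mean"))   # False: grades are ordered, but gaps are not equal
print(allowed("interval", "mean"))  # True: Celsius differences are meaningful
```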
SLIDE 22

VISUAL VARIABLES

Based on Bertin's Visual Variables (1967), S. Carpendale's discussion, and Munzner's categorization of marks and channels.

SLIDE 23

Representation

  • How data is understood depends on its representation
  • Different representations have different benefits or deficiencies
  • Example: the number thirty-four
    – Decimal: 34
    – Roman: XXXIV
    – Binary: 100010
  • The degree of accessibility varies with the abstraction required
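The three representations from the example can be generated mechanically; the `to_roman` helper is a hypothetical illustration (it only covers numbers below 40, which is enough for 34):

```python
# Render the same number in the three representations from the slide.
ROMAN = [(10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]  # enough for n < 40

def to_roman(n: int) -> str:
    """Greedy conversion to Roman numerals (illustrative, n < 40 only)."""
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

n = 34
print(n)               # decimal: 34
print(to_roman(n))     # Roman: XXXIV
print(format(n, "b"))  # binary: 100010
```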
SLIDE 24

Visual Encoding

Propose a combination of visual variables (marks and channels) to describe data dimensions

SLIDE 25

Visual Variables

SLIDE 26


Characteristics of Visual Variables

  • Selective
    – Is a mark distinct from the other marks?
  • Associative
    – Marks with associative visual variables can be perceived as a group.
  • Quantitative
    – The relationship between two marks can be read as numerical.
  • Order
    – A change in an ordered visual variable is perceived as more or less.
  • Length
    – The number of distinguishable steps that can be used (often limited by perception, i.e. just-noticeable differences, jnd).
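These characteristics can be tabulated per variable. The trait sets below are my reading of Bertin's characterization (an assumption, not verbatim from the deck), with a crude matching rule against the data scales from SLIDE 21:

```python
# Assumed trait sets (one common reading of Bertin); verify against Bertin 1967.
VARIABLE_TRAITS = {
    "position":  {"selective", "associative", "ordered", "quantitative"},
    "size":      {"selective", "ordered", "quantitative"},
    "value":     {"selective", "ordered"},
    "color_hue": {"selective", "associative"},
    "shape":     {"associative"},
}

def can_encode(variable: str, scale: str) -> bool:
    """Rule of thumb: quantitative data needs a quantitative variable,
    ordinal needs an ordered one, nominal needs selective or associative."""
    traits = VARIABLE_TRAITS[variable]
    if scale == "quantitative":
        return "quantitative" in traits
    if scale == "ordinal":
        return "ordered" in traits
    return bool(traits & {"selective", "associative"})

print(can_encode("position", "quantitative"))   # True
print(can_encode("color_hue", "quantitative"))  # False: hue has no natural order
```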

SLIDE 27

Position

SLIDE 28

Size

SLIDE 29

Shape

SLIDE 30

Select a shape?

SLIDE 31

Value

SLIDE 32

Select Value!

SLIDE 33

Color

SLIDE 34

Select Color!

SLIDE 35

Orientation

SLIDE 36

Effectiveness of visual variables (Mackinlay 1986)

[Schumann 2000, after Mackinlay 1986]

(in decreasing order of effectiveness)
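The ranking itself is usually given per data scale. The lists below are reproduced from memory of Mackinlay's commonly cited ordering (truncated and approximate; verify against the paper before relying on them):

```python
# Approximate, truncated effectiveness rankings (decreasing), per data scale.
RANKING = {
    "quantitative": ["position", "length", "angle", "slope", "area",
                     "volume", "density", "saturation", "hue"],
    "ordinal":      ["position", "density", "saturation", "hue",
                     "texture", "connection", "containment", "length"],
    "nominal":      ["position", "hue", "texture", "connection",
                     "containment", "density", "saturation", "shape"],
}

def better(scale, a, b):
    """True if variable `a` is ranked more effective than `b` for `scale`."""
    r = RANKING[scale]
    return r.index(a) < r.index(b)

print(better("quantitative", "position", "hue"))  # True
print(better("nominal", "hue", "shape"))          # True
```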

SLIDE 37

Perception


Depth perception


SLIDE 38

Today's Agenda

  • Visual Variables
    – Data scales
    – Data representation
    – Bertin's visual variables
  • Depth Perception
    – Oculomotor cues
    – Pictorial cues

SLIDE 39

Visual Perception Theories

  • Gregory (1970)
    – Top-down
    – Perception is constructive
    – Perceptual hypotheses
    – 90% of sensory information is lost along the visual pathway
  • Gibson (1966)
    – Bottom-up
    – Direct perception
    – Sensory information is analyzed in a pipeline
    – No need for previous knowledge

SLIDE 40

Bottom-up Perception: Components I

  • Light and the environment
  • Optic flow

SLIDE 41

Bottom-up Perception: Components II

  • Invariants and texture

SLIDE 42

Bottom-up Perception: Components III

  • Affordances
  • Optical array
  • Relative brightness
  • Texture gradient
  • Relative size
  • Superimposition
  • Height in the visual field

SLIDE 43

Depth Perception: Cues

  • Oculomotor
    – Convergence
    – Accommodation

SLIDE 44

Depth Perception: Cues

  • Binocular
    – Stereopsis / binocular disparity

SLIDE 45

Depth Perception: Cues

  • Motion-based (video)
    – Motion parallax
    – Kinetic depth effect

SLIDE 46

Structure from motion

  • Kinetic depth effect
  • An assumption of rigidity allows us to infer shape as objects move or rotate

SLIDE 47

PICTORIAL DEPTH CUES

SLIDE 48

Perspective cues

  • Parallel lines converge
  • Distant objects appear smaller
  • Textured elements become smaller with distance

SLIDE 49

Perspective and vanishing points

SLIDE 50

Occlusion

  • The strongest depth cue
  • With stereopsis, occlusion differs for each eye

SLIDE 51

Depth of focus

  • Strong depth cue
  • Must be coupled with user input (e.g. point of fixation)

SLIDE 52

Shadows

  • Important cue for the height of an object above a plane
  • An indirect depth cue
SLIDE 53

Atmospheric depth

  • Reduction in contrast of distant objects
  • Exaggerated in 3D displays using what is called proximity luminance covariance
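Not from the slides: the contrast fall-off described above is commonly modelled as exponential attenuation (aerial perspective). The coefficient `k` is an assumed, scene-dependent constant:

```python
import math

def apparent_contrast(c0, distance, k=0.05):
    """Contrast of an object after atmospheric attenuation over `distance`.

    c0: intrinsic contrast at zero distance; k: assumed extinction coefficient.
    """
    return c0 * math.exp(-k * distance)

print(apparent_contrast(1.0, 10))   # nearer object: higher apparent contrast
print(apparent_contrast(1.0, 100))  # farther object: much lower contrast
```

A 3D display that exaggerates this (larger `k` than physically plausible) strengthens the depth impression, which is the proximity luminance covariance trick the slide names.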

SLIDE 54

Depth Perception: Pictorial Cues

SLIDE 55

Depth Perception: Cues Summary

SLIDE 56

Illustration techniques

Emphasizing height, width or volume

SLIDE 57

Visualization Tasks

  • Judging 3D surfaces
  • Relative Position in 3D space
  • Finding 3D patterns of points
  • Judging relative movement of particles
  • Judging self movement
  • Feeling a “sense of presence”
SLIDE 58

Depth Perception: Volume rendering

SLIDE 59

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Livingston, Ai, Swan, Smallman

SLIDE 60

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Goals

  • test depth perception indoors and outdoors
  • test AR-based linear perspective

SLIDE 61

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Task

  • eight real referents identified by color
  • a colored target appeared at a random depth
  • the participant moved the target until it matched the corresponding referent

SLIDE 62

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Independent Variables

  • Environment {indoor, outdoor}
  • Tramlines {on, off}
  • Grid points {on, off}
  • Distance {3.83, 9.66…}
  • Repetition {1, 2, 3, 4, 5}

Dependent Variables

  • Distance off (placement error)
  • Time to complete
  • NASA TLX
  • Subjective questions
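The independent variables above imply a full-factorial design. A sketch of the resulting trial count (the distance list is truncated in the slide, so only the two printed levels are used):

```python
import itertools

# Factor levels from the slide; the distance list is truncated in the
# source ("3.83, 9.66, ..."), so only its two printed levels appear here.
environment = ["indoor", "outdoor"]
tramlines   = ["on", "off"]
gridpoints  = ["on", "off"]
distance    = [3.83, 9.66]
repetition  = [1, 2, 3, 4, 5]

conditions = list(itertools.product(environment, tramlines,
                                    gridpoints, distance, repetition))
print(len(conditions))  # 2 * 2 * 2 * 2 * 5 = 80 trials per participant
```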
SLIDE 63

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Hypotheses

  • the outdoor environment results in greater error
  • tramlines and grid points increase precision
  • error increases and precision decreases with distance

SLIDE 64

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Results: Effect of environment on normalized error.

SLIDE 65

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

Results: Effect of repetition on normalized error.

SLIDE 66

Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality

  • No effect of Grid or Tramlines
  • No difference in NASA TLX measures

SLIDE 67

SITUATION AWARENESS AND MENTAL LOAD

SLIDE 68

Situational awareness

  • LEVEL 1: Perception of the elements in the environment
  • LEVEL 2: Comprehension of the current situation
  • LEVEL 3: Projection of future status

SLIDE 69

Situation awareness

  • Often reduced to spatial awareness
    – Where are objects in the environment?
    – Where are they moving?
    – Where will they be if everything continues as is?
  • But also concerned with decision making

SLIDE 70

  • Strict form of Cognitive Walkthrough
  • Tasks → goals are analysed, and the information needed for each decision is extracted and structured

SLIDE 71

Situation(al) awareness

SLIDE 72

Human performance

  • Situation Awareness
    – Perception of the current status of the environment and projection of future status (mostly real-time interactive systems)
  • Workload (cognitive load)
    – an energetic construct representing the supply and demand of attentional and processing resources
    – allocated to maintain situation awareness

SLIDE 73

Mental load

  • Also called cognitive load or workload
  • Assumes a limited working memory connected to an unlimited long-term memory; working memory retrieves, processes and integrates data

SLIDE 74

Additional Efficiency Measures: Mental load

…to evaluate the effort associated with the use of an interface (used mostly in HCI).

SLIDE 75

Additional Efficiency Measures: Mental load

  • Measurements
    – Primary task performance
    – Secondary task performance
    – Subjective measurements: NASA (R)TLX, USAF SWAT
    – Physiological measurements: heart rate, pupil dilation, eye movement, brain activity
  • [Vidulich and Tsang, Mental Workload and Situation Awareness. In Handbook of Human Factors and Ergonomics]

SLIDE 76

NASA TLX

  • Self-assessment of performance
  • Must be administered immediately after finishing the task
  • 6 dimensions: mental demand, physical demand, temporal demand, performance, effort, frustration
  • TLX: workload is the weighted sum of scores
  • R-TLX: workload is the average of scores
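A minimal sketch of the two scoring rules, assuming 0-100 rating scales and the standard TLX weighting in which 15 pairwise comparisons yield weights summing to 15 (the example numbers are made up):

```python
DIMS = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx(ratings, weights):
    """Weighted TLX: sum(rating * weight) / 15 (weights sum to 15)."""
    assert sum(weights.values()) == 15
    return sum(ratings[d] * weights[d] for d in DIMS) / 15

def raw_tlx(ratings):
    """R-TLX: unweighted mean of the six ratings."""
    return sum(ratings[d] for d in DIMS) / len(DIMS)

# Hypothetical participant: weights from the pairwise-comparison step,
# ratings from the post-task questionnaire.
ratings = {"mental": 70, "physical": 10, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 30}
weights = {"mental": 5, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 4, "frustration": 1}
print(raw_tlx(ratings))       # unweighted average
print(tlx(ratings, weights))  # weighted score, emphasizing mental demand
```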

SLIDE 77

BCI

SLIDE 78

Feature Integration Theory III: Mental load

SLIDE 79

BCI

  • takes a biosignal measured from a person
  • predicts some aspect of the person's cognitive state
  • in real time / on a single-trial basis

SLIDE 80

BCI Types

  • Active: derives its outputs from brain activity that the user directly controls, independent of external events, in order to control something
  • Reactive: derives its outputs from brain activity arising in reaction to external stimuli, for controlling something
  • Passive: derives its outputs from arbitrary brain activity, without the purpose of voluntary control

SLIDE 81

BCI: Applications

  • Severe disability: tetraplegia, locked-in syndrome
  • speller programs
  • domotic (home automation) interfaces

SLIDE 82

BCI: Applications II

  • Operator monitoring: intent detection
  • brake intent
  • lane-change intent

SLIDE 83

BCI Applications III: Workload

  • Monitoring workload, alertness and fatigue in highly demanding tasks (air traffic controllers, pilots)
  • Passive BCI

SLIDE 84

BCI in Evaluations [Phase I]

  1. Learn about BCI, find relevant previous BCI studies
  2. Research design: establish the research question and analysis metrics
  3. Pilot: [go to next page] and return to 1 if needed
SLIDE 85

BCI in Evaluations [Phase II]

  • Prepare your data
  • Test hardware
  • Prepare room and setup
  • Pilot [return to draft?]
  • Execute
    – Calibration
    – Training
    – Calibration
    – Testing
  • Analyze
SLIDE 86

Additional Efficiency Measures: Mental load

A user study of visualization effectiveness using EEG and cognitive load

[Figure: trial protocol and analysis pipeline. Resting period; 98 repetitions of (blank, 2 s stimulus) between start/end trial markers; EEG recorded at 128 Hz; baseline and trial segments pass through an S-transform to extract alpha (7.5-12.5 Hz) and theta (4-7.5 Hz) bands.]
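The study extracts alpha (7.5-12.5 Hz) and theta (4-7.5 Hz) power with an S-transform; as a simpler stand-in, this sketch uses plain FFT band power on a synthetic 128 Hz trace (rising theta with falling alpha is a commonly used workload index, and the signal here is made up, not study data):

```python
import numpy as np

FS = 128  # sampling rate from the slide

def band_power(signal, lo, hi, fs=FS):
    """Mean squared spectral magnitude in [lo, hi) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

# Synthetic 2-second trial: a 10 Hz (alpha) tone plus a weaker 6 Hz (theta) tone.
t = np.arange(2 * FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 6 * t)

alpha = band_power(eeg, 7.5, 12.5)
theta = band_power(eeg, 4.0, 7.5)
print(theta / alpha)  # < 1 here: alpha dominates this synthetic trial
```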

SLIDE 87

Measuring Mental Load

  • EEG-based
    – Difficult to design a study around
    – Difficult to isolate the causes of cognitive load
    – Can it objectively evaluate the interpretation of visualizations?
  • Subjective approach
    – Includes conscious reasoning in the loop
    – Depends on subjective opinion

SLIDE 88

Readings

  • Tobii eye tracking whitepaper
  • NASA TLX
  • Brain–Computer Interfaces: A Gentle Introduction
  • [Endsley] Direct measurement of situation awareness: validity and use of SAGAT
  • [Borghini et al.] Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness
  • [Veas et al.] Directing attention and influencing memory with saliency modulation

SLIDE 89

Recommended reads

  • Brain–Computer Interfaces: A Gentle Introduction
  • [Endsley] Direct measurement of situation awareness: validity and use of SAGAT
  • [Borghini et al.] Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness
  • NASA TLX
  • [Livingston et al.] Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality
  • [Oliver 2001] Depth perception in media design: from sensory psychology cues to interactive tools
  • [Livingston, Dey, Sandor, Thomas] Pursuit of x-ray vision for augmented reality
  • [Qian 1997] Binocular disparity and the perception of depth
  • [Zheng et al.] Perceptually based depth-ordering enhancement for direct volume rendering

SLIDE 90

Research Projects @ KTI

  • Connected world
    – build a connected coffee machine
    – build sensing and intelligence into appliances
  • Augmented Data
    – how can we augment the real world with data?
    – investigate different display devices
    – investigate different visual techniques
  • Augmented Knowledge Spaces
    – use space to organize and interact with technology
    – use natural mobility to interact with augmentations