CSE 440: Introduction to HCI. User Interface Design, Prototyping, and Evaluation.



SLIDE 1

CSE 440: Introduction to HCI

User Interface Design, Prototyping, and Evaluation

James Fogarty, Kailey Chan, Dhruv Jain, Nigini Oliveira, Chris Seeds, Jihoon Suh

Lecture 11: Inspection
Tuesday / Thursday 12:00 to 1:20

SLIDE 2

Project Status

Looking Forward

Team Peer Feedback was Due Saturday 11/4
3b: Heuristic Evaluation Due Wednesday 11/8
3c: Usability Testing Check-In Due Friday 11/10
3d: Usability Testing Review Due Monday 11/13
3e: Digital Mockup Due Thursday 11/16

Other Assignments

Reading 4 Due Saturday 11/11, Sooner is Better
Reading 5 Can Be Done Anytime, Sooner is Better

SLIDE 3

Objectives

Be able to:

Describe why we use inspection-based methods

Given Nielsen's heuristics:
explain what each of them means
apply them to identify usability failures in an interface

Describe an effective heuristic evaluation process

Explain why the typical recommendation for heuristic evaluation is 3 to 5 independent evaluators

SLIDE 4

Inspection-Based Methods

We have cut prototyping to its minimum

Sketches, storyboards, paper prototypes
Rapid exploration of potential ideas

But we need evaluation to guide improvement

Can become relatively slow and expensive
Study participants can be scarce
Can waste participants on obvious problems

SLIDE 5

Inspection-Based Methods

Simulate study participants

Instead of actual participants, use inspection to quickly and cheaply identify likely problems

Inspection methods are rational, not empirical
Today we cover two complementary methods

Heuristic Evaluation Cognitive Walkthrough

SLIDE 6

Heuristic Evaluation

Developed by Jakob Nielsen

Helps find usability problems in a design Not a method for “coming up with” a design

Small set of evaluators examine interface

Three to five evaluators
Independently check compliance with principles
Different evaluators will find different problems
Evaluators only communicate afterwards

Can perform on working interfaces or sketches

SLIDE 7

Nielsen’s 10 Heuristics

Too few unhelpful, too many overwhelming

“Be Good” versus thousands of detailed rules

Nielsen seeks to create a small set

Collects 249 usability problems
Collects 101 usability heuristics
Rates how well heuristics explain problems
Factor analysis to identify key heuristics

Nielsen, 1994

SLIDE 8

Nielsen’s 10 Heuristics

1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

Nielsen, 1994

SLIDE 9
  • 1. Visibility

Visibility of system status

The system should always keep people informed about what is going on, through appropriate feedback within reasonable time.

SLIDE 10
  • 1. Visibility

Visibility of system status

The system should always keep people informed about what is going on, through appropriate feedback within reasonable time.

Refers to both visibility of system status and providing appropriate feedback

Anytime a person is wondering what state the system is in, or the result of some action, this is a visibility violation.

SLIDE 11
  • 2. Real World Match

Match between system and the real world

The system should speak a person’s language, with words, phrases and concepts familiar to the person, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

SLIDE 12
  • 2. Real World Match

Match between system and the real world

The system should speak a person’s language, with words, phrases and concepts familiar to the person, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

Refers to word and language choice, mental model, metaphor, mapping, and sequencing

SLIDE 13
  • 3. Control and Freedom

User control and freedom

People often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

SLIDE 14
  • 3. Control and Freedom

User control and freedom

People often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Not just for navigation exits, but for getting out of any situation or state.
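Undo and redo, as called for here, are commonly built on two stacks; a minimal Python sketch (the `UndoManager` name and the do/undo callback pairs are illustrative, not from the lecture):

```python
class UndoManager:
    """Two-stack undo/redo: performing a new action clears the redo stack."""

    def __init__(self):
        self._undo = []  # (do_fn, undo_fn) pairs already applied
        self._redo = []

    def perform(self, do_fn, undo_fn):
        do_fn()
        self._undo.append((do_fn, undo_fn))
        self._redo.clear()  # a fresh action invalidates the redo history

    def undo(self):
        if self._undo:  # nothing to undo is not an error
            do_fn, undo_fn = self._undo.pop()
            undo_fn()
            self._redo.append((do_fn, undo_fn))

    def redo(self):
        if self._redo:
            do_fn, undo_fn = self._redo.pop()
            do_fn()
            self._undo.append((do_fn, undo_fn))
```

Clearing the redo stack on each new action matches what people expect from most editors: once you act after undoing, the old future is gone.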

SLIDE 15
  • 4. Consistency

Consistency and standards

People should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

SLIDE 16
  • 4. Consistency

Consistency and standards

People should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

Internal consistency is consistency throughout the same product. External consistency is consistency with other products in its class.

SLIDE 17
  • 5. Error Prevention

Error prevention

Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present people with a confirmation option before they commit to the action.
SLIDE 18
  • 5. Error Prevention

Error prevention

Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present people with a confirmation option before they commit to the action.

Try to commit errors and see how they are handled. Could they have been prevented?
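As an illustration of this advice in code, a small Python sketch; `delete_files` and its `confirm` callback are hypothetical names, not from the lecture:

```python
def delete_files(paths, confirm):
    """Error prevention sketch: eliminate the error-prone condition up front,
    then present a confirmation option before the person commits."""
    if not paths:
        # nothing selected: rather than raising an error later,
        # the action simply does nothing (condition eliminated)
        return []
    # confirmation option before the person commits to the action
    if not confirm(f"Delete {len(paths)} file(s)?"):
        return []  # person backed out; no error to recover from
    return list(paths)  # stand-in for actually deleting the files
```

In a real UI the `confirm` callback would be a dialog; here it is any function from a message to True/False, which also makes the logic easy to test.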
SLIDE 19
  • 6. Recognition not Recall

Recognition rather than recall

Minimize a person’s memory load by making objects, actions, and options visible. A person should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

SLIDE 20
  • 6. Recognition not Recall

Recognition rather than recall

Minimize a person’s memory load by making objects, actions, and options visible. A person should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

People should never have to carry an unnecessary memory load

SLIDE 21
  • 6. Recognition not Recall

Addresses visibility of features and information

where to find things

Visibility addresses system status and feedback

what is going on

Problems with affordances may go here

hidden affordance: remember where to act
false affordance: remember it is a fake

SLIDE 22
  • 7. Flexibility and Efficiency

Flexibility and efficiency of use

Accelerators, while unseen by novices, may often speed up the interaction for experts such that the system can cater to both inexperienced and experienced use. Allow people to tailor frequent actions.

SLIDE 23
  • 7. Flexibility and Efficiency

Flexibility and efficiency of use

Accelerators, while unseen by novices, may often speed up the interaction for experts such that the system can cater to both inexperienced and experienced use. Allow people to tailor frequent actions.

Concerns anywhere users have repetitive actions that must be done manually. Also concerns allowing multiple ways to do things.

SLIDE 24
  • 8. Aesthetic Design

Aesthetic and minimalist design

Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

SLIDE 25
  • 8. Aesthetic Design

Aesthetic and minimalist design

Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Not just about “ugliness”. About clutter, overload of visual field, visual noise, distracting animations.

SLIDE 26
  • 9. Error Recovery

Help users recognize, diagnose, and recover from errors

Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

SLIDE 27
  • 9. Error Recovery

Help users recognize, diagnose, and recover from errors

Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

Error prevention is about preventing errors before they occur. This is about after they occur.

SLIDE 28
  • 10. Help

Help and documentation

Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on a person’s task, list concrete steps to be carried out, and not be too large.

SLIDE 29
  • 10. Help

Help and documentation

Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on a person’s task, list concrete steps to be carried out, and not be too large.

This does not mean that a person must be able to ask for help on every single item.

SLIDE 30

Heuristic Evaluation Process

Evaluators go through interface several times

Inspect various dialogue elements Compare with list of usability principles

Usability principles

Nielsen’s heuristics
Supplementary list of category-specific heuristics (competitive analysis or testing existing products)

Use violations to redesign/fix problems

SLIDE 31

Examples

Can’t copy info from one window to another

violates “Minimize memory load” (H6)
fix: allow copying

Typography uses different fonts in 3 dialog boxes

violates “Consistency and standards” (H4)

slows users down
probably wouldn’t be found by usability testing

fix: pick a single format for entire interface

SLIDE 32

Heuristics

SLIDE 33

Heuristics

SLIDE 34

Heuristics

Searching database for matches

SLIDE 35

Heuristics

Visibility of system status

pay attention to response time

0.1 sec: no special indicators needed (why?)
1.0 sec: person tends to lose track of data
10 sec: maximum duration for a person to stay focused
longer delays require progress bars

Searching database for matches
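One way to act on these thresholds in code is to choose status feedback by an operation's expected duration. The constants mirror the slide's numbers; the function name and return strings are illustrative:

```python
# Response-time thresholds from the slide (seconds)
INSTANT = 0.1   # feels instantaneous: no special indicator needed
FLOW = 1.0      # person starts to notice the delay
FOCUS = 10.0    # longest a person stays focused without progress feedback

def feedback_for(duration_sec):
    """Pick visibility-of-status feedback appropriate to expected duration."""
    if duration_sec <= INSTANT:
        return "none"
    if duration_sec <= FLOW:
        return "subtle cue (e.g., busy cursor)"
    if duration_sec <= FOCUS:
        return "spinner or busy indicator"
    return "progress bar with percent done"
```

A database search like the one pictured would fall in the spinner or progress-bar range depending on the expected query time.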

SLIDE 36

Heuristics

SLIDE 37

Heuristics

“Mailto”, “protocol”? Match system to real world

Speak the person’s language

SLIDE 38

Heuristics

SLIDE 39

Heuristics

Flexibility and Efficiency of Use

accelerators for experts (e.g., keyboard shortcuts)
allow tailoring of frequent actions (e.g., macros)

SLIDE 40

Heuristics

SLIDE 41

Heuristics

Help recognize, diagnose, & recover from errors

error messages in plain language
precisely indicate the problem
constructively suggest a solution

SLIDE 42

Heuristics

SLIDE 43

Heuristics

User Control and Freedom
Prevent Errors

SLIDE 44

Heuristics

SLIDE 45

Heuristics

Prevent Errors

SLIDE 46

Heuristics

SLIDE 47

Heuristics

User control & freedom

provide “exits” for mistaken choices, undo, redo
don’t force down fixed paths

Wizards

must respond to question before going to next
good for beginners, infrequent tasks
not for common tasks
consider having 2 versions (WinZip)

SLIDE 48

Heuristics

SLIDE 49

Heuristics

Consistency & Standards

SLIDE 50

Heuristics

SLIDE 51

Heuristics

SLIDE 52

Heuristics

SLIDE 53

How to Perform Heuristic Evaluation

At least two passes for each evaluator

first to get feel for flow and scope of system
second to focus on specific elements

If system is walk-up-and-use or evaluators are domain experts, no assistance needed. Otherwise, might supply evaluators with scenarios.

Each evaluator produces list of problems

explain why with reference to heuristic
be specific and list each problem separately

SLIDE 54

Example Heuristic Violation

  • 1. [H4 Consistency]

The interface used the string "Save" on the first screen for saving the person’s file, but used the string "Write file" on the second screen. People may be confused by this different terminology for the same function.

SLIDE 55

How to Perform Heuristic Evaluation

Why separate listings for each violation?

risk of a ‘fix’ repeating some problematic aspect
may not be possible to fix all problems

Where problems may be found

single location in interface
two or more locations that need to be compared
problem with overall structure of interface
something that is missing

common problem with paper prototypes, but sometimes features are implied and just not yet “implemented”

SLIDE 56

Phases of Heuristic Evaluation

1) Pre-evaluation training

give expert evaluators needed domain knowledge & information on the scenario

2) Evaluation

individuals evaluate interface and make lists of problems

3) Severity rating

determine how severe each problem is

4) Aggregation

group meets and aggregates problems (w/ ratings)

5) Debriefing

discuss the outcome with design team

SLIDE 57

Severity Rating

Used to allocate resources to fix problems
Estimates of need for more usability efforts
Combination of:

frequency
impact
persistence (one time or repeating)

Should be calculated after all evaluations are in
Should be done independently by all judges
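A minimal sketch of the aggregation step, assuming each judge has independently rated each problem (the function and dictionary shape are illustrative, not from the lecture):

```python
from statistics import mean

def aggregate_severity(ratings_by_problem):
    """ratings_by_problem: {problem_id: [one rating per judge, ...]}.
    Returns the mean severity per problem; run only after all
    independent ratings are in, as the slide recommends."""
    return {pid: mean(rs) for pid, rs in ratings_by_problem.items()}
```

Other combinators (median, or flagging problems with high disagreement) are equally reasonable; the point is that judges rate first and numbers are combined afterwards.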

SLIDE 58

Severity Rating

0 - Do not agree this is a problem.
1 - Usability blemish. Mild annoyance or cosmetic problem. Easily avoidable.
2 - Minor usability problem. Annoying, misleading, unclear, confusing. Can be avoided or easily learned. May occur only once.
3 - Major usability problem. Prevents people from completing tasks. Highly confusing or unclear. Difficult to avoid. Likely to occur more than once.
4 - Critical usability problem. People will not be able to accomplish their goals. People may quit using system altogether.

SLIDE 59

Example Heuristic Violation

  • 1. [H4 Consistency] [Severity 3]

The interface used the string "Save" on the first screen for saving the person’s file, but used the string "Write file" on the second screen. People may be confused by this different terminology for the same function.

SLIDE 60

Why Multiple Evaluators?

No single evaluator finds every problem
Good evaluators find both easy and hard ones

SLIDE 61

Debriefing

Conduct with evaluators, observers, and development team members
Discuss general characteristics of interface
Suggest potential improvements to address major usability problems
Development team rates how hard to fix
Make it a brainstorming session

SLIDE 62

Fixability Scores

1 - Nearly impossible to fix. Requires massive re-engineering or use of new technology. Solution not known or understood at all.
2 - Difficult to fix. Redesign and re-engineering required. Significant code changes. Solution identifiable but details not fully understood.
3 - Easy to fix. Minimal redesign and straightforward code changes. Solution known and understood.
4 - Trivial to fix. Textual changes and cosmetic changes. Minor code tweaking.
SLIDE 63

Example Heuristic Violation

  • 1. [H4 Consistency] [Severity 3] [Fix 4]

The interface used the string "Save" on the first screen for saving the person’s file, but used the string "Write file" on the second screen. People may be confused by this different terminology for the same function. Fix: Change second screen to "Save".
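Violation reports like the one above can be kept as structured records so they sort naturally for the debriefing; a sketch, with field names assumed rather than prescribed by the lecture:

```python
from dataclasses import dataclass

@dataclass
class Violation:
    heuristic: str    # e.g. "H4 Consistency"
    severity: int     # 0 (not a problem) .. 4 (critical)
    fixability: int   # 1 (nearly impossible) .. 4 (trivial)
    description: str
    fix: str = ""

def triage(violations):
    """Most severe first; among equal severities, easiest fixes first."""
    return sorted(violations, key=lambda v: (-v.severity, -v.fixability))
```

One record per problem (never merged) preserves the "list each problem separately" advice from earlier while still letting the team prioritize.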

SLIDE 64

Results of Using HE

Discount: benefit-cost ratio of 48

cost was $10,500 for benefit of $500,000
how might we calculate this value?
in-house → productivity; open market → sales

Single evaluator achieves poor results

only finds 35% of usability problems

5 evaluators find ~75% of usability problems
why not more evaluators?

Nielsen, 1994

SLIDE 65

Decreasing Returns

[Figure: problems found and benefits/cost ratio as a function of number of evaluators; both curves show decreasing returns]

Nielsen, 1994
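The decreasing-returns curve is often modeled as the proportion of problems found by n independent evaluators, 1 - (1 - λ)^n, where λ is a single evaluator's hit rate. λ varies by study (the 35% figure on the previous slide is one estimate), so the default below is purely illustrative:

```python
def proportion_found(n_evaluators, lam=0.3):
    """Nielsen-style model: each evaluator independently finds a given
    problem with probability lam, so n evaluators miss it with
    probability (1 - lam)**n."""
    return 1 - (1 - lam) ** n_evaluators
```

Because the marginal gain of each added evaluator shrinks, the benefit/cost ratio peaks at a small team, which is where the 3-to-5 recommendation comes from.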

SLIDE 66

Alternative Inspection-Based Methods

Cognitive Walkthrough

Surfaces different types of usability problems
Consider as a complement to heuristic evaluation

Action Analysis

Low-level modeling of expert performance
Be aware of GOMS, but may never encounter it

SLIDE 67

Cognitive Walkthrough

Evaluation method based on:

A person works through an interface in an exploratory manner
A person has goals
The person is applying means-ends reasoning to work out how to accomplish these goals

Evaluation by an expert, who goes through a task while simulating this cognitive process

SLIDE 68

Preparation: Need Four Things

1) Person description, including level of experience and any assumptions made by the designer
2) System description (e.g., paper prototype)
3) Task description, specifying the task the expert has to carry out, from a person’s point of view
4) Action sequence describing the system display and the actions needed to complete the task. One system display and one action together are one step.

SLIDE 69

Cognitive Walkthrough Process

Designer/developer prepares the required documents described on previous slide
Gives these documents to the usability expert
Expert reads the descriptions, carries out the task by following the action list
At each step in action list, asks four questions
Record problems similar to heuristic evaluation

SLIDE 70

Believability

1) Will the person be trying to produce whatever effect the action has?
2) Will the person be able to notice that the correct action is available?
3) Once the person finds the correct action at the interface, will they know that it is the right one for the effect they are trying to produce?
4) After the action is taken, will the person understand the feedback given?

SLIDE 71

Action Analysis / Cognitive Modeling

GOMS: Goals, Operators, Methods, Selection

Developed by Card, Moran and Newell
Walk through sequence of steps
Assign each an approximate time duration
Sum to estimate overall performance time

  • 1. Select sentence

Reach for mouse        H   0.40
Point to first word    P   1.10
Click button down      K   0.60
Drag to last word      P   1.20
Release                K   0.60
Total                      3.90 secs
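The arithmetic in this example is just a sum over per-step operator times; a sketch using the slide's values (the step list and function name are only for illustration):

```python
# (operator, seconds) per step for "Select sentence", values from the slide:
# H = home hands on device, P = point with mouse, K = press/release button
select_sentence = [
    ("H", 0.40),  # reach for mouse
    ("P", 1.10),  # point to first word
    ("K", 0.60),  # click button down
    ("P", 1.20),  # drag to last word
    ("K", 0.60),  # release
]

def klm_estimate(steps):
    """Sum per-operator durations to estimate expert performance time."""
    return sum(seconds for _op, seconds in steps)
```

Changing the design changes the operator sequence, so two designs can be compared by comparing their estimated totals without running a study.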

SLIDE 72

Inspection vs. Usability Testing

Inspection

Is much faster
Does not require interpreting participant actions
May miss problems or find false positives

Usability testing

More accurate, by definition
Accounts for actual people and tasks

One approach is to alternate between them

Find different problems, conserve participants
