SLIDE 1

modality, heuristics, and studies, oh my!

July 13, 2015 Valkyrie Savage

cs160.valkyriesavage.com

SLIDE 2

let’s talk about PRG02

SLIDE 3

final presentations: Wednesday 12 August

SLIDE 4

Anand Varma

photographer for National Geographic

SLIDE 5

Angel Jimenez Lopez

watch app designer for Yelp

SLIDE 6

Modes

mattmendoza on flickr

SLIDE 7

Modes: Definition

The same user actions have different effects in different situations. Examples?
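One classic answer is a vi-style editor. A minimal sketch of what makes an interface modal (the editor model and key bindings here are illustrative, loosely inspired by vi, not code from the lecture): the same keystroke does different things depending on hidden state.

```python
# Sketch of a modal interface: the same user action has different
# effects depending on the current mode, as in vi-style editors.
# The editor model below is a hypothetical illustration.

class ModalEditor:
    def __init__(self, text):
        self.text = list(text)
        self.mode = "command"  # hidden state: "command" or "insert"

    def press(self, key):
        if self.mode == "command":
            if key == "i":        # "i" switches to insert mode
                self.mode = "insert"
            elif key == "x":      # in command mode, "x" deletes a character
                if self.text:
                    self.text.pop()
        else:                     # insert mode: every key is literal text
            self.text.append(key)

    def contents(self):
        return "".join(self.text)

ed = ModalEditor("ab")
ed.press("x")   # command mode: deletes -> "a"
ed.press("i")   # enter insert mode
ed.press("x")   # same key now types "x" -> "ax"
```

The error-proneness comes from the hidden `mode` field: unless the interface makes it the user's locus of attention, pressing "x" is a gamble.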

SLIDE 8

Modes: Examples

SLIDE 9
SLIDE 10
SLIDE 11
SLIDE 12

Modal Dialogs

  • Intended to prevent errors
  • Often just ignored
  • Habituation

SLIDE 13

Focus Stealing

SLIDE 14

Is this Modal?

Raskin says no, as long as the task that has your locus of attention reinforces what mode you’re in.

SLIDE 15

A Formal Definition of Modes

A human-machine interface is modal with respect to a given gesture when (1) the current state of the interface is not the user's locus of attention and (2) the interface will execute one among several different responses to the gesture, depending on the system's current state.

  • Jef Raskin, The Humane Interface
SLIDE 16

Using Modes in Interfaces

When are they useful?
  • Temporarily restrict users’ actions
  • When logical, clearly visible, and easily switchable
  • Drawing with paintbrush vs. pencil

SLIDE 17

Using Modes in Interfaces

Why can they be problematic?
  • Big memory burden
  • Source of many serious errors

SLIDE 18

Using Modes in Interfaces

How can these problems be fixed?
  • Don’t use modes: redesign the system to be modeless
  • Integrate modality tightly into the workflow
  • Display modal state prominently

Wikimedia Commons User Chicoutimi

SLIDE 19

Redesigning to Avoid Modes

Setting the time on a clock

Modal

SLIDE 20

Redesigning to Avoid Modes

Modeless

Setting the time on a clock

SLIDE 21

Quasimodes

Set and hold a mode via conscious, continuous action

  • Shift key to capitalize (vs. Caps Lock)
  • Foot pedal that must remain pressed
  • Pull-down menus
  • Muscle tension reminds users they are holding a mode

Also known as “spring-loaded modes”
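A quasimode can be sketched as a key handler whose mode lives only in the physical state of a held key (the event model below is a hypothetical illustration, not a real toolkit API):

```python
# Sketch of a quasimode ("spring-loaded mode"): the mode is active only
# while the user actively holds a key, so muscle tension keeps them
# aware of it. The event model here is a hypothetical illustration.

class QuasimodeKeyboard:
    def __init__(self):
        self.shift_held = False
        self.typed = []

    def key_down(self, key):
        if key == "shift":
            self.shift_held = True   # mode on only while the key is held
        else:
            self.typed.append(key.upper() if self.shift_held else key)

    def key_up(self, key):
        if key == "shift":
            self.shift_held = False  # releasing the key exits the mode

kb = QuasimodeKeyboard()
kb.key_down("a")
kb.key_down("shift")
kb.key_down("b")    # capitalized while shift is held
kb.key_up("shift")
kb.key_down("c")
"".join(kb.typed)   # "aBc"
```

Contrast this with Caps Lock, where the equivalent of `shift_held` is a toggle that persists invisibly after the user's hand leaves the key.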

SLIDE 22
SLIDE 23

Usability Heuristics

ciarlegio on flickr

SLIDE 24

Usability Heuristics

“Rules of thumb” describing features of usable systems

  • Can be used as design principles
  • Can be used to evaluate a design

Example: Minimize users’ memory load

SLIDE 25

Heuristic Evaluation

Developed by Jakob Nielsen (1994)
  • Small number (3-5) of evaluators (experts) examine the UI
  • Evaluators check heuristic compliance
  • Different evaluators find different problems
  • Evaluators only communicate afterwards to aggregate findings
  • Designers use violations to redesign/fix problems
  • Performed on a working UI or on sketches

SLIDE 26

Nielsen’s Ten Heuristics

H-1: Visibility of system status
H-2: Match system and real world
H-3: User control and freedom
H-4: Consistency and standards
H-5: Error prevention
H-6: Recognition rather than recall
H-7: Flexibility and efficiency of use
H-8: Aesthetic and minimalist design
H-9: Help users recognize, diagnose, recover from errors
H-10: Help and documentation

SLIDE 27

H-1: Visibility of system status

Keep users informed about what’s going on. If system response time is:
  • 0.1 sec: no special indicators needed
  • 1.0 sec: user tends to lose track of data
  • 10 sec: max duration for the user to stay focused on the action
Short delays: hourglass / beach ball. Long delays: use percent-done progress bars; overestimating is usually better.
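These thresholds can be sketched as a simple feedback-selection rule (the function name and return labels are illustrative, not from any real toolkit):

```python
# Sketch: choosing progress feedback from an estimated delay, following
# the 0.1 s / 10 s response-time thresholds above. The function name
# and labels are hypothetical, not a real toolkit API.

def choose_feedback(estimated_seconds):
    if estimated_seconds <= 0.1:
        return "none"          # feels instantaneous; no indicator needed
    elif estimated_seconds <= 10.0:
        return "busy-cursor"   # short delay: hourglass / beach ball
    else:
        return "progress-bar"  # long delay: percent-done progress bar

choose_feedback(0.05)   # "none"
choose_feedback(30.0)   # "progress-bar"
```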

SLIDE 28

H-1: Visibility of system status

Users should always be aware of what is going on
  • So that they can make informed decisions
  • Provide redundant information

SLIDE 29

H-2: Match System & World

  • Speak the users’ language
  • Follow real-world conventions
  • Pay attention to metaphors
  • Bad example: Mac desktop eject (dragging a disk to the Trash to eject it)
SLIDE 30

H-3: User control & freedom

Users don’t like to be trapped! Strategies:
  • Cancel button (or Esc key) for dialog boxes; make the cancel button responsive!
  • Universal undo

SLIDE 31

H-3: User control & freedom

  • “Exits” for mistaken choices: undo, redo
  • Don’t force the user down fixed paths
  • Wizards: must respond before going to the next step; good for infrequent tasks (e.g., network setup) & beginners; not good for common tasks (zip/unzip)

slide-32
SLIDE 32

H-4: Consistency and standards

Be internally consistent

SLIDE 33

H-4: Consistency and Standards

Obey conventions of your genre

http://www.useit.com/alertbox/application-mistakes.html

SLIDE 34

H-5: Error Prevention

Eliminate error-prone conditions, or check for them and ask for confirmation

SLIDE 35

H-5: Error Prevention

Aid users with specifying correct input

SLIDE 36

H-5: Error Prevention

Don’t allow incorrect input

Examples: MIT Scratch, Lego Mindstorms

SLIDE 37

Preventing Errors: Error Types

  • Slips: the user commits an error during execution of a correct plan. Examples: typos; habitually answering “no” to a dialog box; forgetting which mode the application is in.
  • Mistakes: the user correctly executes a flawed mental plan. Usually the result of a flawed mental model, and harder to guard against.

SLIDE 38

H-6: Recognition over Recall

SLIDE 39

H-6: Recognition over Recall

Minimize the user’s memory load by making objects, actions, and options visible.
SLIDE 40

H-7: flexibility & efficiency of use

  • Accelerators for experts: gestures, keyboard shortcuts
  • Allow users to tailor frequent actions: macros, toolbars

SLIDE 41

H-8: aesthetic & minimalist design

http://4sysops.com/wp-content/uploads/2006/04/Bulk_Rename_Utility.gif

SLIDE 42

H-8: aesthetic & minimalist design

Avoid irrelevant information in dialogues

SLIDE 43

H-8: aesthetic & minimalist design

  • Present information in natural order
  • Occam’s razor: remove or hide irrelevant/rarely needed info (it competes with important info)
  • Pro: Palm Pilot. Against: dynamic menus
  • Use windows frugally: avoid complex window management

From Cooper’s “About Face”

SLIDE 44

H-8: aesthetic & minimalist design

SLIDE 45
SLIDE 46

H-9: Help users recognize, diagnose, & recover from errors

SLIDE 47

Good Error Messages

From Cooper’s “About Face 2.0”

SLIDE 48

H-9: Help users recognize, diagnose, & recover from errors

SLIDE 49

H-10: help & documentation

Help should:
  • be easy to search
  • focus on the user’s task
  • list concrete steps to carry out
  • not be too long

SLIDE 50

Types of Help

  • Tutorial and/or getting-started manuals: present the conceptual model; basis for successful explorations; provide on-line tours and demos; demonstrate basic features
  • Reference manuals: designed with experts in mind
  • Reminders: short reference cards, keyboard templates, tooltips…

SLIDE 51

Types of Help

Context sensitive help Search

SLIDE 52

New User Guides

First-run experience
How do you get it back once you’ve told it you’re done?

SLIDE 53

Running a Heuristic Evaluation

foodclothingshelter on flickr

SLIDE 54

Phases of Heuristic Eval. (1-2)

1) Pre-evaluation training
  • Provide the evaluator with domain knowledge if needed
2) Evaluation
  • Individuals evaluate the interface, then aggregate results
  • Compare interface elements with heuristics
  • Work in 2 passes: first pass to get a feel for flow and scope; second pass to focus on specific elements
  • Each evaluator produces a list of problems: explain why, with reference to a heuristic or other information; be specific and list each problem separately

SLIDE 55

Phases of Heuristic Eval. (3-4)

3) Severity rating
  • Establishes a ranking between problems: cosmetic, minor, major, and catastrophic
  • First rate individually, then as a group
4) Debriefing
  • Discuss outcome with the design team
  • Suggest potential solutions
  • Assess how hard things are to fix

SLIDE 56

Example

  • Typography uses a mix of upper/lower case formats and fonts
  • Violates “Consistency and standards” (H-4)
  • Slows users down
  • Fix: pick a single format for the entire interface
  • Probably wouldn’t be found by user testing

SLIDE 57

Severity Rating

  • Used to allocate resources to fix problems
  • Estimates the need for more usability efforts
  • Combination of frequency, impact, and persistence
  • Should be calculated after all evaluations are in
  • Should be done independently by all judges

SLIDE 58

Levels of Severity

0 - don’t agree that this is a usability problem
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; imperative to fix
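Since judges rate independently and then combine, a group severity must be computed somehow. A minimal sketch on this 0-4 scale (taking the median is one reasonable aggregation choice, not a rule from the lecture):

```python
# Sketch: aggregating independent severity ratings (0-4 scale above)
# from several judges. The median is one reasonable aggregate; the
# lecture only requires that judges rate independently first.

from statistics import median

def aggregate_severity(ratings_by_judge):
    """ratings_by_judge: {problem_id: [per-judge ratings, 0-4]}"""
    return {
        problem: median(ratings)
        for problem, ratings in ratings_by_judge.items()
    }

ratings = {
    "inconsistent-save-label": [3, 3, 4],  # judges mostly agree: major
    "font-mismatch": [1, 0, 2],            # cosmetic-to-minor
}
aggregate_severity(ratings)
# {'inconsistent-save-label': 3, 'font-mismatch': 1}
```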

SLIDE 59

Severity Ratings Example

  • 1. [H-4 Consistency] [Severity ?]

The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.

SLIDE 60

Severity Ratings Example

  • 1. [H-4 Consistency] [Severity 3]

The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.

SLIDE 61

Debriefing

Conduct with evaluators, observers, and development team members

  • Discuss general characteristics of UI
  • Suggest improvements to address major usability problems
  • Development team rates how hard things are to fix
  • Make it a brainstorming session
SLIDE 62

Pros & Cons of Heuristic Evaluation

SLIDE 63

HE vs. User Testing

  • HE is much faster: 1-2 hours per evaluator vs. days to weeks
  • HE doesn’t require interpreting users’ actions
  • User testing is far more accurate: it takes into account actual users and tasks; HE may miss problems & find “false positives”
  • Good to alternate between HE & user-based testing: they find different problems, and you don’t waste participants

SLIDE 64

Why Multiple Evaluators?

  • No single evaluator finds every problem
  • Good evaluators find both easy & hard ones

SLIDE 65

Number of Evaluators

  • A single evaluator achieves poor results: finds only 35% of usability problems
  • 5 evaluators find ~75% of usability problems
  • Why not more evaluators (10? 20?): adding evaluators costs more, and many evaluators won’t find many more problems
  • But it always depends on the market for the product: popular products = high support cost for small bugs
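The diminishing returns behind these figures are usually modeled with the Nielsen-Landauer formula, found(i) = N * (1 - (1 - L)^i), where N is the total number of problems and L is the share a single evaluator finds. A quick sketch (L = 0.35 is taken from the single-evaluator figure above; the model is approximate and real values vary by project):

```python
# Sketch of the Nielsen-Landauer model of problem discovery:
# proportion found by i evaluators = 1 - (1 - lam)**i, where lam is
# the share of problems one evaluator finds. lam = 0.35 is taken from
# the single-evaluator figure above; real values vary by project.

def proportion_found(i, lam=0.35):
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10):
    print(i, round(proportion_found(i), 2))
```

Each additional evaluator mostly rediscovers problems already found, so the marginal gain shrinks while the cost per evaluator stays constant.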

SLIDE 66

Decreasing Returns

[Graphs: problems found, and benefits/cost, vs. number of evaluators]

Caveat: graphs are for one specific example!

SLIDE 67

Summary

  • Heuristic evaluation is a discount method
  • Have evaluators go through the UI twice
  • Ask them to see if it complies with heuristics
  • Note where it doesn’t, and say why
  • Have evaluators independently rate severity
  • Combine the findings from 3 to 5 evaluators
  • Discuss problems with the design team
  • Cheaper alternative to user testing; finds different problems, so good to alternate

SLIDE 68

break!

SLIDE 69

assignments!

collegedegrees360 on flickr

SLIDE 70

heuristic evaluations

apply to an interface of your choice

SLIDE 71

low-fi user test

go forth, and ask users!

SLIDE 72

Usability Testing Methods

jepoirrier on flickr

SLIDE 73

Genres of assessment

  • Automated: usability measures computed by software
  • Inspection: based on the skills and experience of evaluators
  • Formal: models and formulas to calculate measures
  • Empirical: usability assessed by testing with real users

SLIDE 74

Empirical Testing is Costly

User studies are very expensive: you need to schedule (and normally pay) many subjects, and studies may take many hours of the evaluation team’s time. A user test can easily cost tens of thousands of dollars.

SLIDE 75

“Discount Usability” Techniques

  • Cheap: no special labs or equipment needed; the more careful you are, the better it gets
  • Fast: on the order of 1 day to apply (even basic usability testing may take two to four weeks)
  • Easy to use: can be taught in 2-4 hours

SLIDE 76

“Discount Usability” Techniques

  • Heuristic Evaluation: assess an interface based on a predetermined list of criteria
  • Cognitive Walkthroughs: put yourself in the shoes of a user; like a code walkthrough
  • Other, non-inspection techniques are on the rise, e.g., online remote experiments with Mechanical Turk

SLIDE 77

Cognitive Walkthrough

Given an interface prototype or specification, you need:
  • A detailed task with a concrete goal, ideally motivated by a scenario
  • Action sequences for the user to complete the task
Ask the following questions for each step:
  • Will the users know what to do?
  • Will the user notice that the correct action is available?
  • Will the user interpret the application feedback correctly?
Record: what would cause problems, and why?

From: Preece, Rogers, Sharp – Interaction Design
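The bookkeeping of a walkthrough can be sketched as data: one record per step and question for evaluators to fill in. The three questions follow the slide; the record structure itself is an illustrative choice, not a prescribed format.

```python
# Sketch: recording a cognitive walkthrough. The three questions per
# step come from the slide above; the record structure is illustrative.

WALKTHROUGH_QUESTIONS = [
    "Will the users know what to do?",
    "Will the user notice that the correct action is available?",
    "Will the user interpret the application feedback correctly?",
]

def walk_through(action_sequence):
    """Yield one record per (step, question) pair for evaluators to fill in."""
    for step in action_sequence:
        for q in WALKTHROUGH_QUESTIONS:
            yield {"step": step, "question": q, "problem": None, "why": None}

records = list(walk_through(["open settings", "tap Wi-Fi", "enter password"]))
len(records)   # 9 records: 3 steps x 3 questions
```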

SLIDE 78

Empirical Assessment: Qualitative

Qualitative: what we’ve been doing so far
  • Contextual Inquiry: try to understand the user’s tasks and conceptual model
  • Usability Studies: look for critical incidents in the interface
Qualitative methods help us:
  • Understand what is going on
  • Look for problems
  • Roughly evaluate the usability of an interface

SLIDE 79

Running a Qualitative Study

  • Define common tasks: easy, medium, hard
  • Recruit participants
  • Ask participants to complete tasks while thinking aloud
  • Note critical incidents during the course of the study
  • Audio record for quotes
  • Above all else: do not help the user & keep them talking

SLIDE 80

Output from a Qualitative Study

A usability report:
  • Similar to the output from a heuristic evaluation
  • Sections: Methodology & Participant Demographics; General reactions from users; Positive & negative comments (exact quotes from the audio recording); List of usability issues with severity ratings, categories, and images

SLIDE 81

Empirical Assessment: Quantitative

Quantitative: use to reliably measure some aspect of the interface
  • Compare two or more designs on a measurable aspect
  • Contribute to the theory of Human-Computer Interaction
Approaches: collect and analyze user events that occur in natural use, or via controlled experiments
Examples of measures: time to complete a task, average number of errors on a task, users’ ratings of an interface
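A minimal sketch of comparing two designs on one such measure (the task times below are made-up illustration data, not results from any study):

```python
# Sketch: comparing two designs on a quantitative measure (time to
# complete a task, in seconds). The data are made up for illustration;
# a real study would also test for statistical significance.

from statistics import mean, stdev

design_a_times = [42.0, 38.5, 45.2, 40.1, 39.8]   # hypothetical trials
design_b_times = [31.4, 35.0, 29.8, 33.2, 30.6]

for name, times in [("A", design_a_times), ("B", design_b_times)]:
    print(f"design {name}: mean={mean(times):.1f}s sd={stdev(times):.1f}s")
```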

SLIDE 82

Comparison

  • Qualitative studies: faster, less expensive; especially useful in early stages of the design cycle; often all that is needed in industry
  • Quantitative studies: reliable, repeatable results (scientific method); the best studies produce generalizable results

SLIDE 83

Research Ethics

michaelgallagher on flickr

SLIDE 84

The Participants’ Standpoint

Testing is a distressing experience
  • Pressure to perform
  • Feeling of inadequacy
  • Looking like a fool in front of your peers, your boss, …

(from “Paper Prototyping” by Snyder)

SLIDE 85

The Three Belmont Report Principles

  • Respect for Persons: have a meaningful consent process; give information, and let prospective subjects freely choose to participate
  • Beneficence: minimize the risk of harm to subjects, maximize potential benefits
  • Justice: use fair procedures to select subjects (balance burdens & benefits)
To ensure adherence to these principles, most schools require Institutional Review Board approval of research involving human subjects.

SLIDE 86

Treating Subjects With Respect

Follow human subject protocols:
  • Individual test results will be kept confidential
  • Users can stop the test at any time
  • Users are aware of (and understand) the monitoring technique(s)
  • Their performance will not have implications on their life
  • Records will be made anonymous
Use a standard informed consent form, especially for quantitative tests
Be aware of legal requirements

SLIDE 87

Managing Subjects

  • Don’t waste users’ time: use pilot tests to debug experiments, questionnaires, etc.; have everything ready before users show up
  • Make users comfortable: keep a relaxed atmosphere; allow for breaks; pace tasks correctly; stop the test if it becomes too unpleasant

SLIDE 88

Conducting the Experiment

Before the experiment:
  • Have them read and sign the consent form
  • Explain the goal of the experiment in a way accessible to users
  • Be careful about the demand characteristic (participants biased towards the experimenter’s hypothesis)
  • Answer questions
During the experiment:
  • Stay neutral
  • Never indicate displeasure with users’ performance
After the experiment:
  • Debrief users (inform users about the goal of the experiment)
  • Answer any questions they have

SLIDE 89

Privacy and Confidentiality

Privacy: having control over the extent, timing, and circumstances of sharing oneself with others.

Confidentiality: the treatment of information that an individual has disclosed with the expectation that it will not be divulged.

Examples where privacy could be violated or confidentiality may be breached in HCI studies?

SLIDE 90

Ethics: The Milgram Experiment

  • Teacher & Learner roles, where the learner was actually a confederate of the experimenter
  • Participants asked to administer shocks to the learner, up to and including what they believed to be lethal electrocutions
  • Some participants were not properly debriefed and were allowed to leave believing they had killed a fellow participant

CC by Wikimedia Commons User Fred the Oyster

SLIDE 91

Ethics: Stanford Prison Experiment

  • 1971 experiment by Phil Zimbardo at Stanford
  • 24 participants: half prisoners, half guards ($15 a day)
  • Basement of the Stanford Psychology building turned into a mock prison
  • Guards given batons, military-style uniforms, mirrored glasses, …
  • Prisoners wore smocks (no underwear), thong sandals, pantyhose caps

[from Wikipedia]

SLIDE 92

https://www.youtube.com/watch?v=sZwfNs1pqG0

SLIDE 93

Ethics

  • Was it useful? “…that’s the most valuable kind of information that you can have and that certainly a society needs it” (Zimbardo)
  • Was it ethical?
  • Could we have gathered this knowledge by other means?

http://www.prisonexp.org/slide-42.htm

SLIDE 94

Ethics: Facebook Emotional Manipulation Study

  • Facebook manipulated the Timelines of ~700,000 users to display more positive or negative content
  • Did not obtain proper informed consent; but was it needed?
  • Is this type of study different from a standard A/B usability test?
  • A former 160 instructor’s opinion on the matter: http://blog.kinoma.com/2014/06/facebooks-human-lab-rats-the-ethics-of-experimentation-without-informed-consent/

SLIDE 95

If you want to learn more…

Online human subjects certification courses:
  • E.g., http://phrp.nihtraining.com/users/login.php
The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research
  • 1979 government report that describes the basic ethical principles that should underlie the conduct of research involving human subjects
  • http://ohsr.od.nih.gov/guidelines/belmont.html

SLIDE 96

bring laptops for studio tomorrow! (and do readings!)

SLIDE 97

modality, heuristics, and studies, oh my!

July 13, 2015 Valkyrie Savage

cs160.valkyriesavage.com