CMSC 20370/30370 Winter 2020 Evaluation Qualitative Methods Case Study - PowerPoint PPT Presentation



SLIDE 1

CMSC 20370/30370 Winter 2020 Evaluation – Qualitative Methods Case Study: Underserved Users

Jan 15, 2020

SLIDE 2

Quiz Time (5-7 minutes).

Quiz on DreamGigs

Principles of Good Design

SLIDE 3

Administrivia

  • GP0 due on Friday
  • GP project website space

– Can host on personal web space offered by the department
– Once we know all the groups, we will start scheduling the project presentations and specify the allotted time per presentation
– You are expected to attend all project group presentation days

  • IA 2, a paired assignment, is due next Friday

– This is a design critique

SLIDE 4

Today’s Agenda

  • Evaluating your design/prototype/system

– Usability testing
– Inspection methods
– Qualitative techniques

SLIDE 5

USER-CENTERED DESIGN (diagram: USER NEEDS → DESIGN/PROTOTYPE → IMPLEMENT → EVALUATE)


SLIDE 8

Case Study: DreamGigs

  • Underserved job seekers
  • Based on interviews and a speed-dating study of 10 initial design concepts
  • Created and evaluated 3 versions of DreamGigs

– Usability tests
– Semi-structured interviews with job seekers and social workers

  • Used an HCI empowerment framework to see how the prototype facilitates job seekers' empowerment

SLIDE 9

What do underserved job seekers need?

SLIDE 10

How do job seekers react to DreamGigs?

SLIDE 11

Does DreamGigs empower job seekers?

SLIDE 12

These questions require evaluation

SLIDE 13

TWO KINDS OF EVALUATION IN UCD (diagram: USER NEEDS → DESIGN/PROTOTYPE → IMPLEMENT → EVALUATE)

SLIDE 14

TWO KINDS OF EVALUATION IN UCD: FORMATIVE EVALUATION (diagram: USER NEEDS → DESIGN/PROTOTYPE → IMPLEMENT → EVALUATE)

SLIDE 15

TWO KINDS OF EVALUATION IN UCD: SUMMATIVE EVALUATION (diagram: USER NEEDS → DESIGN/PROTOTYPE → IMPLEMENT → EVALUATE)

SLIDE 16

Two kinds of evaluation

  • Formative

– Helps us understand the problem and our users to inform our design

  • Summative

– Helps us understand how well our design works and how to refine it

SLIDE 17

Formative evaluation

  • What are the top usability issues?
  • What works well?
  • What does not work as well?
  • What are common errors?
  • Is the design improving over time?
SLIDE 18

Summative

  • How does our design compare to similar products?
  • Is this design better than before?
  • How usable is this design?
  • What could be improved in this design?
SLIDE 19

What will evaluation tell us?

  • Will the user like the product?
  • Is this product more efficient than past products?
  • How does this product compare to others?
  • What are the most pressing usability issues with this product?
  • What is the overall user experience when using the product? (Think of unboxing.)

SLIDE 20

Evaluation Planning

  • Determine the goals.
  • Explore the questions.
  • Choose the evaluation methods.
  • Identify the practical issues.
  • Decide how to deal with the ethical issues.
  • Evaluate, analyze, interpret and present the data.

SLIDE 21

Case Study: DreamGigs

  • Any ethical issues here in terms of evaluation?

SLIDE 22

Example Planning Questions

  • Why are we evaluating the design?
  • Who are the users/participants?
  • How many participants are needed?
  • What is the budget/timeline/resources?
  • What evaluation technique should we use?
  • What kind of data should we collect?
SLIDE 23

How do you collect data?

  • Observe users
  • Ask users for their opinions

– Interviews/surveys/think-alouds/usage logs

  • Ask experts for their opinions
  • Test users' performance
SLIDE 24

So how do we evaluate our designs?

SLIDE 25

First we need metrics for evaluation

SLIDE 26

Usability metrics

  • No. of keystrokes
  • Time to complete a task
  • User satisfaction
  • Error rates
  • Physiological measures
  • No. of mouse clicks, etc.

A usability metric reveals something about the interaction between the user and the system.
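Several of these metrics can be computed straight from an interaction log. A minimal sketch, assuming a hypothetical log format in which each event records a timestamp (seconds since task start) and an event kind (the `Event` class and `usability_metrics` helper are illustrative, not from the lecture):

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float   # seconds since task start
    kind: str  # "keystroke", "click", "error", or "task_done"

def usability_metrics(events):
    """Derive simple usability metrics from one participant's event log."""
    keystrokes = sum(1 for e in events if e.kind == "keystroke")
    clicks = sum(1 for e in events if e.kind == "click")
    errors = sum(1 for e in events if e.kind == "error")
    done = [e.t for e in events if e.kind == "task_done"]
    task_time = done[0] if done else None  # None = task never completed
    return {"keystrokes": keystrokes, "clicks": clicks,
            "errors": errors, "task_time_s": task_time}

log = [Event(1.2, "click"), Event(3.4, "keystroke"), Event(5.0, "error"),
       Event(9.8, "click"), Event(12.5, "task_done")]
print(usability_metrics(log))
# {'keystrokes': 1, 'clicks': 2, 'errors': 1, 'task_time_s': 12.5}
```

Physiological measures and satisfaction come from instruments and questionnaires rather than logs, which is why they are absent here.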

SLIDE 27

What are guiding principles for usability metrics?

  • Effectiveness

– How well can the user complete the task?

  • Efficiency

– What is the amount of effort to complete the task?

  • Satisfaction

– How satisfied/dissatisfied was the user while completing the task?

  • Safety
  • Learnability
  • Memorability
SLIDE 28

Case Study: DreamGigs

  • Initially performed a usability test with 5 social workers and used a "think-aloud" study
  • This means they asked users to use the storyboard and say what they are thinking at each step
  • Need to capture all this data
  • Also usually use a post-study questionnaire
  • Is this formative or summative?
  • What metrics could they have used?
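The slides do not say which post-study questionnaire was used; one common choice is the System Usability Scale (SUS), a standard 10-item instrument whose scoring can be sketched as follows:

```python
def sus_score(responses):
    """Score a 10-item System Usability Scale (SUS) questionnaire.

    responses: ten integers in 1..5 (1 = strongly disagree, 5 = strongly agree).
    Odd-numbered items are positively worded (contribute response - 1);
    even-numbered items are negatively worded (contribute 5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral answers land exactly at the middle of the scale.
print(sus_score([3] * 10))  # 50.0
```

Note the SUS score is not a percentage: scores above roughly 68 are conventionally read as above-average usability.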
SLIDE 29

Basic Usability/Lab Study

SLIDE 30

What are user experience metrics?

  • Pleasurable
  • Rewarding
  • Fun
  • Provocative
  • Empowering
  • Enlightening
SLIDE 31

Case Study: DreamGigs

  • Not just whether the tool will serve a purpose but also about…
  • How participants felt using it

– E.g., seeing that you can explore jobs that you did not think of for your skill set – "expanding your horizons"

  • Used the HCI empowerment framework for this part

SLIDE 32

Other Evaluation Methods

  • Inspection based methods

– Based on skills and expertise of evaluators (no users)

  • Empirical methods

– Test with real users

SLIDE 33

Inspection Methods

  • Known as expert review or "discount usability" methods
  • 1. Heuristic evaluation (developed by Jakob Nielsen)
  • 2. Cognitive walkthroughs
SLIDE 34

Heuristic evaluation

  • Assess interface based on predetermined criteria
  • Have a small set of evaluators examine the interface and judge compliance with recognized usability principles
  • Different evaluators find different problems
  • Aggregate findings
  • Use findings to fix issues/redesign
  • Can be used throughout the user-centered design process
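The "aggregate findings" step can be sketched in code. This is an illustrative example, not part of the lecture: each finding records which evaluator reported it, the heuristic violated, a short problem description, and a severity rating (a common convention is 1 = cosmetic to 4 = catastrophic); duplicates are merged and ranked by how many evaluators found them and by worst reported severity.

```python
from collections import defaultdict

# Hypothetical findings: (evaluator, heuristic, problem, severity 1-4)
findings = [
    ("A", "Visibility of system status", "No feedback after saving", 3),
    ("B", "Visibility of system status", "No feedback after saving", 4),
    ("B", "Consistency and standards", "Two different icons for 'delete'", 2),
    ("C", "Visibility of system status", "No feedback after saving", 3),
]

def aggregate(findings):
    """Merge duplicate problems across evaluators and rank them."""
    merged = defaultdict(lambda: {"evaluators": set(), "severity": 0})
    for evaluator, heuristic, problem, severity in findings:
        entry = merged[(heuristic, problem)]
        entry["evaluators"].add(evaluator)
        entry["severity"] = max(entry["severity"], severity)
    # Most widely reported, most severe problems first
    return sorted(merged.items(),
                  key=lambda kv: (len(kv[1]["evaluators"]), kv[1]["severity"]),
                  reverse=True)

for (heuristic, problem), info in aggregate(findings):
    print(f"{len(info['evaluators'])} evaluator(s), severity {info['severity']}: "
          f"[{heuristic}] {problem}")
```

The ranked list is what the team would walk through when deciding which issues to fix or redesign first.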

SLIDE 35

Cognitive Walkthrough

  • Put yourself in the shoes of the user
  • Construct carefully designed tasks to perform on system specs/mockups
  • Walk through the activities required to go from one screen to another (cognitive and operational)
  • Review the actions needed for each task
  • Attempt to predict how users will behave and what they will encounter

SLIDE 36

Inspection Methods

  • Pros?

– Faster: HE takes 1-2 hours per evaluator vs. days/weeks
– HE and CW do not require interpreting user data
– Better than no evaluation

  • Cons?

– HE may miss problems and misidentify issues
– User testing is more accurate

slide-37
SLIDE 37

Field Study Example

SLIDE 38

SLIDE 39

SLIDE 40

SLIDE 41

SLIDE 42

Case Study: DreamGigs

  • Field study?

– Pros?
– Cons?

SLIDE 43

Other Popular Techniques

SLIDE 44

Case Study: DreamGigs

  • How else could they have evaluated DreamGigs?
  • What limitations are there to the work as it is presented?
  • Any other questions on DreamGigs?
SLIDE 45

Summary

  • Evaluation is a key part of user-centered design
  • Type depends on goals and system being tested
  • Choose depending on resources and what you want feedback on
  • Inspection methods do not require as many resources as user testing
  • They are "discount" usability methods but not without limitations
  • Field studies require a lot more work and use surveys, interviews, think-alouds, and logs to gather data

SLIDE 46

Coming up next class

  • Project team discussions
  • Come to class

– Ensure that your group checks in with one of the TAs on your project progress
– TAs have a short checkpoint form
– Q&A with TAs

  • Turn in GP0
SLIDE 47

Get in touch:

Office hours: Fridays 2-4pm (sign up in advance) or by appointment, JCL 355
Email: marshini@uchicago.edu