CMSC 20370/30370 Winter 2020
Evaluation: Qualitative Methods Case Study: Underserved Users
Jan 15, 2020
Quiz Time (5-7 minutes).
Quiz on DreamGigs
Principles of Good Design
Administrivia
- GP0 due on Friday
- GP project website space
– Can host on personal web space offered by the department
– Once we know all the groups, we will start scheduling the project presentations and specify the allotted time per presentation
– You are expected to attend all project group presentation days
- IA 2, a paired assignment, is due next Friday
– This is a design critique
Today’s Agenda
- Evaluating your design/prototype/system
– Usability testing
– Inspection methods
– Qualitative techniques
[Diagram: the user-centered design cycle: USER NEEDS → DESIGN/PROTOTYPE → IMPLEMENT → EVALUATE]
Case Study: DreamGigs
- Underserved job seekers
- Based on interviews and a speed-dating study of 10 initial design concepts
- Created and evaluated 3 versions of DreamGigs
– Usability tests
– Semi-structured interviews with job seekers and social workers
- Used an HCI empowerment framework to see how the prototype facilitates job seekers' empowerment
What do underserved job seekers need?
How do job seekers react to DreamGigs?
Does DreamGigs empower job seekers?
These questions require evaluation
[Diagram: two kinds of evaluation in UCD, formative and summative, placed on the user-centered design cycle]
Two kinds of evaluation
- Formative
– Helps us understand the problem and our users to inform our design
- Summative
– Helps us understand how well our design works and how to refine it
Formative evaluation
- What are the top usability issues?
- What works well?
- What does not work as well?
- What are common errors?
- Is the design improving over time?
Summative evaluation
- How does our design compare to similar products?
- Is this design better than before?
- How usable is this design?
- What could be improved in this design?
What will evaluation tell us?
- Will the user like the product?
- Is this product more efficient than past products?
- How does this product compare to others?
- What are the most pressing usability issues with this product?
- What is the overall user experience when using the product? (Think of unboxing)
Evaluation Planning
- Determine the goals.
- Explore the questions.
- Choose the evaluation methods.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, analyze, interpret, and present the data.
Case Study: DreamGigs
- Any ethical issues here in terms of evaluation?
Example Planning Questions
- Why are we evaluating the design?
- Who are the users/participants?
- How many participants are needed?
- What is the budget/timeline/resources?
- What evaluation technique should we use?
- What kind of data should we collect?
How do you collect data?
- Observe users
- Ask users for their opinions
– interviews/surveys/think aloud/log usage
- Ask experts for their opinions
- Test users' performance
So how do we evaluate our designs?
First we need metrics for evaluation
Usability metrics
- Number of keystrokes
- Time to complete a task
- User satisfaction
- Error rates
- Physiological measures
- Number of mouse clicks, etc.
A usability metric reveals something about the interaction between the user and the system
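Metrics like these are straightforward to compute from logged test sessions. A minimal sketch, assuming a hypothetical log format (the field names are illustrative, not from the DreamGigs study):

```python
from statistics import mean

# Hypothetical task-session logs from a usability test.
sessions = [
    {"completed": True,  "seconds": 95,  "errors": 1, "clicks": 22},
    {"completed": True,  "seconds": 120, "errors": 0, "clicks": 18},
    {"completed": False, "seconds": 300, "errors": 4, "clicks": 41},
    {"completed": True,  "seconds": 80,  "errors": 2, "clicks": 25},
]

# Effectiveness: what fraction of users completed the task?
completion_rate = mean(1.0 if s["completed"] else 0.0 for s in sessions)

# Efficiency: average time on task, counting completed attempts only.
time_on_task = mean(s["seconds"] for s in sessions if s["completed"])

# A common error metric: mean errors per attempt.
error_rate = mean(s["errors"] for s in sessions)

print(f"completion rate: {completion_rate:.0%}")   # 75%
print(f"time on task:    {time_on_task:.1f} s")    # 98.3 s
print(f"errors/attempt:  {error_rate:.2f}")        # 1.75
```

Satisfaction and other attitudinal measures need a questionnaire rather than logs.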
What are guiding principles for usability metrics?
- Effectiveness
– How well can the user complete the task?
- Efficiency
– What is the amount of effort to complete the task?
- Satisfaction
– How satisfied/dissatisfied was the user while completing the task?
- Safety
- Learnability
- Memorability
Case Study: DreamGigs
- Initially performed a usability test with 5 social workers using a “think-aloud” protocol
- This means they asked users to walk through the storyboard and say what they were thinking at each step
- Need to capture all this data
- Also usually use a post-study questionnaire
- Is this formative or summative?
- What metrics could they have used?
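One common choice for the post-study questionnaire is the System Usability Scale (SUS); the DreamGigs paper does not say which questionnaire was used, so this is only an illustrative sketch of the standard SUS scoring formula:

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring: 10 items rated 1 (strongly disagree)
    to 5 (strongly agree). Odd items contribute (r - 1), even items
    (5 - r); the 0-40 total is scaled by 2.5 to give 0-100."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# One hypothetical participant's responses:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

A single SUS number makes it easy to compare versions of a prototype across formative rounds.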
Basic Usability/Lab Study
What are user experience metrics?
- Pleasurable
- Rewarding
- Fun
- Provocative
- Empowering
- Enlightening
Case Study: DreamGigs
- Not just whether the tool serves a purpose, but also about…
- How participants felt using it
– E.g., seeing that you can explore jobs you had not considered for your skill set: “expanding your horizons”
- Used the HCI empowerment framework for this part
Other Evaluation Methods
- Inspection based methods
– Based on skills and expertise of evaluators (no users)
- Empirical methods
– Test with real users
Inspection Methods
- Also known as expert review or “discount” usability methods
- 1. Heuristic evaluation (developed by Jakob Nielsen)
- 2. Cognitive walkthroughs
Heuristic evaluation
- Assess the interface against predetermined criteria
- Have a small set of evaluators examine the interface and judge compliance with recognized usability principles
- Different evaluators find different problems
- Aggregate findings
- Use findings to fix issues/redesign
- Can be used throughout the user-centered design process
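The "aggregate findings" step can be as simple as pooling issue reports across evaluators and ranking by severity. A sketch with made-up findings, using Nielsen's 0-4 severity scale:

```python
from collections import defaultdict

# Hypothetical heuristic-evaluation findings: each evaluator reports
# (issue description, severity 0-4 on Nielsen's severity scale).
findings = {
    "evaluator_1": [("no undo on apply", 3), ("jargon in labels", 2)],
    "evaluator_2": [("no undo on apply", 4), ("low contrast text", 2)],
    "evaluator_3": [("jargon in labels", 1), ("no undo on apply", 3)],
}

# Pool severity ratings per unique issue.
ratings = defaultdict(list)
for issues in findings.values():
    for issue, severity in issues:
        ratings[issue].append(severity)

# Rank by mean severity so the team knows what to fix first.
ranked = sorted(ratings.items(),
                key=lambda kv: sum(kv[1]) / len(kv[1]),
                reverse=True)
for issue, sev in ranked:
    print(f"{issue}: mean severity {sum(sev) / len(sev):.1f}, "
          f"found by {len(sev)} of {len(findings)} evaluators")
```

Issues found independently by several evaluators and rated highly severe are the strongest redesign candidates.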
Cognitive Walkthrough
- Put yourself in the shoes of the user
- Construct carefully designed tasks to perform on the system spec/mockups
- Walk through the activities required to go from one screen to another (cognitive and operational)
- Review the actions needed for each task
- Attempt to predict how users will behave and what they will encounter
Inspection Methods
- Pros?
– Faster: HE takes 1-2 hours per evaluator vs. days/weeks
– HE and CW do not require interpreting user data
– Better than no evaluation
- Cons?
– HE may miss problems and misidentify issues
– User testing is more accurate
Field Study Example
Case Study: DreamGigs
- Field study?
– Pros?
– Cons?
Other Popular Techniques
Case Study: DreamGigs
- How else could they have evaluated DreamGigs?
- What limitations are there to the work as it is presented?
- Any other questions on DreamGigs?
Summary
- Evaluation is a key part of user-centered design
- The type depends on your goals and the system being tested
- Choose depending on resources and what you want feedback on
- Inspection methods do not require as many resources as user testing
- They are “discount” usability methods, but not without limitations
- Field studies require a lot more work and use surveys, interviews, think-alouds, and logs to gather data
Coming up next class
- Project team discussions
- Come to class
– Ensure that your group checks in with one of the TAs on your project progress
– TAs have a short checkpoint form
– Q&A with the TAs
- Turn in GP0