CS102: Monsoon 2015
CS 102 Human-Computer Interaction
Lecture 10: Heuristic Evaluation
Course updates
- Ubicomp reading assignment on course web site, due next Monday Oct 5, before class
- User understanding phase: writeup due Monday Oct 12
- Hardware requirements for projects? Be sure to review lecture slides, references
- Experiment for us to receive anonymous feedback: https://goo.gl/Vre9dT
Recap
Human factors in CS
A human-centered design mindset, and skills like user observations, interviews, studies, and prototype-and-feedback cycles, are essential
In the last 25 years, human factors have entered the loop in nearly all areas of CS, e.g. programming languages, computer architecture, operating systems, debugging, databases, networks, security, ML, …
And of course commercial products as well!
Heuristic evaluation
Heuristic evaluation
A “discount” usability engineering method
A small set of evaluators is independently asked to find “bugs” in interfaces (usually w.r.t. recognized usability principles)
Different evaluators tend to find different problems
Question: How many evaluators should we use?
Heuristic Evaluation of User Interfaces
Experimental setup
4 interfaces with known problems (First 2 were screenshots, last 2 were live systems)
Who finds what?
Savings experiment (37 subjects):
Problems found
Savings experiment (37 subjects):
Cost to benefit
How to conduct a Heuristic Evaluation
Takeaways
5 evaluators will find ~2/3 of the problems
Can use this technique even without a live system
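The "~2/3 with 5 evaluators" figure follows from Nielsen and Landauer's problem-discovery model: with i independent evaluators, the expected fraction of problems found is 1 - (1 - λ)^i, where λ is the chance that one evaluator finds a given problem. A minimal sketch, assuming λ ≈ 0.20 (a value chosen here only because it reproduces the slide's 2/3 figure; real projects vary widely):

```python
# Nielsen & Landauer problem-discovery model: fraction of all usability
# problems found by i independent evaluators is 1 - (1 - lam)**i.
# lam = 0.20 is an assumed per-evaluator discovery rate, picked to match
# the "5 evaluators find ~2/3 of problems" rule of thumb on this slide.
def fraction_found(i, lam=0.20):
    return 1 - (1 - lam) ** i

if __name__ == "__main__":
    for i in (1, 3, 5, 10):
        print(f"{i} evaluators: {fraction_found(i):.0%}")
```

The curve flattens quickly, which is the usual argument for stopping at around 5 evaluators: each additional evaluator mostly rediscovers known problems.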
Which heuristics?
Analysis of 249 usability problems in 11 projects
Each problem checked against 7 sets of guidelines (101 heuristics): Xerox Star, Macintosh, and Sunsoft usability guidelines + Molich and Nielsen, Holcomb and Thorp, Polson and Lewis, Carroll and Rosson
Enhancing the Explanatory Power of Usability Heuristics
Broad categories
What are the broad sets of usability problems? 7 buckets:
- Visibility of system status
- Match between system and real world
- User control & freedom
- Consistency and standards
- Error prevention
- Recognition over recall
- Flexibility and efficiency
Top 10 heuristics
Which heuristics catch how many problems?
Best practices
Typically 1-2 hour sessions
Either the evaluator writes up a report or is observed directly (it may be ok for the observer to provide help)
The evaluator goes through the interface several times and compares it with a list of heuristics (usability principles)
Heuristics based on general + domain-specific guidelines (e.g. derived from competitive analysis)
Nielsen’s 10 heuristics
Heuristic #1
Visibility of system status
Heuristic #1
General guideline: http://asktog.com/atc/principles-of-interaction-design/
Heuristic #2
Match between system and the real world
Example: a recent citizen’s survey (“How important is: …”)
Use an outsider to test the model of the real world
Heuristic #3
User control and freedom
e.g. cancel button
“please don’t close the browser window until the bank transaction is complete”
“You weren’t supposed to press that button on this page!”
Heuristic #4
Consistency and standards
Use the same L&F (look and feel)
Use consistent terminology
Predictability of actions, locations of widgets, etc.
Heuristic #5
Error prevention
Errors (wrong mental model) vs. slips (errors in execution)
Preventing slips:
- Add constraints that make it difficult to commit them
- Offer suggestions/auto-complete
- Use good defaults
- Be forgiving in syntax
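A minimal sketch of two of these tactics in input handling, using a hypothetical helper (parse_deadline and its accepted formats are illustrative, not part of any real API): a good default covers empty input, and forgiving syntax accepts several common date spellings instead of rejecting all but one.

```python
from datetime import date, datetime

# Hypothetical slip-tolerant date input: several accepted spellings
# (forgiving syntax) plus a fallback value (good default).
ACCEPTED_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%d %b %Y")

def parse_deadline(text, default=None):
    text = text.strip()
    if not text and default is not None:
        return default                      # good default for empty input
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.strptime(text, fmt).date()
        except ValueError:
            continue                        # try the next spelling
    raise ValueError(f"Could not read a date from {text!r}")
```

Constraints and auto-complete would live one level up, in the widget itself (e.g. a date picker that makes malformed input impossible to type).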
Heuristic #5
http://www.nngroup.com/articles/slips/
Heuristic #5
For client-server apps: client-side checking is ok
But in no case should security/data integrity depend on client-side checks!
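A sketch of the point above: the server re-validates every request even if a client-side form already checked it, because the client can be bypassed entirely. Transfer and validate_transfer are illustrative names assumed here, not a real banking API.

```python
from dataclasses import dataclass

@dataclass
class Transfer:
    amount: float
    to_account: str

def validate_transfer(t: Transfer, balance: float) -> list:
    """Server-side re-check; an empty list means the request is acceptable."""
    errors = []
    if t.amount <= 0:
        errors.append("amount must be positive")
    elif t.amount > balance:
        errors.append("insufficient funds")
    if not t.to_account.isdigit():
        errors.append("account number must be numeric")
    return errors
```

The client-side copy of these checks is purely a usability convenience (instant feedback); the server-side copy is the one that protects data integrity.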
Heuristic #6
Recognition rather than recall
Heuristic #6
Recognition rather than recall
Example: amazon.com
Heuristic #7
Flexibility and efficiency of use
Heuristic #8
Aesthetic and minimalist design
Good graphic design (catchall)
Keep instructions short
Heuristic #9
Help users recognize, diagnose, and recover from errors
Think from the user’s point of view: provide actionable advice
Restate exactly what happened
Shift blame to yourself
The error has to be understandable by the user and/or operator
Hide technical details (stack trace) until requested
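A minimal sketch of these guidelines as a message formatter (format_error is a hypothetical helper invented for illustration): it restates what happened, takes the blame, leads with actionable advice, and keeps the stack trace out of the default view.

```python
# Hypothetical error-message builder for Heuristic #9: blame shifted to
# the system ("Sorry, we ..."), actionable advice up front, and technical
# details hidden unless the user explicitly asks for them.
def format_error(what_happened, advice, details=None, show_details=False):
    msg = f"Sorry, we {what_happened}. {advice}"
    if show_details and details:
        msg += f"\nTechnical details: {details}"
    return msg

# Example call:
# format_error("couldn't save your draft",
#              "Please check your connection and try again.",
#              details="ConnectionError: timed out after 30s")
```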
Heuristic #9
Heuristic #10
Provision of help and documentation
Help should be:
- Searchable
- Context-sensitive
- Task-oriented
- Concrete
- Short
In-class exercise
- 1. Visibility of system status
- 2. Match between system and real world
- 3. User control and freedom
- 4. Consistency and standards
- 5. Preventing errors
- 6. Recognition over recall
- 7. Flexibility and efficiency
- 8. Aesthetic and minimalist design
- 9. Recover from errors
- 10. Help and Documentation