Human-Computer Interaction
- 22. Evaluating User Interface: Expert Evaluation
Sunyoung Kim, PhD

Today's agenda
– Evaluation
– Expert evaluation
  o Cognitive Walkthrough
  o Heuristic Evaluation

Design process (Koberg & Bagnall)

Why evaluate?
§ Is it any good?
§ Does the interface (between a system and user) meet requirements and criteria?
§ Are users able to complete all important tasks?
“The effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment.” (ISO 9241-11)
What about evaluating before the final product is built? Evaluate throughout the design process: “design, test, redesign.”

Paper sketches → User testing → Wireframing → User testing → Interactive prototyping → User testing → Coding → User testing
Expert evaluation
– Experts’ opinions, inspections, walkthroughs
– How do experts think the users will perform on a system?

User evaluation
– User opinions
– How do users think they will perform on a system?
Cognitive Walkthrough: a usability evaluation method in which one or more evaluators work through a series of tasks and ask a set of questions from the perspective of the user. The focus of the cognitive walkthrough is on understanding the system’s learnability for new or infrequent users.
It rests on the observation that many users prefer to learn a given system by exploring it rather than to read a manual or follow a set of instructions.
First, you need to define the task. Then you need a complete, written list of the actions needed to complete the task. E.g., Task: Create a customized voicemail message on an iPhone. Actions:
1. Tap Voicemail
2. Tap Greeting
3. Tap Custom
4. Tap Record and speak your greeting
5. When you finish, tap Stop
6. To listen to your greeting, tap Play
7. To re-record, repeat steps 4 and 5
8. Tap Save
Sometimes defining the tasks is all you need to do to realize there is a problem with the interface.
(e.g., http://buenavista.typepad.com/buena_vista/2007/06/the_mobile_user.html)
The cognitive walkthrough is structured around three questions that you ask of every step (or action) in the task. You ask these questions before, during and after each step of the task. If you find a problem, you make a note and then move on to the next step.
1. Visibility: Is the control for the action visible to the user?
2. Affordance: Is there a strong link between the control and the action? (Will the user notice that the correct action is available?)
3. Feedback: Is the feedback appropriate? (Will the user properly interpret the system response?)
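Because the same three questions are asked of every action, the walkthrough is easy to keep as a structured checklist. A minimal bookkeeping sketch, assuming illustrative field and function names (the question wording follows the slide; nothing else here is part of the method itself):

```python
# Record the three walkthrough questions for every action in a task.
QUESTIONS = {
    "visibility": "Is the control for the action visible to the user?",
    "affordance": "Will the user notice that the correct action is available?",
    "feedback": "Will the user properly interpret the system response?",
}

def walk_through(task, actions, judge):
    """judge(action, question_key) -> (passed: bool, note: str).

    Returns the list of problems found. A problem is noted and the
    walkthrough moves on to the next step, as the method prescribes.
    """
    problems = []
    for step, action in enumerate(actions, start=1):
        for key in QUESTIONS:
            passed, note = judge(action, key)
            if not passed:
                problems.append({"task": task, "step": step,
                                 "action": action, "question": key,
                                 "note": note})
    return problems

# Example: flag a feedback problem on one step of the voicemail task.
actions = ["Tap Voicemail", "Tap Greeting", "Tap Custom"]

def judge(action, key):
    # Hypothetical judgment for illustration only.
    if action == "Tap Custom" and key == "feedback":
        return False, "No confirmation that recording mode is active"
    return True, ""

issues = walk_through("Customize voicemail greeting", actions, judge)
```

The point of the structure is that no step is skipped: every action gets all three questions, and findings are tied back to the exact step that produced them.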
Visibility: to find problems with hidden or obscured controls (e.g., is the button visible?), issues with context-sensitive menus, or controls buried too deep within a navigation system. If the control for the action is non-standard or unintuitive, this question will identify that as well.
Affordance: will the user notice that the correct action is available? This finds problems with ambiguous or jargon terms, or with other controls that look like a better choice.
Feedback: will the user properly interpret the system response? This finds problems when feedback is missing, easy to miss, too brief, poorly worded, inappropriate or ambiguous. For example, does the system prompt users to take the next step in the task?
Heuristic Evaluation: a principle or “rule of thumb” that can be used to identify usability problems in interaction design. A researcher walks through a product, compares it to the heuristics, and makes their own assessment as to whether the product follows these rules of thumb.
1. Know what you will test and how: before you begin any form of usability testing or user research, it is essential to have a clear plan for what you will test and how.
2. Understand users: you also need some background on your users. This form of testing doesn’t involve users, but your evaluators need to be able to act on behalf of the user.
3. Briefing session to tell experts what to do. Provide experts with task descriptions.
4. Evaluation, in which:
   – Each expert works separately
   – Take one pass to get a feel for the product
   – Take a second pass to focus on specific features
5. Debriefing session, in which experts work together to prioritize problems.
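The debriefing in step 5 is essentially a merge-and-rank over the experts’ independent findings. One way to sketch it, assuming a problem-identifier scheme and a 0–4 severity scale that are my own illustrative choices, not prescribed by the process above:

```python
from collections import defaultdict

def debrief(findings_per_expert):
    """findings_per_expert: one list per expert of (problem_id, severity 0-4).

    Merges duplicate reports of the same problem, then ranks by how many
    experts found it and, as a tiebreaker, by mean severity.
    """
    merged = defaultdict(lambda: {"count": 0, "severities": []})
    for findings in findings_per_expert:
        for problem_id, severity in findings:
            merged[problem_id]["count"] += 1
            merged[problem_id]["severities"].append(severity)
    ranked = sorted(
        merged.items(),
        key=lambda kv: (kv[1]["count"],
                        sum(kv[1]["severities"]) / kv[1]["count"]),
        reverse=True,
    )
    return [problem_id for problem_id, _ in ranked]

# Two experts evaluated separately; one problem was found by both.
order = debrief([
    [("hidden-save-button", 3), ("jargon-label", 2)],
    [("hidden-save-button", 4)],
])
```

Ranking by agreement first reflects why experts work separately in step 4: a problem that several independent evaluators hit is stronger evidence than one evaluator’s opinion.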
Visibility of system status: keep users informed about what is going on, with feedback focused on the current action (e.g., response-time indicators, progress bars). Users should always be aware of what is going on.
Match between system and the real world: the elements and terms used in your system should match those used in the real world as closely as possible.
User control and freedom: users don’t like to be trapped! Strategies:
– Support undo and redo
– Make actions easy to perform (and to reverse)
– Provide clearly marked “exit” signs so users can leave an unwanted state when they want to, without the interface being annoying or obstructive
Consistency and standards: be consistent and follow accepted industry standards in your site.
Help users recognize and recover from errors: give a precise description of what the error is, why it occurred, and possible solutions for recovering from the error.
Error prevention: eliminate error-prone conditions, or check for them and ask for confirmation. Aid users in specifying correct input.
Recognition rather than recall: minimize the user’s memory load by making objects, actions, and options visible.
This applies especially when it comes to finding content on your site: users should be able to locate what they need in an efficient manner.
Aesthetic and minimalist design: do not offer more than is required for the user to perform a task, and be aesthetically pleasing. Occam’s razor: remove or hide irrelevant or rarely needed information; it competes with important information on screen.
Present information in a natural order.
Help and documentation: help should be easy to search, focused on the user’s task, and list concrete steps to carry out.
Advantages
– Quick and relatively cheap; this is especially true if you are only going to use a single evaluator.
– Problems can be found early, before they damage the user experience.

Problems
– Evaluators are not real users, so findings may be open to debate.
– Skilled evaluators are scarce, so you may need to use less skilled evaluators whose findings may not be as valuable.
Heuristics for mobile computing:
1. Visibility of system status and losability/findability of the mobile device
2. Match between system and the real world
3. Consistency and mapping (standards)
4. Good ergonomics and minimalist design
5. Ease of input, screen readability and glancability
6. Flexibility, efficiency of use and personalization
7. Aesthetic, privacy and social conventions
8. Realistic error management
“No matter how much developers and designers try to cater to the feedback and opinions that they receive on their work, they are simply unable to cater to everybody.” – Crystal

“The characteristics that separate mobile and desktop computing systems are just too great to ignore. I find it interesting how some of the criteria developed for mobile heuristics evaluations mirrors that of Norman’s design principles.” – Peter

“It was interesting when the authors stated that ‘mobile heuristics detect less cosmetic problems and that, in any case, they should not be considered as alternative to user studies but synergic’ (126), or used in conjunction.” – Phoenix
This is an individual assignment! You will perform a heuristic evaluation. Each of you will serve as a usability expert and be matched with TWO prototypes created by your colleagues. You will be given a project description form to understand the project. Using the evaluation form (https://goo.gl/forms/zZXMWkQSu89n6kn83), conduct a heuristic evaluation. Write your assessment and a recommendation for the prototype. The intent is to identify: a) which usability guideline was violated, b) why you think it was violated, c) a severity rating for the violation, and d) a suggested solution.
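Items (a)–(d) map directly onto one record per finding. A minimal sketch of that write-up structure, where the field names, the 0–4 severity scale, and the example values are my own illustrations rather than anything prescribed by the assignment:

```python
from dataclasses import dataclass, asdict

@dataclass
class Finding:
    guideline_violated: str   # (a) which usability guideline was violated
    why_violated: str         # (b) why you think it was violated
    severity: int             # (c) severity rating, e.g. 0 (none) to 4 (catastrophic)
    suggested_solution: str   # (d) a suggested solution

# Hypothetical example finding.
example = Finding(
    guideline_violated="Visibility of system status",
    why_violated="No indicator appears while the upload is in progress",
    severity=3,
    suggested_solution="Show a progress bar during upload",
)
report = asdict(example)
```

Keeping one record per violation makes the severity ratings comparable across findings and makes it easy to sort the final report from most to least severe.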
* Disclaimer. Further instruction of this submission can be given verbally during class or through Piazza.
Remember that heuristic evaluation is a task-free approach, meaning you do not give your evaluators exact tasks; instead, give them a sufficient description of your system (in terms of the functions/tasks it supports) via a project description. You may need to tell them about current limitations (e.g., which parts have not been made interactive). Rubric
Treat the points above as general guidelines showing the relative importance of the assignment elements. Start by 12/1; due by midnight 12/4.
Assigned projects
Storm Watchers · Halsa · Co · Optimized F&W · Team Weather · 1DerBread · Health Technicians
Scott M. Aimee D. Zyd T. Scott M. Suraj G. Aimee D. Adrian D. Elese C. Krysti L. Adrian D. Elese C. Suraj G. Calvin S. Connor A. Kory D. Stan L. Connor A. Pauline M. Phoenix H. John K. Chanel B. Juwan G. Stan L. John K. Jairo R. Apoorva S. Muhammad A. Kyle K. Juwan G. Apoorva S. Luis T. Roman A. Brooke D. Luis T. Kyle K. Roman A.
Imagine Yours · Immunitrack · PharmAssist · Preg-Edible · RUSafe
Christian T. Christian T. Zyd A. Angela B. Angela B. Kathy L. Kathy L. Krysti L. Winnie C. Winnie C. Peter H. Pauline M. Peter H. Kristy L. Calvin S. Joel R. Lauren R. Chanel B. Kory D. Phoenix H. Lauren R. Albertano S. Muhammad A. Crystal A. Jairo R. Albertano S. Joel R. Brooke D. Joseph B. Crystal A. Joseph B.
Each group needs a project description to inform evaluators about the project.
http://www.sunyoungkim.org/class/hci_f17/other/HE_description_form.docx
Due by midnight this Thursday (11/30).