- Dr. Sunyoung Kim
School of Communication & Information, Rutgers University
Human-Computer Interaction
- 12. Evaluating User Interface (3)
Recap: Why do evaluation? If we build a product, a service, an interface, etc., how do we know:
– Whether it's any good?
– Whether the interface (between a system and user) meets requirements and criteria?
– Whether the users are able to complete all important tasks?
"The effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment." (ISO 9241-11)
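Each of the three ISO components can be operationalized as a simple metric. A minimal sketch in Python; the formulas are common conventions, and all numbers are made-up example data, not from the slides:

```python
# Illustrative operationalization of the ISO usability components.
# Formulas are common conventions; the data below is invented.

def effectiveness(completed_tasks, attempted_tasks):
    """Share of attempted tasks the users completed successfully."""
    return completed_tasks / attempted_tasks

def efficiency(completed_tasks, total_time_sec):
    """Tasks completed per minute of work (time-based efficiency)."""
    return completed_tasks / (total_time_sec / 60)

def satisfaction(questionnaire_scores):
    """Mean of post-test questionnaire ratings (e.g., 1-5 Likert items)."""
    return sum(questionnaire_scores) / len(questionnaire_scores)

print(effectiveness(8, 10))        # 0.8 -> 80% task success
print(efficiency(8, 1200))         # 0.4 tasks per minute
print(satisfaction([4, 5, 3, 4]))  # 4.0 average rating
```

In practice each metric is reported separately rather than combined, since a system can score well on one and poorly on another.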
User testing happens at every stage of design: paper sketches → wireframing → interactive prototyping → coding.
A usability evaluation method in which one or more evaluators work through a series of tasks and ask a set of questions from the perspective of the user. The focus of the cognitive walkthrough is on understanding the learnability of a given system for new or infrequent users.
The cognitive walkthrough is structured around three questions that you ask of every step (or action) in the task. You ask these questions before, during, and after each step (or action) of the task. If you find a problem, you make a note and then move on to the next step of the task.
1. Visibility: Is the control for the action visible to the user?
2. Affordance: Is there a strong link between the control and the action? (Will the user notice that the correct action is available?)
3. Feedback: Is feedback appropriate? (Will the user properly interpret the system response?)
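These per-step questions amount to a checklist that the evaluator fills in for every action. A minimal sketch in Python; the task steps, answers, and helper names are invented for illustration:

```python
# Minimal cognitive-walkthrough log: for each step of a task, the
# evaluator answers the three questions and notes any problems.
# The task and answers below are invented example data.

QUESTIONS = ("visibility", "affordance", "feedback")

def walkthrough(task_steps, answers):
    """answers[step] maps each question to True (OK) or False (problem)."""
    problems = []
    for step in task_steps:
        for q in QUESTIONS:
            if not answers[step][q]:
                problems.append((step, q))
    return problems

steps = ["open settings", "find privacy tab", "toggle location"]
answers = {
    "open settings":    {"visibility": True,  "affordance": True,  "feedback": True},
    "find privacy tab": {"visibility": False, "affordance": True,  "feedback": True},
    "toggle location":  {"visibility": True,  "affordance": True,  "feedback": False},
}
print(walkthrough(steps, answers))
# [('find privacy tab', 'visibility'), ('toggle location', 'feedback')]
```

The output is simply the list of (step, question) pairs where the evaluator found a problem, which mirrors the "note it and move on" discipline of the method.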
A principle or "rule of thumb" that can be used to identify usability problems in interaction design: a researcher walks through a product, compares it to the heuristics, and makes their own assessment as to whether the product follows these rules of thumb.
Advantages:
– It can identify usability problems early and cheaply, before they damage the user experience.
Problems:
– Findings rest on the evaluators' judgment, so they may be open to debate. This is especially true if you are only going to use a single evaluator.
– Trained usability experts can be hard to find, so you may need to use less skilled evaluators, whose findings may not be as valuable.
– Experts' opinions, inspections, walkthroughs: how do experts think the users will perform on a system?
– User opinions: how do users think they will perform on a system?
User testing measures how well a product can be used (measured on selected criteria) by the intended user population for their tasks. Typical measures:
– Time to complete task
– Number & type of errors
1. Define the objectives
2. List the tasks that will be performed
3. Decide methodologies
4. Conduct a pilot test
5. Choose your users
6. Create timetable and task description
7. Choose the location
8. Collect data
9. Analyze the data
photo, accept a friend request
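Two of the measures above (time to complete a task, number and type of errors) can be captured with a very small logger during the session. A sketch; the class name, task name, and example error are invented:

```python
import time

# Minimal per-task logger for two common usability measures:
# time-to-complete and error count. The session data below is invented.

class TaskLog:
    def __init__(self, name):
        self.name = name
        self.errors = []       # descriptions of each observed error
        self.start = None
        self.elapsed = None    # seconds, filled in by finish()

    def begin(self):
        self.start = time.monotonic()

    def error(self, description):
        self.errors.append(description)

    def finish(self):
        self.elapsed = time.monotonic() - self.start

log = TaskLog("accept a friend request")
log.begin()
log.error("clicked profile picture instead of Accept")
log.finish()
print(log.name, "-", len(log.errors), "error(s) in", round(log.elapsed, 2), "s")
```

Keeping error *descriptions* (not just a count) preserves the "type of errors" measure for later analysis.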
Informed consent covers: voluntariness, participants' rights, nondisclosure, confidentiality, waivers, legalese, and expectations. Participants should feel no pressure to participate.
– Choose users who reflect the different skills, domain knowledge, and system experience of the target population.
– Determine compensation: thank-you letter, payment of out-of-pocket expenses, samples, gifts.
Decide how the prototype will be used during the evaluation, which tools will be used (e.g., http://www.usefulusability.com/24-usability-testing-tools/), and the coding scheme for the data.
Review the data to identify usability problems. Usability defects could depend on the context of use (e.g., connectivity, the way users hold or deal with the device). A mobile device also carries more personal content (photos of friends and family, communication data between colleagues, friends, etc.) than a desktop typically would.
AutoDraw (Google AI Experiments): https://aiexperiments.withgoogle.com/autodraw or http://www.autodraw.com
Present your individual assignment. You will have only 5 minutes, so you will want to keep your presentation concise while still delivering sufficient content, including:
* Check presentation evaluation criteria (https://goo.gl/forms/bTdBBDsOt3369LhY2)
presentation.
will be from me).
* Disclaimer: further instructions for this submission may be given verbally during class or through Piazza.
By 4/19. You will perform a heuristic evaluation. Each of you will serve as a usability expert and be matched with TWO prototypes created by your colleagues. You will be given a project description form to understand each prototype. Using a Google form (https://goo.gl/forms/wZ9rHSwao05XYbQq2), conduct a heuristic evaluation, then write your assessment and a recommendation for the prototype. The intent is to identify:
a) which usability guideline was violated,
b) why you think it was violated,
c) a severity rating for the violation, and
d) a suggested solution.
Remember that heuristic evaluation is a task-free approach: you do not tell your system evaluators the exact tasks, but just give them a sufficient description of your system (in terms of the functions/tasks it supports). You may also need to tell them about its current limitations (what parts have not been made interactive).
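The four items (a–d) the assignment asks for map naturally onto a small record. A sketch in Python; the example finding is invented, and the 0–4 severity scale follows Nielsen's common convention rather than anything stated in the slides:

```python
from dataclasses import dataclass

# One record per violation, mirroring the four items the assignment
# asks for. The example finding below is invented.

@dataclass
class Finding:
    heuristic: str      # a) which usability guideline was violated
    rationale: str      # b) why you think it was violated
    severity: int       # c) 0 (not a problem) .. 4 (usability catastrophe)
    suggestion: str     # d) a suggested solution

findings = [
    Finding(
        heuristic="Visibility of system status",
        rationale="No indication that the upload is in progress",
        severity=3,
        suggestion="Show a progress bar during upload",
    ),
]

# Report the most severe problems first.
for f in sorted(findings, key=lambda f: -f.severity):
    print(f"[{f.severity}] {f.heuristic}: {f.suggestion}")
```

Sorting by severity makes it easy to lead your written assessment with the problems that most need fixing.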
By 4/19. Assessment criteria (5%):
– Evaluation process
Treat the points above as general guidelines showing the relative importance of the assignment elements.
BIG JCM Solution Let’s Go Mad House
BHAVNA BHATIA, BILLY YU, CARLIN AU, CHANDLER EDWARDSON, CHRISTIAN NEGRI, CHRISTOPHER VANOMEN, JONATHAN SEIWERT, JESSICA KWOK, JULIAN JONES, MAHDI ANAYETULLAH, NICHOLAS VIGNALI, NISH PATEL, TERESSA CLARK, ROHIT ANNAPUREDDY, YAN LING, TRINH SON
Nameless Creation Scarlet Development Team 1 TMT
BILLY YU, CHRISTOPHER VEARY, BHAVNA BHATIA, CHRISTOPHER VANOMEN, EUGENE KIM, DIANE KORONGY, MAHDI ANAYETULLAH, MARISSA DESIMONE, MICHAEL NG, NATHAN PIERRE, ROHIT ANNAPUREDDY, YAN LING, YANG GUO, NICHOLAS VIGNALI, TRINH SON
Assigned projects
Work with your team members
BLUE font color
and provide it to your system evaluators. You should give the evaluators a sufficient description of your system (in terms of the functions/tasks supported by the system). You may also want to tell them about the current limitations (e.g., what parts have not been made interactive). But you do not need to tell your system evaluators the exact steps that need to be followed.