SLIDE 1

Human-Computer Interaction

  • 22. Evaluating User Interface: Expert Evaluation

Sunyoung Kim, PhD

SLIDE 2

Today’s agenda

  • Evaluation
  • Expert evaluation
  • Cognitive Walkthrough
  • Heuristic Evaluation
SLIDE 3

Design process (Koberg & Bagnall)

SLIDE 4

Why do we evaluate?

  • If we build a product, a service, an interface, etc., how do we know:
    – Whether it’s any good?
    – Whether the interface (between a system and user) meets requirements and criteria?
    – Whether the users are able to complete all important tasks?

→ Test usability

SLIDE 5

What is usability?

“The effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment.” (ISO)

  • 5 E’s
    – Effective: Can users reach their goals? (Find what they are looking for? Do what they want to do?)
    – Efficient: How quickly can they pursue those goals? (Number of steps)
    – Engaging: Will they use it again? Recommend it to others? (Number of revisits)
    – Error tolerant: Number of errors; recovering from errors
    – Easy to learn: Amount of effort to learn

Satisfaction

SLIDE 6

Identify the relative importance of evaluation factors

SLIDE 7

Museum website

SLIDE 8

Museum exhibition

SLIDE 9

Evaluation factors

What about…

  • Self-service filling and payment system for a gas station
  • On-board ship data analysis system for geologists to search for oil
  • Fashion clothing website
  • College online course system
SLIDE 10

When to evaluate?

  • Throughout the design process
  • From the first descriptions, sketches, etc. of users’ needs through to the final product
  • Design proceeds through iterative cycles of “design – test – redesign”
  • Evaluation is a key ingredient for a successful design

[Diagram: user testing at every stage: paper sketches, wireframing, interactive prototyping, coding]

SLIDE 11

How to evaluate?

  • Asking experts
    – Experts’ opinions, inspections, walkthroughs
    – How do experts think the users will perform on a system?
  • Asking users
    – User opinions
    – How do users think they will perform on a system?

  • Cognitive Walkthrough
  • Heuristic Evaluation
SLIDE 12

Cognitive Walkthrough

SLIDE 13

Cognitive Walkthrough

A usability evaluation method in which one or more evaluators work through a series of tasks and ask a set of questions from the perspective of the user. The focus of the cognitive walkthrough is on understanding the system’s learnability for new or infrequent users.

  • To see whether or not a new user can easily carry out tasks within a given system
  • A task-specific approach to usability
  • Premise: most users prefer to learn a product by doing things with it rather than by reading a manual or following a set of instructions.

SLIDE 14

Define the tasks and actions needed

First, you need to define the task. Then you need a complete, written list of the actions needed to complete the task.

E.g., Task: Create a customized voicemail message on an iPhone

Actions:
  1. Tap Voicemail
  2. Tap Greeting
  3. Tap Custom
  4. Tap Record and speak your greeting
  5. When you finish, tap Stop
  6. To listen to your greeting, tap Play
  7. To re-record, repeat steps 4 and 5
  8. Tap Save

Sometimes defining the tasks is all you need to do to realize there is a problem with the interface.

(e.g., http://buenavista.typepad.com/buena_vista/2007/06/the_mobile_user.html)

SLIDE 15

Three Questions to be Asked

The cognitive walkthrough is structured around three questions that you ask of every step (or action) in the task. You ask these questions before, during, and after each step of the task. If you find a problem, you make a note and then move on to the next step.

  1. Visibility: Is the control for the action visible to the user?
  2. Affordance: Is there a strong link between the control and the action? (Will the user notice that the correct action is available?)
  3. Feedback: Is feedback appropriate? (Will the user properly interpret the system response?)
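The per-step question loop described above can be sketched as a small checklist runner. This is an illustrative sketch only; the names (`walkthrough`, `answers`) are hypothetical and not from the slides.

```python
# Hypothetical sketch: apply the three cognitive-walkthrough questions to
# every action in a task and collect each "no" answer as a finding.

QUESTIONS = ("visibility", "affordance", "feedback")

def walkthrough(task, actions, answers):
    """`answers` maps (action, question) -> bool, as judged by the evaluator.
    Missing entries default to True (no problem noted)."""
    findings = []
    for step, action in enumerate(actions, start=1):
        for question in QUESTIONS:
            if not answers.get((action, question), True):
                findings.append({"task": task, "step": step,
                                 "action": action, "problem": question})
    return findings

# Example: the evaluator judged that "Tap Custom" fails the affordance check.
actions = ["Tap Voicemail", "Tap Greeting", "Tap Custom"]
answers = {("Tap Custom", "affordance"): False}
issues = walkthrough("Create a customized voicemail greeting", actions, answers)
```

Noting each failure and moving on, as the slide says, keeps the walkthrough flowing; the findings list is what you carry into the report.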

SLIDE 16
  • Q1. Visibility: Is the control for the action visible to users?

To find problems with hidden or obscured controls (e.g., is the button visible?), with context-sensitive menus, or with controls buried too deep within a navigation system. It will also identify controls that are non-standard or unintuitive.

SLIDE 17
  • Q2. Affordance: Is there a strong link between the control and the action?

Will the user notice that the correct action is available? To find problems with ambiguous or jargon terms, or with other controls that look like a better choice.

SLIDE 18
  • Q3. Feedback: Is feedback appropriate?

Will the user properly interpret the system response? To find problems when feedback is missing, easy to miss, too brief, poorly worded, inappropriate, or ambiguous. For example, does the system prompt users to take the next step in the task?

SLIDE 19

Who should conduct a Cognitive Walkthrough?

Anyone can conduct a cognitive walkthrough; however, there is a risk that someone who is already familiar with your jargon, language, and system will miss things that someone who lacks that familiarity would find. If you have to use someone who is very familiar with the product, make sure they have user personas to hand to help guide them to “walk a mile in the user’s shoes”.

SLIDE 20

What do you do with the answers?

You should record the step in the process where an assessor found an issue and what that issue was. When the process is complete, round up all the assessors’ reports into a single report and then prioritize the issues for fixing.
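As a rough sketch of that consolidation step (all names here are hypothetical, not from the slides), one could tally identical findings across assessors and sort by how many assessors reported each:

```python
# Hypothetical sketch: merge several assessors' issue lists into one
# prioritized report. Issues reported by more assessors come first;
# ties fall back to step order within the task.

from collections import Counter

def consolidate(reports):
    """`reports` is a list of per-assessor issue lists;
    each issue is a (step_number, description) tuple."""
    tally = Counter(issue for report in reports for issue in report)
    return sorted(tally.items(), key=lambda kv: (-kv[1], kv[0][0]))

assessor_1 = [(3, "Custom option hard to notice"), (8, "No feedback after Save")]
assessor_2 = [(3, "Custom option hard to notice")]
prioritized = consolidate([assessor_1, assessor_2])
# The step-3 issue, reported by both assessors, comes first.
```

Counting agreement between assessors is only one possible prioritization signal; severity judgments could be weighed in the same way.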

SLIDE 21

Heuristic Evaluation

SLIDE 22

Heuristic Evaluation

A principle or “rule of thumb” that can be used to identify usability problems in interaction design: a researcher walks through a product, compares it against the heuristics, and makes their own assessment as to whether or not the product follows these rules of thumb (the “heuristics”).

  • To see whether or not a given system has any usability flaws
  • A more holistic usability inspection
  • Developed by Jakob Nielsen (1994)
  • Can be performed on a working UI or on sketches
SLIDE 23

Heuristic Evaluation: Steps

1. Know what you will test and how: Before you begin any form of usability testing or user research, it is essential to have an objective for your testing (articulate it).

2. Understand users: You also need some background on your users. This form of testing doesn’t involve users, but your evaluators need to be able to act on behalf of the user.

3. Briefing session to tell experts what to do. Provide experts with task descriptions.

4. Evaluation, in which:
   – Each expert works separately
   – Take one pass to get a feel for the product
   – Take a second pass to focus on specific features

5. Debriefing session in which experts work together to prioritize problems.

SLIDE 24

SLIDE 25
  • 1. Visibility of system status

Keep users informed about what is going on. Example: response time

  • 0.1 sec: no special indicators needed
  • 1.0 sec: users tend to lose track of the data
  • 10 sec: maximum duration if the user is to stay focused on the action
  • Short delays: hourglass
  • Long delays: use a percent-done progress bar
  • Overestimating is usually better
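The response-time guidance above can be condensed into a short decision rule. This is a sketch only; the 20% padding is my own way of implementing the slide's "overestimating is usually better" advice, and the function name is hypothetical.

```python
# Hypothetical sketch: choose a status indicator from an estimated delay,
# following the response-time thresholds on this slide.

def status_indicator(expected_delay_s):
    delay = expected_delay_s * 1.2          # overestimate a little, per the slide
    if delay <= 0.1:
        return "none"                       # perceived as instantaneous
    if delay <= 10.0:
        return "busy indicator"             # short delay: hourglass/spinner
    return "percent-done progress bar"      # long delay: show progress

first = status_indicator(0.05)   # sub-perceptible delay
second = status_indicator(2)     # short delay
third = status_indicator(30)     # long delay
```

The exact cutoffs would depend on the application; the point is that the indicator escalates with the expected wait.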
SLIDE 26
  • 1. Visibility of system status

Users should always be aware of what is going on.

  • So that they can make informed decisions
  • Provide redundant information
SLIDE 27
  • 2. Match between system and real world

The elements and terms used in your system should match those used in the real world as closely as possible.

  • Speak the users’ language
  • Follow real world conventions
  • Pay attention to metaphors
SLIDE 28
  • 3. User control and freedom

Users don’t like to be trapped!

Strategies
  • Cancel button (or Esc key) for dialogs
  • Make the cancel button responsive!
  • Offer “exits” for mistaken choices: undo, redo
  • Don’t force the user down fixed paths
  • Don’t make important irreversible actions easy to perform
  • Provide clearly marked “emergency exit” signs
  • Ask for confirmation whenever you can, without being annoying or overprotective.
SLIDE 29
  • 4. Consistency and standards

Be consistent and follow accepted industry standards in your site design. There are many accepted conventions on the Internet.
SLIDE 30
  • 5. Help users recognize, diagnose, and recover from errors

Help users recover from an error by giving a precise description of what the error is, why it occurred, and possible solutions for recovering from it.

SLIDE 31

  • 5. Help users recognize, diagnose, and recover from errors

SLIDE 32

  • 6. Error prevention

Eliminate error-prone conditions, or check for them and ask for confirmation.
SLIDE 33
  • 6. Error prevention

Help users specify correct input.
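One common way to aid correct input is to constrain and validate it up front, returning a specific, recoverable message rather than failing silently. A minimal sketch; the function and the chosen date format are hypothetical, not from the slides.

```python
# Hypothetical sketch of error prevention through input validation:
# accept only one date format and explain exactly what to fix.

import re

def parse_date(text):
    """Return ((year, month, day), None) on success or (None, message) on error."""
    match = re.fullmatch(r"(\d{4})-(\d{2})-(\d{2})", text.strip())
    if not match:
        return None, "Please use the format YYYY-MM-DD, e.g. 2017-11-30."
    year, month, day = map(int, match.groups())
    if not 1 <= month <= 12:
        return None, "Month must be between 01 and 12."
    if not 1 <= day <= 31:
        return None, "Day must be between 01 and 31."
    return (year, month, day), None
```

Each error message names the specific constraint that was violated, which is the same recognize/diagnose/recover principle from heuristic 5 applied before the error can propagate.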

SLIDE 34
  • 7. Recognition rather than recall

Minimize the user’s memory load by making objects, actions, and options visible.

SLIDE 35

  • 7. Recognition rather than recall

SLIDE 36
  • 8. Flexibility and efficiency of use

  • Flexibility: You should offer your users a number of options when it comes to finding content on your site.
  • Efficiency: Your users should be able to achieve their goals in an efficient manner.

SLIDE 37
  • 9. Aesthetic and minimalist design

Do not offer more than is required for the user to perform a task. Be aesthetically pleasing.

SLIDE 38
  • 9. Aesthetic and minimalist design

Occam’s razor: remove or hide irrelevant or rarely needed information; it competes with important information on the screen.

  • Use windows frugally
  • Avoid complex window management
SLIDE 39
  • 9. Aesthetic and minimalist design
SLIDE 40
  • 9. Aesthetic and minimalist design
SLIDE 41
  • 9. Aesthetic and minimalist design

Present information in a natural order.

SLIDE 42
  • 9. Aesthetic and minimalist design
SLIDE 43
  • 10. Help and documentation

Help should be:

  • Easy to search
  • Focused on the user’s task
  • List concrete steps to carry out
  • Not too long
SLIDE 44
  • 10. Help and documentation
SLIDE 45

SLIDE 46

Heuristic Evaluation

Advantages

  • It’s fast and cheap to conduct a heuristic evaluation. This is especially true if you are only going to use a single evaluator.
  • It provides good insight into possible usability problems that might damage the user experience.

Problems

  • A single evaluator may miss issues that are not readily apparent to them. The use of more than one evaluator is recommended.
  • Heuristic analysis is subjective. It does not “prove” anything, and thus findings may be open to debate.
  • Experienced evaluators are hard to come by, which means you may need to use less skilled evaluators whose findings may not be as valuable.

SLIDE 47

Mobile heuristics

  1. Visibility of system status and losability/findability of the mobile device
  2. Match between system and the real world
  3. Consistency and mapping (standards)
  4. Good ergonomics and minimalist design
  5. Ease of input, screen readability, and glanceability
  6. Flexibility, efficiency of use, and personalization
  7. Aesthetic, privacy, and social conventions
  8. Realistic error management

SLIDE 48
  • Good ergonomics
  • Personalization
  • Privacy and social conventions

Mobile

SLIDE 49

Reading responses

“No matter how much developers and designers try to cater to the feedback and opinions that they receive on their work, they are simply unable to cater to everybody.” - Crystal

“The characteristics that separate mobile and desktop computing systems are just too great to ignore. I find it interesting how some of the criteria developed for mobile heuristics evaluations mirrors that of Norman’s design principles.” - Peter

“It was interesting when the authors stated that ‘mobile heuristics detect less cosmetic problems and that, in any case, they should not be considered as alternative to user studies but synergic’ (126), or used in conjunction.” - Phoenix

SLIDE 50

Group project

SLIDE 51

This is an individual assignment! You will perform a heuristic evaluation. Each of you will serve as a usability expert and be matched with TWO prototypes created by your colleagues. You will be given a project description form to understand each prototype. Using a Google form (https://goo.gl/forms/zZXMWkQSu89n6kn83), conduct a heuristic evaluation. Write your assessment and a recommendation for the prototype. The intent is to identify: a) which usability guideline was violated, b) why you think it was violated, c) a severity rating for the violation, and d) a suggested solution.

* Disclaimer: Further instructions for this submission may be given verbally during class or through Piazza.

  • P5. Heuristic Evaluation
SLIDE 52

Remember that heuristic evaluation is a task-free approach, meaning you do not tell your evaluators the exact tasks; you just give them a sufficient description of your system (in terms of the functions/tasks it supports) via a project description. You may need to tell them about current limitations (e.g., which parts have not been made interactive).

Rubric
  • Accuracy and thoroughness of Heuristic Evaluation (2pt)
  • Recommendations (2pt)
  • Lessons learned from the evaluation process (1pt)

Treat the points above as general guidelines showing the relative importance of the assignment elements.

Start by 12/1. Due by midnight 12/4.

* Disclaimer: Further instructions for this submission may be given verbally during class or through Piazza.

  • P5. Heuristic Evaluation
SLIDE 53

Assigned projects

  • P5. Heuristic Evaluation

Projects: Storm Watchers, Halsa, Co, Optimized F&W, Team Weather, 1DerBread, Health Technicians, Imagine Yours, Immunitrack, PharmAssist, Preg-Edible, RUSafe

[Table: evaluator assignments pairing each student with two projects]

SLIDE 54

Each group needs a project description to inform evaluators about the project.

  • 1. Download the form: http://www.sunyoungkim.org/class/hci_f17/other/HE_description_form.docx
  • 2. Complete the form
  • 3. Convert the completed form into a PDF and upload it to your project website by midnight this Thursday (11/30)
    – It should be easy to find on your website
    – It should not require permission to access

Class activity

  • P5. Heuristic Evaluation: Project Description