Human-Computer Interaction: 12. Evaluating User Interface (3)



SLIDE 1
Human-Computer Interaction
12. Evaluating User Interface (3)

Dr. Sunyoung Kim
School of Communication & Information, Rutgers University
SLIDE 2

Recap: Why do evaluation?

  • If we build a product, service, an interface, etc., how do we know:
    – Whether it’s any good?
    – Whether the interface (between a system and user) meets requirements and criteria?
    – Whether the users are able to complete all important tasks?

  → Test usability

SLIDE 3

Recap: What is usability?

“The effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment.” (ISO)

  • 5 E’s
    – Effective: Can a user reach one’s goals? Find what they are looking for? Do what they want to do?
    – Efficient: How fast can users pursue their goals? (number of steps)
    – Engaging: Will users use it again? Recommend it to others? (number of revisits)
    – Error tolerant: number of errors; recovering from errors
    – Easy to learn: amount of effort to learn

  • Satisfaction

SLIDE 4

Recap: When to evaluate?

  • Throughout the design process
    – From the first descriptions, sketches, etc. of user needs through to the final product
  • Design proceeds through iterative cycles of “design – test – redesign”
  • Evaluation is a key ingredient for a successful design

Paper sketches → Wireframing → Interactive prototyping → Coding, with user testing at each stage

SLIDE 5

Recap: Cognitive Walkthrough

A usability evaluation method in which one or more evaluators work through a series of tasks and ask a set of questions from the perspective of the user. The focus of the cognitive walkthrough is on understanding the system’s learnability for new or infrequent users.

  • To see whether or not a new user can easily carry out tasks within a given system
  • A task-specific approach to usability
SLIDE 6

Recap: Three Questions to be Asked

The cognitive walkthrough is structured around 3 questions that you ask of every step (or action) in the task. You ask these questions before, during and after each step (or action) of the task. If you find a problem, you make a note and then move on to the next step of the task.

1. Visibility: Is the control for the action visible to the user?
2. Affordance: Is there a strong link between the control and the action? (Will the user notice that the correct action is available?)
3. Feedback: Is feedback appropriate? (Will the user properly interpret the system response?)
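The per-step bookkeeping described above can be sketched in code. This is a minimal illustration, not a tool from the lecture: the task steps and yes/no answers are hypothetical examples.

```python
# Minimal sketch of recording cognitive-walkthrough answers per step.
# The three questions come from the slide; the steps below are made up.
QUESTIONS = ("visibility", "affordance", "feedback")

def walkthrough(steps):
    """Ask the three questions for every step; collect noted problems."""
    problems = []
    for step, answers in steps:
        for q in QUESTIONS:
            if not answers.get(q, False):  # a "no" answer is a problem
                problems.append((step, q))
    return problems

# Hypothetical task: uploading a photo, broken into two steps.
steps = [
    ("open upload dialog", {"visibility": True, "affordance": True, "feedback": True}),
    ("select a photo", {"visibility": True, "affordance": False, "feedback": True}),
]
print(walkthrough(steps))  # → [('select a photo', 'affordance')]
```

Each noted problem is a (step, question) pair, matching the slide’s advice to note the problem and move on rather than stop the walkthrough.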

SLIDE 7

Recap: Heuristic Evaluation

A principle or “rule of thumb” that can be used to identify usability problems in interaction design: a researcher walks through a product, compares it to the heuristics, and makes their own assessment as to whether or not the product follows these rules of thumb (the “heuristics”).

  • To see whether or not a given system has any usability flaws
  • A more holistic usability inspection
  • Developed by Jakob Nielsen (1994)
  • Can be performed on a working UI or on sketches
SLIDE 8
  • Good ergonomics
  • Personalization
  • Privacy and social conventions
SLIDE 9

Recap: Heuristic Evaluation

Advantages

  • It is quick and cheap to conduct a heuristic evaluation, especially if you use only a single evaluator.
  • It provides good insight into possible usability problems that might damage the user experience.

Problems

  • A single evaluator may miss issues that are not readily apparent to them; using more than one evaluator is recommended.
  • Heuristic analysis is subjective. It does not “prove” anything, so findings may be open to debate.
  • Experienced evaluators are hard to come by, which means you may need to use less skilled evaluators whose findings may not be as valuable.

SLIDE 10

Today’s agenda

Evaluating User Interface

  • Evaluation with users
SLIDE 11

How to evaluate?

  • Asking experts
    – Experts’ opinions, inspections, walkthroughs
    – How do experts think the users will perform on a system?
  • Asking users
    – User opinions: how do users think they will perform on a system?

  1. Testing users’ performance: user testing
    – Evaluate with prototypes, portions of the UI, or the whole UI
    – How users perform on a system, based on performance in a test
  2. Modeling users’ task performance
    – Simulation of user performance
    – How users will perform on a system, based on simulation
SLIDE 12

Usability testing (user testing)

  • User testing checks whether a developed system is usable (measured on selected criteria) by the intended user population for their tasks
  • Goals & questions focus on how well users perform tasks with the product
  • Sometimes compares multiple products or prototypes
  • Focus is often on:
    – Time to complete a task
    – Number & type of errors
  • Data collected: video & logged user interaction
  • User satisfaction questionnaires (surveys) provide data about user opinions
SLIDE 13

Usability testing: steps

1. Define the objectives
2. List the tasks that will be performed
3. Decide methodologies
4. Conduct a pilot test
5. Choose your users
6. Create a timetable and task descriptions
7. Choose the location
8. Collect data
9. Analyze the data

SLIDE 14
  • 1. Define the objectives
  • What questions do you want to answer with the usability test?
  • What hypothesis do you want to test with the usability test?
SLIDE 15
  • 2. List the tasks that will be performed
    – Examples: register an account, sign into your account, upload a photo, accept a friend request
  • Create an evaluation script
  • Forms to use when asking for permission to record
    – Informed consent: minimal risk, information, comprehension, voluntariness, participants’ rights, nondisclosure, confidentiality, waivers, legalese, expectations
  • Ethics: we have a responsibility
    – To make participation voluntary with informed consent, avoiding pressure to participate
    – To let participants know they can stop at any time
    – To stress that we are testing the system, not them
    – To make data as anonymous as possible
SLIDE 16
  • 3. Decide methodologies
SLIDE 17
  • 4. Conduct a pilot test
    – To make sure everything is ready prior to conducting a usability test
  • The pilot test allows you to:
    – Test the equipment
    – Provide practice for the facilitator and note-takers
    – Get a good sense of whether your questions and scenarios are clear to the participant
    – Make any last-minute adjustments
SLIDE 18
  • 5. Choose your users
  • Who is a real user?
    – Users who reflect the different skills, domain knowledge, and system experience of the target population
  • Determine:
    – Whether users work alone or in pairs/teams
    – Number of participants
    – Whether to offer incentives (thank-you letter, pay for out-of-pocket expenses, samples, gifts)
  • Recruiting screeners and pre-test questionnaires
SLIDE 19
  • 6. Create a timetable and task descriptions
  • Length of a session
    – Decide the duration of the evaluation session (30–90 minutes)
  • Create an evaluation timetable
    – Sessions, evaluation, reporting
  • Prepare task descriptions
    – The tasks the participant will perform while interacting with the prototype during the evaluation

SLIDE 20
  • 7. Choose the location
    – Field studies: the user’s own environment, the place the actual system will be used
    – Controlled studies: somewhere other than the user’s environment
SLIDE 21
  • 8. Collect data
    – Timing and logging actions
    – Automatic logging of keystrokes and mouse clicks
    – Usability testing tools and logging software for usability evaluations (e.g., http://www.usefulusability.com/24-usability-testing-tools/)
    – Video and audio recording
    – Eye-tracking equipment
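The automatic logging mentioned above can be sketched as a small event recorder. This is an illustrative sketch of the idea, not any particular tool’s API; the class name, event types, and targets are hypothetical.

```python
import time

# Minimal sketch of automatic interaction logging: each user action
# (keystroke, mouse click, ...) is stored as a timestamped event.
class InteractionLog:
    def __init__(self):
        self.events = []

    def record(self, event_type, target):
        """Store one timestamped event for later timing/error analysis."""
        self.events.append({
            "t": time.time(),       # when it happened
            "type": event_type,     # e.g., "click" or "keystroke"
            "target": target,       # which UI element was touched
        })

# Hypothetical session fragment.
log = InteractionLog()
log.record("click", "upload-button")
log.record("keystroke", "search-field")
print(len(log.events))  # → 2
```

Time-to-complete-task and error counts can then be computed from the timestamps and event types in `log.events`.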
SLIDE 22

Testing environment – usability lab

SLIDE 23

Testing environment – usability lab

SLIDE 24

Testing environment – usability lab

SLIDE 25

Testing environment – usability lab

SLIDE 26

Testing environment – mobile usability testing

SLIDE 27
  • 9. Analyze the data
    – Collating data
    – Summarizing data
    – Extract key comments from the collected data
  • Quantitative data
    – Tabulations, charts
    – Descriptive statistics: mean, median
    – Inferential statistics: tests of statistical significance
  • Qualitative data
    – Grouping comments or observed problems: establish a coding scheme
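As a small illustration of the descriptive statistics mentioned above (mean and median of task-completion times), here is a sketch using Python’s standard `statistics` module; the timing numbers and prototype names are hypothetical.

```python
import statistics

# Hypothetical task-completion times (seconds) for two prototypes;
# illustrative numbers, not data from the lecture.
prototype_a = [48.2, 55.1, 41.9, 60.3, 52.7]
prototype_b = [38.5, 44.0, 35.2, 47.8, 40.1]

def summarize(label, times):
    """Descriptive statistics from the slide: mean and median."""
    print(f"{label}: mean={statistics.mean(times):.1f}s "
          f"median={statistics.median(times):.1f}s")

summarize("Prototype A", prototype_a)
summarize("Prototype B", prototype_b)
```

With more participants, an inferential test of statistical significance (e.g., a two-sample t-test) would tell you whether the difference between the two prototypes is likely to be real rather than noise.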

SLIDE 28

Review data to identify usability problems. Usability defects could:

  • Irritate or confuse the user
  • Make a system hard to install, learn, or use
  • Cause mental overload for the user
  • Cause poor user performance (e.g., slow)
  • Violate design standards or guidelines
  • Reduce trust in or the credibility of the system
  • Tend to cause repeated errors
  • Make the system hard to market
SLIDE 29

The differences between mobile and desktop

  • Mobile is mobile: the context constantly changes (location, connectivity, the way users hold or handle the device)
  • Users prefer short, simple interactions on mobile
  • A wealth of new personal data is generated: GPS, pictures, friends and family, communication data between colleagues, friends, etc.
  • Mobile creates real privacy concerns: devices contain far more personal data than a desktop typically would
  • Everyone has a different device with slightly differing capabilities
  • Mobile also encompasses tablets
  • Mobile offers different forms of input
  • Mobile users have profoundly different needs than desktop users
SLIDE 30

Design process

SLIDE 31

New sketching/prototyping tool

https://aiexperiments.withgoogle.com/autodraw http://www.autodraw.com

SLIDE 32

Individual assignment

SLIDE 33

#6 Final presentation (20%)

Present your individual assignment. You will have only 5 minutes, so you will want to keep your presentation concise while still delivering sufficient content, including:

  • Problem statement
  • Proposed solution
  • Ideation (sketches and lo-fi prototypes)
  • Wireframe
  • Evaluation (cognitive walkthrough)

* Check presentation evaluation criteria (https://goo.gl/forms/bTdBBDsOt3369LhY2)

  • After the presentation, the entire class (your peers) will evaluate the presentation.
  • Peer evaluation will be 50% of your presentation points (the other 50% will be from me).

* Disclaimer. Further instruction of this submission can be given verbally during class or through Piazza.

SLIDE 34

Group project

SLIDE 35

#5 Heuristic Evaluation (Individual assignment)

By 4/19. You will perform a heuristic evaluation. Each of you will serve as a usability expert and be matched with TWO prototypes created by your colleagues. You will be given a project description form to understand each prototype. Using a Google form (https://goo.gl/forms/wZ9rHSwao05XYbQq2), conduct a heuristic evaluation and write your assessment and a recommendation for the prototype. The intent is to identify: a) which usability guideline was violated, b) why you think it was violated, c) a severity rating for the violation, and d) a suggested solution.

Remember that heuristic evaluation is a task-free approach: you do not give your system evaluators exact tasks, just a sufficient description of your system (in terms of the functions/tasks it supports). You may also need to tell them about current limitations (what parts have not been made interactive).
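One finding from this assignment could be recorded as a structured entry covering the four items a)–d) listed above. Here is a sketch in Python; the field values are hypothetical examples, and the 0–4 scale in the comment is Nielsen’s standard severity rating.

```python
# Sketch of one heuristic-evaluation finding, mirroring the four items
# the assignment asks for. The values below are hypothetical examples.
finding = {
    "heuristic": "Visibility of system status",           # a) guideline violated
    "why": "No progress indicator while a photo uploads", # b) why you think so
    "severity": 3,   # c) Nielsen's scale: 0 (not a problem) .. 4 (catastrophe)
    "suggestion": "Show an upload progress bar",          # d) suggested solution
}

assert 0 <= finding["severity"] <= 4
print(finding["heuristic"])
```

Keeping findings in this shape makes it easy to sort by severity when deciding which re-design recommendations matter most.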

* Disclaimer. Further instruction of this submission can be given verbally during class or through Piazza.

SLIDE 36

#5 Heuristic Evaluation (Individual assignment)

By 4/19. Assessment criteria (5%):

  • 30% – Accuracy of the heuristic evaluation
  • 30% – Thoroughness of the heuristic evaluation
  • 30% – Re-design recommendations and lessons learned from the evaluation process
  • 10% – Overall organization and clarity, etc.

Treat the points above as general guidelines showing the relative importance of the assignment elements.

* Disclaimer. Further instruction of this submission can be given verbally during class or through Piazza.

SLIDE 37

#5 Heuristic Evaluation

BIG JCM Solution Let’s Go Mad House

BHAVNA BHATIA, BILLY YU, CARLIN AU, CHANDLER EDWARDSON, CHRISTIAN NEGRI, CHRISTOPHER VANOMEN, JONATHAN SEIWERT, JESSICA KWOK, JULIAN JONES, MAHDI ANAYETULLAH, NICHOLAS VIGNALI, NISH PATEL, TERESSA CLARK, ROHIT ANNAPUREDDY, YAN LING, TRINH SON

Nameless Creation Scarlet Development Team 1 TMT

BILLY YU, CHRISTOPHER VEARY, BHAVNA BHATIA, CHRISTOPHER VANOMEN, EUGENE KIM, DIANE KORONGY, MAHDI ANAYETULLAH, MARISSA DESIMONE, MICHAEL NG, NATHAN PIERRE, ROHIT ANNAPUREDDY, YAN LING, YANG GUO, NICHOLAS VIGNALI, TRINH SON

Assigned projects
SLIDE 38

Class exercise

Work with your team members

  • Fill out the project description form (HE_description_form.doc) using BLUE font color
  • Your group needs to prepare a short description of your prototypes and provide it to your system evaluators. Give the evaluators a sufficient description of your system (in terms of the functions/tasks it supports). You may also want to tell them about current limitations (e.g., what parts have not been made interactive). But you do not need to tell your system evaluators the exact steps they need to follow.
  • Convert the file to PDF and upload it to your website (by tonight)
SLIDE 39

Next class

  • Quiz #4
  • Individual project final presentation