SLIDE 1

“How do I know if it’s useful if I can’t even get it to open?” Assessing Information Interaction at the University of North Texas to Improve Library Collections and Services

Erin DeWitt Miller, Susan Smith, Xin Wang, Allyson Rodriguez, Emily Billings
ER&L Conference, Austin, Texas, April 4, 2017

SLIDE 2

Usability Studies – Steps to Success

1. Identify a Need to Know
2. Literature Review
3. Who do you want to know?
4. What do you want to know?
5. Practical Considerations
6. Steps to Prepare
7. Make it Happen
8. Analyze, Evaluate and Respond

SLIDE 3

Usability Studies – Steps to Success

Plan → Execute → Evaluate

SLIDE 4

UNT Studies Outline

• Six tasks
  • Plus two practice tasks
  • Three on each platform
• Think-aloud protocol
• Post-task surveys
• Exit survey
  • Qualitative

SLIDE 5

Online Video Usability

UNT Libraries, 2016-17

SLIDE 6

1. Identify a Need to Know: Online Video

• Streaming video is expensive and complicated to manage
• Online video for libraries is changing fast
• Online video interaction is under-studied
• Homegrown “Video on Demand” replacement

SLIDE 7

2. Literature Review: Online Video

• Comprehensive
• 68% of students use video for classes, and 79% use video outside of class for supplementary learning
• Multiple studies have found that video supports student learning and is a valuable educational tool, with benefits including increased motivation, improved retention of information, and enhanced comprehension
• Even though the use of online video has become common in postsecondary education, students' perception and use of online video have not yet been widely researched, and a need for better understanding exists

SLIDE 8

3. Who do you want to know? Online Video

• Librarians and students
• “Experts” and “novices”

SLIDE 9

4. What do you want to know? Online Video

• Which specific features of an interactive multimedia platform are used most frequently?
• When attempting to complete tasks in an interactive multimedia platform, how do the interactions of experts differ from those of novices?
• Platform specifics

SLIDE 10

5. Practical Considerations: Online Video

• Equipment needs
  • Morae Recorder installed on multiple laptops
  • License

SLIDE 11

5. Practical Considerations: Online Video

• Number of participants: 24
• Additional moderators
• Internal grant funding for the incentives: $10 Amazon gift cards

SLIDE 12

6. Prepare: Online Video

• Metrics
  • Time on task
• Subjective measures
  • Rating after each task
  • Follow-up survey for open-ended questions: likes, dislikes, and recommendations
• Coding for interaction

SLIDE 13

6. Prepare: Online Video

Tasks
• Three on each platform
  • Specific item search
  • General search for relevant items
• Interactions
  • Searching
  • Browsing
  • Viewing
  • Emailing
  • Sharing
  • Playlist
  • Clipping
  • Embedding

Scenarios: “You have been assigned…” vs. “You are creating an assignment…”
SLIDE 14

7. Make it Happen: Online Video

• Reserved space in both the Media Library and the main library
• Recruitment
• Recorded in November and December of 2016
• 12 librarians and 14 students
• 26 hours of recording; 13 hours of prep time

SLIDE 15

8. Analyze, Evaluate and Respond: Online Video

• Roughly split between the two platforms on functionality
• Frustration with any functionality beyond searching
• General appearance of Kanopy is preferred
• Help pages hard to find
• Searching on Alexander Street preferred by the majority of librarians…but not students
• Faceted searching is confusing for students
• Creating accounts
  • Students positive about being able to create an account through Facebook

SLIDE 16

8. Analyze, Evaluate and Respond: Online Video

“Give popup bubbles that provide more specific instructions on things like making clips and embedding videos.” (student)

“The only good thing is that they are streaming video. You can search but beyond that gets frustrating.” (librarian)

SLIDE 17

8. Analyze, Evaluate and Respond: Online Video

Decision making:
• PDA platforms on both of these platforms moving forward
• Cemented decision to develop a new homegrown VOD system to replace our current one
• Facilitate designing the system
• Specific user issues to guide instruction and responses to questions from patrons

SLIDE 18

Usability of Ebook Platforms

UNT Libraries, 2016-17

SLIDE 19

1. Identify a Need to Know: Ebook Platforms

• PDA packages
• Traditionally our largest platforms have been EBSCO, ebrary, and EBL
• New ProQuest platform
• We don't know enough about ebooks: why students prefer them or don't, or even IF students prefer them at all

SLIDE 20

2. Literature Review: Ebook Platforms

• Usability testing of ebook platforms is needed to provide more insight into platform issues and how people interact with them
• Preferences for ebooks vs. print books are currently debated
• Learnability of ebooks vs. print books is in question
  • Interactions with print books (highlighting, note taking) may be preferred

SLIDE 21

3. Who do you want to know? Ebook Platforms

• Undergraduates vs. graduate students
  • Undergrads may have more issues
  • May have different preferences

SLIDE 22

4. What do you want to know? Ebook Platforms

• What differences in the usability of ebooks can be observed between graduate and undergraduate students?
• What are the specific usability issues that university students encounter when using ebooks?
• What is the correlation between technology acceptance model constructs (perceived ease of use, perceived usefulness, attitude toward use, and behavioral intention) and usability metrics (efficiency, effectiveness, and satisfaction)?
• Specific “sticking points”
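The slides do not show how the correlation question was answered, but a common way to relate a TAM construct score to a usability metric is a Pearson correlation across participants. The sketch below is illustrative only: the function, the 1–7 Likert scale, and every number are made up, not data from the UNT study.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant values (NOT from the study):
# TAM construct: perceived ease of use, rated on a 1-7 Likert scale
ease_of_use = [6, 5, 7, 3, 4, 6, 2, 5]
# Usability metric: time on task in seconds (efficiency)
time_on_task = [95, 110, 80, 200, 160, 100, 240, 130]

r = pearson(ease_of_use, time_on_task)
print(f"r = {r:.2f}")  # negative here: platforms that feel easier go with faster tasks
```

A negative coefficient in this setup would suggest that participants who rate a platform easier to use also complete tasks faster, which is the kind of TAM-to-metric link the research question asks about.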

SLIDE 23

5. Practical Considerations: Ebook Platforms

• Equipment: laptops
• Two moderators
• Three days in January
• Incentive: $20 Amazon cards

SLIDE 24

6. Prepare: Ebook Platforms

Usability metrics
• Time on task
• Task success
• Subjective measures
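To make the three metric types concrete, here is a minimal sketch of how per-session recordings might be rolled up per platform. The record layout, platform labels, and all values are hypothetical, for illustration only.

```python
from statistics import mean

# One record per participant/task: seconds on task, completion, post-task rating (1-5).
# All values are hypothetical, not from the UNT study.
sessions = [
    {"platform": "A", "seconds": 95,  "success": True,  "rating": 4},
    {"platform": "A", "seconds": 210, "success": False, "rating": 2},
    {"platform": "B", "seconds": 120, "success": True,  "rating": 5},
    {"platform": "B", "seconds": 150, "success": True,  "rating": 3},
]

def summarize(records):
    """Roll one platform's records up into the three metric types."""
    return {
        "mean_time_s": mean(r["seconds"] for r in records),   # time on task
        "success_rate": mean(r["success"] for r in records),  # task success
        "mean_rating": mean(r["rating"] for r in records),    # subjective measure
    }

for name in ("A", "B"):
    group = [r for r in sessions if r["platform"] == name]
    print(name, summarize(group))
```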

SLIDE 25

6. Prepare: Ebook Platforms

• Tasks
  • Three specific-information searches
  • Specific interactions: download, highlight, bookmark, share citations, print

SLIDE 26

7. Make it Happen: Ebook Platforms

• All participants scheduled through Outlook
• Responded to flyers
• Much easier process
• 6 undergrads and 6 grad students completed
• ≈12 hours of recording

SLIDE 27

8. Analyze, Evaluate and Respond: Ebook Platforms

Graduate & undergraduate
• Graduate students
  • More time on each task
  • Strong preference for EBSCO vs. ProQuest Ebook Central
• Undergraduate students
  • Strong preference for ProQuest Ebook Central
  • More time searching
  • Reading vs. search box

All but one student stated they were MORE likely to use ebooks

SLIDE 28

8. Analyze, Evaluate and Respond: Ebook Platforms

Sticking points
• Downloading (both platforms)

“If I didn’t have to do this for the assignment I would stop there and go find a PDF online.” (UG)

SLIDE 29

8. Analyze, Evaluate and Respond: Ebook Platforms

Sticking points
• Creating accounts
• Searching within a book (UGs)
• Printing
• Readability

SLIDE 30

8. Analyze, Evaluate and Respond: Ebook Platforms

• Specific user issues to guide instruction and address questions from patrons
  • General book navigation
  • Downloading/printing/checkout times
  • Effective search queries
  • Navigation icons
  • What is possible
• Which platform to choose when an ebook is available at the same price on both

SLIDE 31

What Usability Can Tell Us

• Determine website format
• Selecting electronic resources for purchase/subscription
• Prioritizing platforms
• Understand or predict usage rates
• Get to know our patrons
• Better understand information-seeking behavior
• Address gaps in library instruction

SLIDE 32

Selected Bibliography

Albertson, D. (2009). Analyzing user interaction with the ViewFinder video retrieval system. Journal of the American Society for Information Science and Technology, 61(2), 238-252.

Albertson, D., & Ju, B. (2015). Design criteria for video digital libraries: Categories of important features emerging from users' responses. Online Information Review, 39(2), 214-228.

Barbier, J., Cevenini, P., & Crawford, A. (2012). Video solves key challenges in higher education [White paper]. San Jose, CA: Cisco.

Emanuel, J. (2013). Usability testing in libraries: Methods, limitations, and implications. OCLC Systems and Services, 29(4), 204-217.

Faulkner, L. (2003). Beyond the five-user assumption: Benefits of increased sample sizes in usability testing. Behavior Research Methods, Instruments, & Computers, 35(3), 379-383.

Foraker Labs. (2015). Usability first. Retrieved March 1, 2016, from www.usabilityfirst.com/glossary/ecological-validity/

Leonard, E. (2015). Great expectations: Students and video in higher education. Sage.

Lin, C.-C. (2013). Exploring the relationship between technology acceptance model and usability test. Information Technology and Management, 14(3), 243-255.

Matusiak, K. K. (2013). Image and multimedia resources in an academic environment: A qualitative study of students' experiences and literacy practices. Journal of the American Society for Information Science and Technology, 64(8), 1577-1589.

Nielsen, J. (2012, June 4). How many test users in a usability study? Nielsen Norman Group. Retrieved September 26, 2016, from https://www.nngroup.com/articles/how-many-test-users/

Nielsen, J., Clemmensen, T., & Yssing, C. (2002). Getting access to what goes on in people's heads?: Reflections on the think-aloud technique. In NordiCHI '02: Proceedings of the Second Nordic Conference on Human-Computer Interaction.

Sauro, J., & Lewis, J. R. (2012). Quantifying the user experience: Practical statistics for user research. Amsterdam; Boston: Elsevier/Morgan Kaufmann.

Tullis, T., & Albert, B. (2013). Measuring the user experience: Collecting, analyzing, and presenting usability metrics (2nd ed.). Waltham, MA: Elsevier.