Institutional Learning Outcomes Information Literacy Pilot



SLIDE 1

Institutional Learning Outcomes Information Literacy Pilot Assessment Project Presented to ILO Subcommittee February 5th, 2018

2014-15 | 2017-18
Fanny Yeung, Institutional Research, Analysis, and Decision Support; Julie Stein, Educational Effectiveness, APS

SLIDE 2

ILO Inf. Literacy Rubric Developed, 2014-15: FLC Did Not Include All Colleges (COS, CBE)

SLIDE 3

ILO Inf. Literacy Rubric Further Developed, Winter 2017: Faculty Included From All Colleges

Information Literacy Rubric Development Faculty

First Name | Last Name | College | Department
Stephanie | Alexander | Library | Project Lead, University Libraries
Stephanie | Seitz | CBE | Management
Deepika | Mathur | CLASS COS | Human Development & Health Sciences
Craig | Derksen | CLASS | Philosophy
Tom | Bickley | Library | University Libraries
Matt | Atencio | CEAS | Kinesiology
Doc | Matsuda | CLASS | Anthropology

SLIDE 4

ILO Inf. Literacy Rubric Winter/Spring 2017

SLIDE 5

ILO Inf. Literacy Rubric Changes, 2014-15 to 2017


SLIDE 6

ILO Inf. Literacy Rubric Piloted, Spring 2017

Information Literacy Rubric Application Faculty

First Name | Last Name | College | Department
Jean | Moran | COS | Earth & Environmental Sciences
Doc | Matsuda | CLASS | Anthropology
Deepika | Mathur | CLASS COS | Human Development & Health Sciences
Matt | Atencio | CEAS | Kinesiology
Jeff | Newcomb | CBE | Marketing & Entr.
Ben | Klein | CLASS | History
Becky | Beal | CEAS | Kinesiology
Rahima | Gates | COS | Health Sciences

SLIDE 7

ILO Inf. Literacy Pilot Spring 2017 Assessment Results

SLIDE 8

Spring 2017 ILO Inf. Lit. Assessment Results

SLIDE 9

Individual Scores vs. Average Scores

Domain | BB Individual Scores (n=149 reviews) | Individual Scores (n=120 reviews) | Average Scores (n=60 artifacts)
Scope | 3.30 | 3.23 | 3.23
Gather | 2.87 | 2.77 | 2.77
Evaluate | 2.85 | 2.71 | 2.71
Synthesize | 3.07 | 2.98 | 2.98
Communicate | 3.19 | 3.09 | 3.09
Attribute | 3.06 | 3.03 | 3.03

*Note: BB scores are automated, prior to data verification and cleaning. Additionally, BB scores reflect all artifacts and assessments (i.e., including graduate courses) from the collection.

SLIDE 10

CSUEB Information Literacy Assessment, Spring 2017: counts of ratings by domain (each row sums to n=120 reviews)

Domain | Rating=1 | Rating=2 | Rating=3 | Rating=4
Scope | 1 | 20 | 49 | 50
Gather | 13 | 36 | 37 | 34
Evaluate | 17 | 33 | 38 | 32
Synthesize | 6 | 28 | 49 | 37
Communicate | 3 | 22 | 56 | 39
Attribute | 12 | 17 | 47 | 44
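As a sanity check, the domain means can be recomputed from the rating distribution above. A minimal Python sketch (the counts are read off the chart data; this is an illustration, not the project's actual analysis pipeline):

```python
# Rating counts (for ratings 1-4) per rubric domain, taken from the
# Spring 2017 chart; each domain's counts sum to 120 individual reviews.
counts = {
    "Scope":       [1, 20, 49, 50],
    "Gather":      [13, 36, 37, 34],
    "Evaluate":    [17, 33, 38, 32],
    "Synthesize":  [6, 28, 49, 37],
    "Communicate": [3, 22, 56, 39],
    "Attribute":   [12, 17, 47, 44],
}

for domain, dist in counts.items():
    n = sum(dist)  # total reviews for this domain (120)
    # Weighted mean: each rating value times how often it was assigned.
    mean = sum(rating * c for rating, c in zip((1, 2, 3, 4), dist)) / n
    print(f"{domain:<12} n={n}  mean={mean:.3f}")
```

Rounded to two decimals, these recover the individual-score means reported on the previous slide (Scope 3.23, Gather 2.77, Evaluate 2.71, Synthesize 2.98, Communicate 3.09, Attribute 3.03).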

SLIDE 11

Scope: Identification of Question/Concept/Problem (n=60)

Course means: Course 1 (n=6): 2.83; Course 2 (n=9): 2.89; Course 4 (n=9): 2.89; Course 5 (n=9): 3.22; Course 6 (n=9): 3.72; Course 7 (n=9): 3.50; Course 8 (n=9): 3.44
Institutional mean: 3.23 | Competent rubric score: 3

SLIDE 12

Gather: Use of Search Strategies (n=60)

Course means: Course 1 (n=6): 2.00; Course 2 (n=9): 2.33; Course 4 (n=9): 2.39; Course 5 (n=9): 3.00; Course 6 (n=9): 3.67; Course 7 (n=9): 2.61; Course 8 (n=9): 3.11
Institutional mean: 2.77 | Competent rubric score: 3

SLIDE 13

Evaluate: Evaluation of Sources (n=60)

Course means: Course 1 (n=6): 1.83; Course 2 (n=9): 2.22; Course 4 (n=9): 2.33; Course 5 (n=9): 2.83; Course 6 (n=9): 3.56; Course 7 (n=9): 2.89; Course 8 (n=9): 3.00
Institutional mean: 2.71 | Competent rubric score: 3

SLIDE 14

Synthesize: Connections among Sources (n=60)

Course means: Course 1 (n=6): 2.58; Course 2 (n=9): 2.50; Course 4 (n=9): 3.00; Course 5 (n=9): 2.67; Course 6 (n=9): 3.50; Course 7 (n=9): 3.33; Course 8 (n=9): 3.11
Institutional mean: 2.98 | Competent rubric score: 3

SLIDE 15

Communicate: Knowledge and Use of Disciplinary Approaches (n=60)

Course means: Course 1 (n=6): 2.75; Course 2 (n=9): 2.67; Course 4 (n=9): 3.17; Course 5 (n=9): 2.94; Course 6 (n=9): 3.67; Course 7 (n=9): 3.17; Course 8 (n=9): 3.17
Institutional mean: 3.09 | Competent rubric score: 3

SLIDE 16

Attribute: Effective, Ethical, and Legal Use of Attribution (n=60)

Course means: Course 1 (n=6): 1.92; Course 2 (n=9): 2.28; Course 4 (n=9): 3.17; Course 5 (n=9): 3.17; Course 6 (n=9): 3.83; Course 7 (n=9): 3.11; Course 8 (n=9): 3.33
Institutional mean: 3.03 | Competent rubric score: 3

SLIDE 17

Information Literacy ILO Spring 2017: Rater Consistency across Domains (each row sums to the 60 rater pairs per domain)

Domain | 0-pt difference | 1-pt difference | 2-pt difference | 3-pt difference
Scope | 33 | 22 | 5 | 0
Gather | 28 | 20 | 12 | 0
Evaluate | 25 | 29 | 6 | 0
Synthesize | 24 | 26 | 9 | 1
Communicate | 28 | 27 | 5 | 0
Attribute | 31 | 22 | 6 | 1
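The rater-consistency counts above can be summarized as agreement rates. A small Python sketch, assuming (as the chart implies) 60 two-rater pairs per domain; "within-1pt" here means the two raters' scores differed by at most one rubric point:

```python
# Score-difference counts per domain from the rater-consistency chart:
# [0-pt, 1-pt, 2-pt, 3-pt] differences between the two raters' scores.
# Zero entries are inferred from the 60-pair total per domain.
diffs = {
    "Scope":       [33, 22, 5, 0],
    "Gather":      [28, 20, 12, 0],
    "Evaluate":    [25, 29, 6, 0],
    "Synthesize":  [24, 26, 9, 1],
    "Communicate": [28, 27, 5, 0],
    "Attribute":   [31, 22, 6, 1],
}

for domain, (d0, d1, d2, d3) in diffs.items():
    pairs = d0 + d1 + d2 + d3       # 60 artifact pairs per domain
    exact = d0 / pairs              # identical scores from both raters
    adjacent = (d0 + d1) / pairs    # scores within one rubric point
    print(f"{domain:<12} exact={exact:.1%}  within-1pt={adjacent:.1%}")
```

For example, Scope shows 33/60 = 55% exact agreement and 55/60 = 91.7% agreement within one point, while Gather (the domain with the most 2-point disagreements) drops to 80% within one point.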

SLIDE 18

Faculty Feedback on Changes to Rubric

  • Faculty #1: “No suggestions to clarify or change the rubric other than to infuse it as you see fit to speak to your assignment's main goals.”

  • Faculty #2: “The Rubric was a good fit for evaluating the project assignment. A vital part of the project assignment required critical thinking for analyses, conclusions and takeaways based on information search. I used the Rubric’s criteria for Synthesize and Communicate to evaluate students’ critical thinking. My one recommendation for further refinement of the Rubric would be to transform the Criteria and descriptions using student-friendly language, to help with students’ deeper understanding of requirements and possibilities.”

  • Faculty #3: “The rubric works for general assessment of that skill, but for giving feedback I find it best to be more specific to their work.”

SLIDE 19

Institutional Learning Outcomes Information Literacy Pilot Assessment Project Discussion & Questions

  • What changes should be made to the Information Literacy rubric and/or assessment process to improve ILO assessment?

  • Were there any challenges in assessing the “Evaluate,” “Synthesize,” and/or “Communicate” rubric domains?

  • The graduate course was excluded from analysis. How should graduate ILO assessments be structured in the future?