Performance Metrics Project, Megan Lee, October 2015 (PowerPoint PPT presentation)




SLIDE 1

monash.edu

Monash Library Performance Metrics Project

monash.edu/library

Megan Lee, October 2015

SLIDE 2

“Which metrics have value?”

SLIDE 3

“What is the Library trying to achieve?”

SLIDE 4

Use metrics to measure how well we’re achieving our goals

SLIDE 5

Use metrics to measure how well we’re achieving our goals

SLIDE 6

We want to measure the OUTCOMES, VALUE or IMPACT of the work

SLIDE 7

The value of an academic library is complex, because the total value is composed of many separate values for each type of collection or service and because the value differs for different constituents and over time. (Tenopir 2013.)

SLIDE 8

Libraries need to be [equally] deliberate and systematic in communicating the value proposition. …they need to be bold – sometimes even audacious – to convey an authentic belief that the information is important and that others want and need to know it. (Lewis 2015.)

SLIDE 9

(Embedded video, 4:19)

SLIDE 10

Milestone / Deliverable (dates to be updated)

- Consider the environment and identify all strategic documents as a basis for developing the library performance measurement matrix. (April 2015)
- Establish Library performance metric data collection principles. (April)
- Finalise the Library performance metric project plan. (April)
- Review Library strategic documents as a basis for developing performance measurement models. (April)
- Evaluate and accept an appropriate performance measurement model. (April)
- Develop a template of Library performance metrics for strategic goals and add draft metrics. (April)
- Solicit feedback on the Library performance metrics template and draft performance metrics from the Library Strategy Group. (April)
- Identify the capacity of Library systems to automate the generation of Library performance metrics. (May / June)
- Identify goals from section 4, The Library's role, in the Library Annual Plan that are not included in the 2014 strategic initiatives. Write Actions / Hypothesis / Target audience and Performance Metrics for each, to be reported against in the Library annual report.
- Review the capacity of University applications to provide data for comparison against Library performance metrics. (Deferred)
- Consider a cost/benefit analysis comparing University system metrics against Library metrics, eg map Aspire reading list unit penetration against the Moodle unit breakdown to demonstrate Library impact on student learning. (Deferred)
- Review all proposed metrics and minimise manual data collection. (June / July)
- Develop and document a process for extracting all required system reports. (July / August)
- Develop recommendations for Library metrics collection and assign ownership of roles and responsibilities at the individual level. (July / August)
- Develop a secure data location for manual and system-generated metrics data. (July / August)
- Set up processes to map collected Library performance metrics against Library strategies and objectives, for inclusion in the Library annual report. (July / August)
- Develop a calendar for release of mini reports that map collected Library performance metrics against Library strategies and objectives based on annual University events, eg a report on the number of exam prep sessions attended in May / June, released in July, that Research & Learning staff can use in discussion with academics prior to second-semester exams. (July / August)
- Develop a process to create presentation tools / Library metrics packages (eg infographics, videos, case studies, stories) for staff to use in communicating the Library's contribution to the University. (July / August)
- Develop an eLearning (Captivate) walk-through to help stakeholders understand why, what, how, who and when Library performance metrics are collected and used. (August)
- Build staff engagement in the collection of performance metrics, as staff experience the value of access to packaged data that demonstrate the impact of the Library's contributions to the University. (August)
- Develop a report of project outcomes and recommendations for submission to LMC, IRSC & ILFC. (August)
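The milestones to minimise manual data collection and to document a process for extracting system reports imply scripted aggregation of exported data. A minimal sketch of that idea, assuming a hypothetical CSV export with `metric` and `count` columns (the slides do not specify any actual Monash system format):

```python
import csv
from collections import defaultdict

def aggregate_metric_counts(csv_path, metric_field="metric", count_field="count"):
    """Sum per-metric counts from a system-generated CSV export.

    The column names ('metric', 'count') are illustrative assumptions,
    not an actual Library system schema.
    """
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row[metric_field]] += int(row[count_field])
    return dict(totals)
```

A script along these lines could be scheduled per reporting period, so the mini reports draw on system-generated rather than hand-collected figures.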
SLIDE 11

Recognise achievements & prioritise remaining work

SLIDE 12

Recognise achievements to date & Prioritise remaining work

(Hampton, 2010)

SLIDE 13

- Initial understanding of the LPM project
- What's been done? Significant project actions to date
- Critical evaluation: what's left to be done?
- Current understanding of the LPM project

Reflective report development tool

SLIDE 14

Reflective report development tool

SLIDE 15

- Initial understanding of the LPM project
- What's been done? Significant project actions to date
- Critical evaluation: what's left to be done?
- Current understanding of the LPM project

Reflective report development tool

SLIDE 16
SLIDE 17

Why do we decide we want to work on a project?

Developing Performance Metrics

Strategic Initiative: Partnering with faculty to ensure explicit development of information research and learning skills in the curriculum, and to develop in- and extra-curricular resources and programs.

Action: Design & implement Research & Learning Skills sessions, based on the Research Skills Development framework, that broaden students' independent research and reporting capacities.

Hypothesis of success: By explicitly articulating a development path for achieving the skills and capacities of a mature researcher, within the context of the student's units of study, the Library will escalate student research skills development.

Performance Metrics:
- Improved student performance in online learning assessment modules (Captivate, Moodle analytics)
- Analysis of post-session student feedback, solicited via SurveyMonkey (SurveyMonkey analytics)
- Monitor the grade average of students who use the library, via student self-reported grade average, in the Library user survey

Target Audience: Looking Out: students, academic staff, professional staff. Looking In: Library staff, Research & Learning staff.
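The grade-average metric compares self-reported grades of library users against non-users. A minimal sketch of how such survey responses might be summarised; the field names `uses_library` and `grade_average` are hypothetical, not the actual Library user survey schema:

```python
def mean_grade_by_library_use(responses):
    """Average self-reported grade for library users vs non-users.

    `responses` is a list of dicts with hypothetical keys
    'uses_library' (bool) and 'grade_average' (float).
    Returns {True: mean for users, False: mean for non-users};
    a group with no responses maps to None.
    """
    groups = {True: [], False: []}
    for r in responses:
        groups[r["uses_library"]].append(r["grade_average"])
    return {use: (sum(g) / len(g) if g else None) for use, g in groups.items()}
```

Note that self-reported grades from an opt-in survey only suggest an association, not a causal Library impact, so any gap between the two means should be reported with that caveat.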

SLIDE 18

SLIDE 19