Evaluation of software tools supporting outcomes-based continuous program improvement processes: Part 3 - PowerPoint PPT Presentation



SLIDE 1

Evaluation of software tools supporting outcomes-based continuous program improvement processes: Part 3

Jake Kaupp & Brian Frank

SLIDE 2

Episode III

I’ve been at this a while

SLIDE 3

“Academic analytics combines select institutional data, statistical analysis, and predictive modelling to create intelligence upon which students, instructors, or administrators can change academic behaviour”

The promise

SLIDE 4

The pitfall

SLIDE 5

“Why so much collection—but so little utilization—of data? Most institutions have routinized data collection, but they have little experience in reviewing and making sense of data.”

  • C. F. Blaich & Wise, 2011

The problem

SLIDE 6

SLIDE 7

Inform the community and raise awareness of available tools, their strengths and limitations.

Encourage the use of effective practice in conjunction with educational technology.

SLIDE 8

SLIDE 9

SLIDE 10

Assessment Platform (5!)

  • Designed according to effective practice
  • Accommodates program & course assessment
  • Manages process planning & curriculum improvement
  • Portfolio encourages authentic assessment
  • Templated accreditation reports

SLIDE 11

Analytics System (3!)

  • Focus on student success & degree pathways
  • Cloud-based platform
  • Engagement & persistence are the main predicted elements
  • Variable granularity for views & reporting
  • Draws factors from diverse sources (LMS, SIS, etc.)

SLIDE 12

Hybrid System (4!)

  • Closest to an analytics system
  • Focuses on UX & flexibility
  • Newcomer to the education field
  • Emphasis on controlled process with user- and role-defined workflows
  • Excel-powered graphics
  • Very customizable

SLIDE 13

Curriculum Mapping System (3!)

  • Program & competency focused
  • Multiple stakeholder views
  • Encourages constructive alignment
  • Flexible metadata mapping capabilities planned
  • Reporting in development

SLIDE 14

R & RStudio (3!)

  • R: statistical computing language; RStudio: IDE for R
  • One of the fastest-growing languages for data analysis, analytics & visualization
  • Extensible, modular, framework-based
  • Excellent for customized, need-based and ad-hoc reporting
  • Tidy → Transform → Visualize → Model: pipeline for data analysis [1]

[1] Wickham, H. (2015). Big data pipelines. Presented at the Workshop on Visualization for Big Data: Strategies and Principles, Toronto.
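The Tidy → Transform → Visualize → Model pipeline the slide attributes to Wickham can be sketched in a few lines. The deck works in R; the same idea in Python, with a hypothetical assessment dataset and only the standard library, might look like:

```python
# A minimal sketch of the tidy -> transform step of the pipeline, in Python
# rather than R. The records below are hypothetical, not the deck's data.
from collections import Counter

# "Tidy" data: one row per observation (student x indicator), explicit columns.
records = [
    {"indicator": "APSC-2-IN-2", "level": "Meets Expectations"},
    {"indicator": "APSC-2-IN-2", "level": "Mastery"},
    {"indicator": "APSC-2-CO-3", "level": "Marginal"},
    {"indicator": "APSC-2-CO-3", "level": "Meets Expectations"},
]

def level_counts(rows):
    # Transform: count students at each performance level per indicator,
    # the summary that feeds the histograms and reports on later slides.
    return Counter((r["indicator"], r["level"]) for r in rows)

counts = level_counts(records)
print(counts[("APSC-2-CO-3", "Marginal")])  # 1
```

From a summary like `counts`, the visualize and model stages are straightforward: plot the per-indicator distributions, or feed them into whatever model the program review calls for.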
SLIDE 15

Graduate Attribute Course Report: MECH 216

Jake Kaupp, May 22nd, 2015

Course Mapping Tables

Course   | Indicator   | Short Description  | Assessment  | Assessor | Date Assessed | Instructor Comments | Students Assessed
MECH 216 | APSC-2-IN-2 | Data Acquisition   | lab reports | TA       | NA            | NA                  | 163
MECH 216 | APSC-2-IN-5 | Uncertainty        | lab reports | TA       | NA            | NA                  | 163
MECH 216 | APSC-2-IN-6 | Draw Conclusions   | lab reports | TA       | NA            | NA                  | 163
MECH 216 | APSC-2-CO-3 | Write Clearly      | lab reports | TA       | NA            | NA                  | 163
MECH 216 | APSC-2-CO-6 | Technical Graphics | lab reports | TA       | NA            | NA                  | 163

Performance Histograms

[Figure: performance histograms for 2014−2015, one panel per indicator (Data Acquisition, Uncertainty, Draw Conclusions, Technical Graphics, Write Clearly; IN/CO categories), plotting student counts at each performance level: Not Demonstrated, Marginal, Meets Expectations, High Quality, Mastery.]

Templated Reporting
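The report above is generated from the assessment data itself: one fixed skeleton, filled per course. A rough sketch of that templating idea (hypothetical field names and data; the deck produces these reports with R, not Python):

```python
# Templated reporting sketch: the same report skeleton is rendered for any
# course. Field names and the sample mapping below are hypothetical.
TEMPLATE = "Graduate Attribute Course Report: {course}\n{rows}"

def render_report(course, mapping):
    # One padded line per course-indicator mapping entry.
    rows = "\n".join(
        f"{m['indicator']:<12} {m['description']:<18} n={m['n']}"
        for m in mapping
    )
    return TEMPLATE.format(course=course, rows=rows)

report = render_report("MECH 216", [
    {"indicator": "APSC-2-IN-2", "description": "Data Acquisition", "n": 163},
    {"indicator": "APSC-2-CO-3", "description": "Write Clearly", "n": 163},
])
print(report)
```

The point of the design is that adding a course means adding data, not writing a new report.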

SLIDE 16

Dashboard Style Information

[Figure: dashboard of performance counts for 2012−2013 and 2013−2014, one panel per indicator: APSC-1-CO-3 Technical writing, APSC-1-DE-5 Convergent, APSC-1-LL-6 Self-regulation, APSC-1-PA-4 Modelling process, APSC-1-PA-7 Model evaluation, APSC-1-PA-8 Evidence-based conclusions, with counts at each performance level from Not Demonstrated to Mastery.]

Historical

[Figure: historical program overview for 2013−2014 showing performance-level distributions (Not Demonstrated through Mastery) by year of study (First through Fourth Year) for the PA, DE, CO and LL attributes.]

Attribute

Program Overview
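The program overview boils down to a two-level roll-up: for each (year of study, attribute) pair, count students at each performance level. A minimal sketch under that assumption, with hypothetical records (the deck builds these views in R):

```python
# Two-level roll-up behind a program-overview dashboard: group assessment
# records by (year of study, attribute), then count performance levels.
# The records below are hypothetical sample data.
from collections import defaultdict, Counter

records = [
    {"year": "First Year", "attribute": "CO", "level": "Meets Expectations"},
    {"year": "First Year", "attribute": "CO", "level": "Mastery"},
    {"year": "Second Year", "attribute": "PA", "level": "Marginal"},
]

overview = defaultdict(Counter)
for r in records:
    overview[(r["year"], r["attribute"])][r["level"]] += 1

print(overview[("First Year", "CO")]["Mastery"])  # 1
```

Each `overview` cell is then one bar group in the historical panels: a distribution over performance levels for one year-attribute combination.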

SLIDE 17

SLIDE 18

SLIDE 19

Episode IV

Maybe Blackboard will speak with me this time

SLIDE 20