

SLIDE 1

Today’s webinar

Getting started: Who’s doing what and why you should care

Introduces you to the basics and paves the way for learning how to create and implement assessment tools at your institution.


Informing the Future of Higher Education

SLIDE 2

Meet today’s experts


  • Dr. Natasha Jankowski is Associate Director of the National Institute for Learning Outcomes Assessment and Research Assistant Professor with the Department of Education Policy, Organization and Leadership at the University of Illinois Urbana-Champaign. njankow2@illinois.edu
  • Gary Kapelus is the chair of the Office of Academic Excellence at George Brown College. gkapelus@georgebrown.ca
  • Brian Frank is an associate professor in the Department of Electrical and Computer Engineering, the DuPont Canada Chair in Engineering Education Research and Development, and the Director of Program Development in the Faculty of Engineering and Applied Science at Queen's University. brian.frank@queensu.ca

SLIDE 3

Learning Outcomes Assessment: A Brief Overview

NATASHA JANKOWSKI: NJANKOW2@ILLINOIS.EDU NATIONAL INSTITUTE FOR LEARNING OUTCOMES ASSESSMENT

SLIDE 4

NILOA

NILOA's mission is to discover and disseminate effective use of assessment data to strengthen undergraduate education and support institutions in their assessment efforts.

SURVEYS ● WEB SCANS ● CASE STUDIES ● FOCUS GROUPS ● OCCASIONAL PAPERS ● WEBSITE ● RESOURCES ● NEWSLETTER ● PRESENTATIONS ● TRANSPARENCY FRAMEWORK ● FEATURED WEBSITES ● ACCREDITATION RESOURCES ● ASSESSMENT EVENT CALENDAR ● ASSESSMENT NEWS ● MEASURING QUALITY INVENTORY ● POLICY ANALYSIS ● ENVIRONMENTAL SCAN ● DEGREE QUALIFICATIONS PROFILE ● TUNING

www.learningoutcomesassessment.org

SLIDE 5

SLIDE 6

What are learning outcomes?

They are verb-driven statements about student learning that also signal what we value in education.

SLIDE 7

Purpose

But why do we do assessment? And why do we do it now?

SLIDE 8

Value

  • Institutions of higher education are increasingly asked to show the value of attending, i.e. impact in relation to cost; employment
  • Public and policy makers want assurance of the quality of higher education
  • Regional accreditors are asking institutions to show evidence of student learning and instances of use
  • Improvement of teaching and learning and enhanced transparency and saliency of education for students

SLIDE 9

Used to Answer Various Educational Questions

  • Quality assurance
  • Improve educational quality
  • Curriculum effectiveness
  • Employer needs (American Association of Colleges & Universities employer survey)
  • Cost containment
  • Student mobility
  • Expanding educational providers
  • Value-added
  • What does it mean to attain a degree?

SLIDE 10

2013 National Provost Survey

  • Sample: all regionally accredited, undergraduate degree-granting institutions (n=2,732)
  • Announced via institutional membership organizations, website, newsletter, mailing
  • Online and paper
  • 43% response rate (n=1,202)
  • 725 schools participated in both 2009 & 2013

SLIDE 11

http://www.learningoutcomeassessment.org/knowingwhatstudentsknowandcando.html

SLIDE 12

Assessment Tools

[Bar chart: percentage of institutions using each assessment tool, including national student surveys, rubrics, classroom-based assessment, alumni surveys, placement exams, locally developed surveys, capstones, locally developed tests, general knowledge/skills measures, employer surveys, portfolios, external instruments, and other]

SLIDE 13

Change Over Time

[Bar chart: percentage of institutions using each assessment approach, 2009 vs. 2013]

SLIDE 14

Most Valuable Assessment Approaches

The top three:

  • Classroom-based assessment
  • National Student Surveys
  • Rubrics

SLIDE 15

Challenges

  • Connecting various levels at which assessment occurs
  • Undertaking meaningful assessment
  • Making it institution-wide
  • Involving multiple campus constituents and students

SLIDE 16

HEQCO Webinar, March 30, 2015
Measuring Matters – Assessing Learning Outcomes in Higher Education
Webinar #1: Getting Started: Who's doing what and why you should care

Gary Kapelus, George Brown College, Panelist
gkapelus@georgebrown.ca | 416.415.5000 x3508

SLIDE 17

Ontario Framework for Programs of Instruction

Context – Ontario's Community Colleges

  • Provincial Program Standards – include 'vocational learning outcomes' (VLOs) and 'elements of performance'
  • Program Descriptions / goals
  • Essential Employability Skills (EES)

Provincial 'community of practice' and resources:

  • Curriculum Developers Affinity Group / The Exchange (CDAG)

http://gototheexchange.ca/index.php/curriculum-at-a-program-level/program-learning-outcomes

SLIDE 18

Provincial Program Standards

http://www.tcu.gov.on.ca/pepg/audiences/colleges/progstan/health/nurse.pdf

SLIDE 19

Essential Employability Skills

http://www.tcu.gov.on.ca/pepg/audiences/colleges/progstan/essential.html

Skill Category (defining skills; skill areas to be demonstrated by graduates) → Learning Outcomes (the levels of achievement required by graduates)

Communication – reading, writing, speaking, listening, presenting, visual literacy
  1. Communicate clearly, concisely and correctly in the written, spoken, and visual form that fulfills the purpose and meets the needs of the audience.
  2. Respond to written, spoken, or visual messages in a manner that ensures effective communication.

Numeracy – understanding and applying mathematical concepts and reasoning, analyzing and using numerical data, conceptualizing
  3. Execute mathematical operations accurately.

Critical thinking and problem solving – analyzing, synthesizing, evaluating, decision-making, creative and innovative thinking
  4. Apply a systematic approach to solve problems.
  5. Use a variety of thinking skills to anticipate and solve problems.

SLIDE 20

Essential Employability Skills

http://www.tcu.gov.on.ca/pepg/audiences/colleges/progstan/essential.html

Skill Category (defining skills; skill areas to be demonstrated by graduates) → Learning Outcomes (the levels of achievement required by graduates)

Information Management – gathering and managing information, selecting and using appropriate tools and technology for a task or a project, computer literacy, internet skills
  6. Locate, select, organize, and document information using appropriate technology and information systems.
  7. Analyze, evaluate, and apply relevant information from a variety of sources.

Interpersonal – teamwork, relationship management, conflict resolution, leadership, networking
  8. Show respect for diverse opinions, values, belief systems, and contributions of others.
  9. Interact with others in groups or teams in ways that contribute to effective working relationships and the achievement of goals.

Personal – managing self, managing change and being flexible and adaptable, engaging in reflective practices, demonstrating personal responsibility
  10. Manage the use of time and other resources to complete projects.
  11. Take responsibility for one's own actions, decisions, and consequences.

SLIDE 21

Accountability – alignment with provincial learning outcomes

  • New programs: map course descriptions to the VLOs and EESs; approved by the Credential Validation Service (CVS)
  • Periodically re-map course-specific learning outcomes to the program VLOs and EESs
  • Autonomy in assessing learning outcomes (benefits/challenges)

Context – Ontario's Community Colleges

SLIDE 22

New in 2015: Ontario college accreditation program (OCQAS)

  • Learning outcomes reflected in 13/33 accreditation requirements
  • Must demonstrate how learning outcomes are reflected in the development of both learning activities and assessments

Context – Ontario's Community Colleges

OCQAS Accreditation standards: http://ocqas.org/?page_id=9277

SLIDE 23
Challenges in assessment of LOs

  • Assumes colleges are practicing outcome-based learning at the course and program level
  • The credit-granting system is still 'hours- and course-based' rather than 'outcomes-based'
  • Must incorporate and integrate various mandatory provincial program-level learning outcomes
  • Some LO statements are outdated
  • Some LO statements don't lead easily to measurement
  • Consistency of assessment across the province for the same program?
  • Practical considerations – authenticity, validity, triangulation, integration, resources
  • EES are seen by some as less important in curriculum design/delivery

SLIDE 24

Case study: measuring EES

Has the essential employability skill being assessed actually been taught/practiced?

  • Our LOAC project focused on developing a validated tool to measure critical thinking (one of the mandatory EESs)
  • We discovered that faculty in our project were not necessarily teaching critical thinking skills but still expected students to demonstrate these skills in graded assignments
  • We worked with faculty to incorporate and make explicit the learning and practicing of critical thinking skills in the core curriculum, so that there was something tangible to assess
  • This is a common challenge with assessing EESs

http://www.heqco.ca/SiteCollectionDocuments/LOAC-GBC.pdf

SLIDE 25

Learning Outcomes: Why and so what?

Brian Frank, Queen’s University brian.frank@queensu.ca

SLIDE 26

Why learning outcomes?

  • Assessing and improving quality of learning
  • Curriculum development
  • Space planning
  • Student services and academic support planning

Responding to needs including…

  • Pressure for accountability
  • Mobility, credit transfer, "unbundling"
  • Multiple modes of delivery
SLIDE 27

Value of identifying learning outcomes

Hattie, J. (2009). The Black Box of Tertiary Assessment: An Impending Revolution. In L. H. Meyer, S. Davidson, H. Anderson, R. Fletcher, P.M. Johnston, & M. Rees (Eds.), Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research (pp.259-275). Wellington, New Zealand: Ako Aotearoa

A study synthesizing:

800 meta-analyses 50,000+ studies 200+ million students

found that explicit outcomes and assessment have one of the largest effects on learning…

SLIDE 28

[Bar chart of effect sizes (performance gain in σ, scale 0.2–1.4) for: student self-assessment, formative evaluation to instructor, explicit objectives and assessment, reciprocal teaching, feedback, spaced vs. massed practice, metacognitive strategies, creativity programs, self-questioning, professional development, problem-solving teaching, teaching quality, time on task, computer-assisted instruction]

800 meta-analyses, 50,000+ studies, 200+ million students

Hattie, J. (2009). The Black Box of Tertiary Assessment: An Impending Revolution. In Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research (pp. 259-275). Wellington, New Zealand: Ako Aotearoa.
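The "performance gain in σ" reported above is a standardized effect size. One common definition is Cohen's d: the gap between group means divided by the pooled standard deviation. A minimal sketch with made-up scores (these numbers are illustrative only, not data from Hattie's synthesis):

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference: (mean gap) / pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)  # sample standard deviations
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled

# Hypothetical course scores for illustration only.
d = cohens_d([75, 80, 85, 90], [65, 70, 75, 80])  # ~1.55 sigma
```

An effect of 1.0 means the treated group outperforms the control by a full standard deviation, which is why values near the top of Hattie's chart are considered large.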

SLIDE 29

Continuous Improvement Process

Continuous program improvement cycle: learning outcomes → program & course curriculum maps → learning environment (design, support, delivery) → assessment (program-wide and in-course) → analysis & evaluation

Example curriculum map (X marks where a course assesses an outcome):

            Course 1   Course 2
Outcome 1       X
Outcome 2                  X
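A curriculum map like the matrix on this slide can be represented with a simple data structure and inverted to find coverage gaps. A minimal sketch; the course and outcome names are hypothetical placeholders, not from the webinar:

```python
# Represent a curriculum map as {course: set of outcomes it assesses},
# then invert it to see which courses cover each program outcome.
curriculum_map = {
    "Course 1": {"Outcome 1"},
    "Course 2": {"Outcome 2"},
    "Course 3": {"Outcome 1", "Outcome 2"},
}

def coverage(cmap):
    """Invert course -> outcomes into outcome -> list of covering courses."""
    cov = {}
    for course, outcomes in cmap.items():
        for outcome in sorted(outcomes):
            cov.setdefault(outcome, []).append(course)
    return cov

cov = coverage(curriculum_map)
# An expected outcome mapped to no course is a curriculum gap; an outcome
# covered by only one course is a single point of failure in the program.
```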

SLIDE 30

Curricular development

SLIDE 31


SLIDE 32

Student development


SLIDE 33

Benchmarking

  • Queen's University 1st Year: μ=1169, n=546 (90th percentile)
  • Queen's University 4th Year: μ=1258, n=41 (98th percentile)
  • Compared against exiting 4th-year means from all participating 4-year colleges and universities

SLIDE 34

Visualizing the curriculum

[First Year Curriculum Treemap: area = # of assessments per attribute; colour = # of assessments per indicator (scale 0.0–3.0); attributes: CO, DE, EC, EE, ET, IM, IN, KB, LL, PA, PR, TW]

SLIDE 35

Visualizing the curriculum

[First Year Curriculum Treemap, highlighting Problem analysis: area = # of learning outcomes, colour = # of assessments]

SLIDE 36

Visualizing the curriculum

[First Year Curriculum Treemap, highlighting Ethics and equity: area = # of learning outcomes, colour = # of assessments]

SLIDE 37

Priority 1: Resources

[Chart: resources by year of program, first through fourth year]

SLIDE 38

Priority 1: Resources


SLIDE 39

Priority 1: Resources


SLIDE 40

Priority 1: Resources


SLIDE 41

Can we trust our data? Triangulation

SLIDE 42

Impact on teaching

  • Encourage a culture of thinking about learning goals, measuring, and making improvements
  • Encourage discussions about teaching at institutions and within departments

42

SLIDE 43

Tradeoffs in assessment

  • Authenticity vs. cost
  • Reliability vs. cost & time
  • Standardized/benchmarkable vs. motivation

43

SLIDE 44

Why not use grades to assess outcomes?

44

Student transcript:

  Electric Circuits I: 78
  Electromagnetics I: 56
  Signals and Systems I: 82
  Electronics I: 71
  Electrical Engineering Laboratory: 86
  Engineering Communications: 76
  Engineering Economics: 88
  ...
  Electrical Design Capstone: 86

How well does the program prepare students to solve open-ended problems? Are students prepared to continue learning independently after graduation? Do students consider the social and environmental implications of their work? What can students do with knowledge? Can they communicate effectively?

Course grades usually aggregate assessment of multiple objectives, and are indirect evidence for some expectations.

SLIDE 45

45

Grades

  • Norm-referenced evaluation ("Student: You are here! (67%)") – used for large-scale evaluation to compare students against each other
  • Criterion-referenced evaluation ("Student has marginally met expectations because submitted work mentions social, environmental, and legal factors in the design process but no clear evidence that these factors impacted decision making.") – used to evaluate students against stated criteria; useful for feedback to the student and conversation within a program

SLIDE 46

brian.frank@queensu.ca

SLIDE 47

Thank you to today’s experts


  • Dr. Natasha Jankowski is Associate Director of the National Institute for Learning Outcomes Assessment and Research Assistant Professor with the Department of Education Policy, Organization and Leadership at the University of Illinois Urbana-Champaign. njankow2@illinois.edu
  • Gary Kapelus is the chair of the Office of Academic Excellence at George Brown College. gkapelus@georgebrown.ca
  • Brian Frank is an associate professor in the Department of Electrical and Computer Engineering, the DuPont Canada Chair in Engineering Education Research and Development, and the Director of Program Development in the Faculty of Engineering and Applied Science at Queen's University. brian.frank@queensu.ca

SLIDE 48

Save the dates for our next webinars!

Webinar 2, April 30, 2015 – Common ground: The language of learning outcomes
Before beginning to assess learning outcomes, we need to decide what skills are to be assessed and clearly describe successful skill development. The second webinar explores the importance of terminology and the value of creating a common language when designing and assessing learning outcomes.

Webinar 3, May 28, 2015 – Building a better toolkit
Armed with the learning-outcomes big picture and a common language, you're ready to choose and develop the tools to assess students' achievement of learning outcomes. The third webinar will help you set smart parameters for your learning outcomes assessment project.


SLIDE 49

Check out our helpful resources


  • A step-by-step resource to help faculty, staff, academic leaders and educational developers design, review and assess program-level learning outcomes
  • We invite you to submit an RFP to join our Learning Outcomes Assessment Consortium Expansion (Universities and Colleges)

SLIDE 50

And learn more at heqco.ca


Colleagues couldn’t make it? Our webinars will be posted on our website shortly. Stay tuned!