SLIDE 1

Cultivating a Culture of Assessment in a Time of Transition

  • Dr. Laura Sáenz, Associate Vice President for Academic and Institutional

Excellence

  • Dr. Carlos Cuellar, Director of Institutional Assessment

TxAHEA Conference, October 5, 2018, University of North Texas, Denton, TX

SLIDE 2

Outline of the Presentation

  • Welcome and Introductions
  • Historical Background on UTRGV
  • Major Phases of the Development of the UTRGV Assessment

Process and Office

  • Lessons learned
  • Continuous Improvement Efforts

SLIDE 3

Welcome and Introductions

  • Dr. Laura Sáenz – Associate Vice President of Academic and Institutional Excellence in the Division of Academic Affairs, Student Success and P-16 Integration

  • 20 years in higher education – tenured in the College of Education at UTPA
  • 10 years in higher education administration (college and university at UTPA)

  • Undergraduate studies, teaching, assessment and accreditation

SLIDE 4

Welcome and Introductions

  • Dr. Carlos Cuellar – Director of Institutional Assessment
  • 7 years’ experience in higher education
  • Previously lecturer in Dept. of Politics and International Affairs at NAU (Flagstaff, AZ)
  • Currently part-time faculty in Political Science Dept. at UTRGV
  • 3 years in higher education administration
  • Blank slate for UTRGV’s new assessment office (lots of reading required!)
  • Helped develop UTRGV’s assessment reporting framework and support infrastructure

SLIDE 5

The UTRGV Story

  • In 2013, the Texas Legislature recognized the need to expand

education opportunities in the region and created The University of Texas Rio Grande Valley (UTRGV) from UTPA and UTB.

  • Prior to the fall of 2015, The University of Texas Pan American (UTPA)

and The University of Texas Brownsville (UTB) were two distinct Texas-Mexico border universities belonging to the UT System.

  • In the fall of 2015, UTRGV enrolled its first class.
  • In the summer of 2016, the School of Medicine welcomed its first

class.

SLIDE 6

SLIDE 7

Campuses and Facilities

SLIDE 8

Three Major Phases of Assessment Development

  • Phase I: Preparation Years – 2013-14; 2014-15
  • Phase II: Assessment Transition Period – 2015-16; 2016-17
  • Phase III: New Assessment Period – 2017-18; 2018 to present

SLIDE 9

Phase I: Preparing for UTRGV

  • Phase I: Preparation Years – 2013-14; 2014-15
  • 2013-2014 – campus leaders formed work groups from faculty, staff, administrators and students of the legacy institutions.
  • The work groups were charged with envisioning the new University’s mission, structure, programs, operations and much more.
  • 2014-2015 – unifying programs, operations and services

SLIDE 10

2013-14: Joint Work Groups

2013-2014 Work Groups:

  • Academic Administration
  • Academic Assessment – SLO, Core & Program Review
  • Institutional Effectiveness & Accreditation – SACSCOC, non-SLO assessment, IR Reporting
  • Operations Administration
  • Academic Programs

SLIDE 11

Academic Assessment Work Group Process

Develop Guiding Philosophy → Situation Analysis → Research Best Practices → Consultations with National Leaders → Key Recommendations

SLIDE 12

Guiding Philosophy

  • The philosophy guiding the work of the Academic Assessment Work Group was based around the goal of creating an authentic culture of evidence-based decision-making.
  • The Work Group agreed that academic assessment must be assigned greater value and given priority attention at UTRGV.

SLIDE 13

Thirteen Recommendations

  • 1. Division of Academic Affairs
  • 2. Academic Assessment Council
  • 3. Administrators communicate importance
  • 4. Create one-stop assessment shop
  • 5. Delineate roles and responsibilities for assessment
  • 6. Associate Deans for Assessment
  • 7. Incentivize assessment
  • 8. Core curriculum committee
  • 9. SLO review committee
  • 10. Program review oversight committee
  • 11. Invest or develop an AMS
  • 12. SLO in co-curricular areas
  • 13. Faculty Development
  • 14. Healthy budget

SLIDE 14

Recommended Organizational Chart for Academic Assessment

One Stop Shop: Center for Academic Assessment

  • Assistant Provost for Assessment
  • Research Analyst (×2)
  • IT Expert
  • Assessment Specialist – Edinburg
  • Assessment Specialist – Brownsville
  • College: Associate Dean of Assessment
  • Department: Assessment Liaison

SLIDE 15

2014-15: Unifying Programs & Organizational Structures

UTPA Program + UTB Program → UTRGV Program

UTPA Operation & Service + UTB Operation & Service → UTRGV Operation & Service

SLIDE 16

Phase II: Transition Period - Overview

  • Phase II: Transition Period – 2015-16 & 2016-17
  • UTRGV multi-campus structure
  • Consolidated educational programs to be offered in parallel at all

campus locations

  • Entirely new institutional organizational chart
  • Lots of excitement, anxiety and buzz in the local news!

SLIDE 17

Goals of Transition Assessment Period Starting Fall 2015

Functional

  • Develop and market an identity for the

assessment office

  • Jump start the institutional assessment

process

  • Two years to prepare for decennial

reaffirmation

  • Reaffirmation due Sept. 2017
  • No assessment planning leading up to fall 2015
  • Implement an interim assessment

management system

  • Review and recommend a permanent

assessment management system

Culture of Continuous Improvement

  • Develop resources and materials
  • Website, How-to’s, FAQs, examples, etc.
  • Develop shared understandings
  • Provide planned and on-demand training
  • Develop a network of assessment

contacts/leads

  • Define roles and responsibilities
  • Seek input and work collaboratively with leads
  • Develop a communication plan
  • Develop and implement an assessment

feedback process

SLIDE 18

Getting Started: Opening the “New” Assessment Office

SLIDE 19

Organizational Chart

AVP Accreditation & Assessment
  • Director of Institutional Assessment
  • Assessment Coordinators (Brownsville)
  • Assessment Coordinators (Edinburg)
  • Accreditation Coordinator
  • Research Analyst
  • Associate Deans for Assessment


Areas of Responsibility:

  • Educational Program-

SLO

  • Administrative Support
  • Academic & Student

Support

  • General Education*
  • Assessment

Management System

  • Program Review
  • Specialized

Accreditation

  • QEP
  • SACSCOC*
SLIDE 20

Advantages and Challenges of the “New” Assessment Office

Advantages

  • Opportunity to “reset” the assessment

process and improve on legacy practices

  • Dedicated assessment leaders in the

academic colleges

  • Dedicated assessment staff on both

campuses

Challenges

  • Messaging all came from the

Assessment Office and staff

  • No assessment “champion” at the

institutional level

  • Experienced and inexperienced staff
  • Number not as important as experience

SLIDE 21

Jumpstarting Assessment at UTRGV

SLIDE 22

Introducing the Future Plans for a New Assessment Framework

SLIDE 23

Starting with Administrative Unit Assessment

SLIDE 24

Defining Roles and Responsibilities

SLIDE 25

Transitional Assessment Process

SLIDE 26

Collecting Assessment Plans in Transitional AMS - SharePoint

SLIDE 27

Implementing Meta-Assessment

SLIDE 28

Capturing Improvements from Transitional Period 15-17

SLIDE 29

Advantages & Challenges of the Transitional Assessment Process

Advantages

  • New process resulted in a mindset of support among assessment office staff
  • Provided information regarding

areas of need among faculty and staff

  • Provided an opportunity to build a

process with input from assessment leaders

Challenges

  • Conflicting legacy practices
  • New leaders and faculty with limited

experience

  • No master list of programs and units
  • Templates not intuitive
  • Developing and maintaining records of AMS access and privileges

SLIDE 30

Adopting a New AMS – Tk20

  • Acquired fall 2016, but

postponed implementation until the end of the transition assessment period

  • Fall 2016 to spring 2017
  • Implementation planning with

Tk20 staff

  • Product training

SLIDE 31

Phase III: New Assessment Period – Overview

  • Phase III: New Assessment Period – 2017-18; 2018 to present
  • Implementation of new assessment framework
  • Development of resources
  • Development of network of assessment leaders
  • Implementation of meta-assessment process through college-level

assessment activities

SLIDE 32

Priorities

  • Support. Develop institutional infrastructure for supporting programs and units
  • Compliance. Ensure that a critical mass of programs and units participate in the institutional assessment process
  • Quality. Improve the quality of assessment activities and the reporting of those activities

SLIDE 33

Step 1: Developing a New Framework

  • Identify major challenges with previous framework
  • Research best practices
  • Create a unique process to increase buy-in and reduce resistance from assessment skeptics while complying with SACSCOC expectations

SLIDE 34

Framework Overview and Justification

Provide more time and flexibility for:

  • Planning
  • Data collection
  • Using results to seek improvements
  • Reflecting on lessons learned

Align to the SACSCOC reporting timeline

SLIDE 35

Step 2: Preparing Faculty/Staff for New Framework (fall 2016)

  • Update our contact list!

Our Constituents:

  • 52 Administrative Support Service Units
  • 137 Educational Degree Programs
  • 90 Academic and Student Support Units
  • 28 College and Divisional Area Assessment Liaisons
  • 279 Assessment Coordinators in all areas of assessment

SLIDE 36

Preparing Faculty/Staff for New Framework – Cont’d

  • 14 Presentations
  • 178 Attendees

(Sept. 22 – Nov. 15, 2016)

  • Description
  • Justification
  • FAQs
  • Examples/Visuals
  • Timeline
SLIDE 37

Framework’s Advantages and Challenges

Advantages

  • Educational Programs responded well to flexibility & long-term planning
  • SLOs, curriculum, and personnel are stable
  • Resonated with proponents of academic freedom
  • Much appreciation for framework’s focus on continuous improvement
  • It’s not only about compliance and reporting results
  • Deadlines relatively stable & easy to recall
  • Calendar of reporting expectations identified well in advance (Oct. 1)

Challenges

  • Some Support Services experienced anxiety about long-term planning
  • Priorities/expectations vary year-to-year
  • Organizational/leadership/staff in flux
  • Very different timeline than legacy institutions
  • Some confusion about reporting expectations & process
  • Close out “transitional” framework before phasing into “new” framework
  • New framework was presented nearly a year before plans were due.

SLIDE 38

Step 3: Developing a New Planning Template (spring 2017)

  • Identify major challenges with previous template
  • Research best practices
  • Create a unique template that encourages programs/units to detail their assessment expectations and methodologies so that non-experts can understand

SLIDE 39

Comprehensive Assessment Plan - Reporting Template

Master document for planned assessment activities during a four-year period. The aim was to produce plans that were:

  • Thoughtful
  • Mission-driven
  • Detailed
  • Replicable
  • Continuity of Assessment

SLIDE 40

Template’s Advantages and Challenges

Advantages

  • Very Detailed Assessment Plan
  • Open-ended sections required thorough descriptions
  • Addressed previous issues with vague, incomplete plans
  • Template Easy to Share, Revise, & Submit
  • MS Word document submitted to Liaison
  • Liaison routed all plans to our office

Challenges

  • Very Detailed Assessment Plan
  • Very different approach than legacy institutions
  • A few complaints that it was asking for too much
  • Defining Terms
  • Outcomes vs. objectives vs. expectations
  • Some confusion about reporting sections
  • Despite a litany of workshops and resource materials, people did not always follow directions

SLIDE 41

Preparing Faculty/Staff for Submitting Plans (spring 2017 - fall 2017)

  • 12 Presentations

March 30 – April 18, 2017 July 31 – Sept. 12, 2017

  • Description
  • Timeline
  • FAQs
  • Examples/Visuals
  • Guidance Docs
  • Met with:
  • 6 (of 7) Academic Colleges
  • 14 (of 21) Divisional Areas

SLIDE 42

Compliance with Assessment Planning

  • Educational Programs: 98% Compliance
  • Support Services: 96% Compliance

SLIDE 43

Step 5: Develop Institutional Rubric for Evaluating Assessment Plans

Rubric sections:

  • Expected Outcomes
  • Measures & Benchmarks
  • Data Collection Process
  • Approaches for Evaluation and Analysis
  • Planning to Close the Loop

All 9 rubric elements scored from 0 to 3: 0 = Needs Improvement; 1 = Progressing; 2 = Mature; 3 = Commendable

slide-44
SLIDE 44

Step 6: Develop Meta-Assessment Process in Academic Colleges (Apply rubric and provide feedback)

College Liaisons Organized Review Process

  • 6 of 7 colleges used committees
  • All colleges used institutional rubric to

evaluate the quality of assessment plans

  • 6 of 7 colleges completed rubric

calibration training (May – Sept. 2017)

SLIDE 45

Feedback Summary Reports – Educational Programs


  • 9. Course/learning experiences are aligned to and support achievement of

student learning outcomes.

  • 8. The multi-year assessment timeline for this *SLO-Measure pair includes all phases of assessment and supports continuous improvement.

  • 7. Robust approaches to evaluating and analyzing assessment activities have

been developed.

  • 6. Measures are explicitly aligned to student learning outcomes (direct relationship between measures and SLOs).

  • 5. Benchmarks for success provide meaningful comparisons (reference points)

for program self-study and improvement.

  • 4. A clear plan for frequent/systematic data collection has been developed.
  • 3. Direct measures are selected for the student learning outcome.
  • 2. SLO measure/assessment activity is described in detail.
  • 1. Statement of expected student learning outcome (SLO) is operationalized

using specific and measurable action verbs and student-centered language.

Meta-Assessment Results for Educational Programs in the College of Sciences (rubric levels: Needs Improvement, Progressing, Mature, Commendable)

Note: These figures include rubric scores for the 18 programs that completed a comprehensive assessment plan in the college.

SLIDE 46

Feedback Process Advantages and Challenges – Colleges

Advantages

  • Legitimizes review process

  • Colleges take ownership of review process
  • Feedback provided by peers
  • Develops assessment infrastructure

  • Increases efficiency (train the trainer)
  • Key individuals master assessment reporting

expectations

  • Rubric scores show patterns

  • Strengths/Weaknesses easily identified
  • Feedback easy to share with programs
  • Helps us develop future workshops

Challenges

  • Uncertainty about feedback implementation

  • Outsourcing the review process made it difficult to

track which programs implemented feedback

  • Limited familiarity with plans @ institution-level

  • Outsourcing the review process reduces institutional

staff’s knowledge about plan contents

  • Limits our ability to share ideas about assessment

activities

SLIDE 47

Step 6: Developing an Annual Results Template (spring 2018)

  • Identify major challenges with previous template
  • Research best practices
  • Create a unique template that adapts to different schedules for reporting results and encourages programs/units to document and describe the use of results for seeking improvements

SLIDE 48

Annual Assessment Results Report in Tk20 - Template

SLIDE 49

Annual Report Template Advantages and Challenges

Advantages

  • Allows progs/units to document assessment activities in one template
  • Previous templates were split (results vs. use of results)
  • Plenty of space to describe and attach supporting documentation

  • Template accommodates varying schedules
  • Given framework’s flexibility, assessments and improvements might be reported at different times for each outcome

  • Dashboard view of compliance

  • Helps us reach out to those who need more support
  • Tk20 is a long-term solution

Challenges

  • Assessment coordinators sometimes scramble to find data to report

  • Annual report is retrospective; some experience difficulties finding data that should have been collected last year
  • Plans do not always pan out
  • Identifying users and granting access

  • Contact lists not always complete, up-to-date
  • Training irregular users

  • Attendance at trainings is not 100%
  • Assessment coordinators will need to complete these

annually

SLIDE 50

Step 7: Preparing Faculty/Staff for Submitting Annual Report in Tk20 – Update our website


  • Description
  • Timeline
  • FAQs
  • Examples/Visuals
  • Guidance Docs
SLIDE 51

Cont’d: Preparing Faculty/Staff for Submitting Annual Report in Tk20 – Face-to-Face Trainings


(April to September 2018)

Assessment Area         Trainings   Attendees
Educational Programs        18          89
Support Service Units       10          67
Total                       28         156

Note: Trainings held on both campuses

SLIDE 52

Cont’d: Preparing Faculty/Staff for Submitting Annual Report in Tk20 – Video Tutorials

https://www.utrgv.edu/excellence/tk20/videos/ (June to September 2018)

Video                           Views
How to log in (SLO and Unit)      164
Creating Outcomes (SLO)            96
Plan Data Entry (SLO)              92
Creating Outcomes (Units)          86
Plan Data Entry (Units)           119
Total (AVG)                557 (111.4)

SLIDE 53

Step 8: Integrating Meta-Assessment Process for all Colleges/Divisional Areas

Timeline:

  • September 4, 2018 – Announcements & Reminders for fall 2018
  • September 14 & 28, 2018 – Final Tk20 Trainings
  • October 1, 2018 – 2017-2018 Annual Assessment Results Reports Due
  • January 7, 2019 – College/Divisional Feedback Summary Reports & Rubric Scores Due; Revised 2017-2018 Annual Reports Due

Priorities:

  • 1. Support. Tk20 Trainings; Rubric Calibration Trainings (colleges/divisions)
  • 2. Compliance. 2017-2018 Annual Report must be submitted by Jan. 7, 2019
  • 3. Quality. Colleges/Divisional Areas engage in review process; Programs/Units revise and resubmit 2017-2018 Annual Reports per feedback

  • College/Divisional Review of Annual Reports
  • Assessment Coordinators Revise and Resubmit Annual Reports
SLIDE 54

In summary: We made a concerted effort to develop a culture of assessment and continuous improvement by focusing on:

  • Support. Develop institutional infrastructure for supporting programs and units
  • Compliance. Ensure that a critical mass of programs and units participate in the institutional assessment process
  • Quality. Improve the quality of assessment activities and the reporting of those activities

SLIDE 55

Questions/Comments?

  • Laura Sáenz – laura.saenz@utrgv.edu
  • Carlos Cuellar – carlos.cuellar@utrgv.edu
  • Website: http://www.utrgv.edu/excellence/assessment
