Higher Education Assessment Workshop B: Robust Results



SLIDE 1

Higher Education Assessment Workshop B: Robust Results

NEASC Annual Meeting

NATASHA JANKOWSKI, DIRECTOR, NATIONAL INSTITUTE FOR LEARNING OUTCOMES ASSESSMENT CHRISTOPHER HOURIGAN, DIRECTOR OF INSTITUTIONAL RESEARCH AND PLANNING, RHODE ISLAND COLLEGE RACHAEL DIPIETRO, RESEARCH SPECIALIST, NORWALK COMMUNITY COLLEGE

SLIDE 2

NILOA

NILOA’s mission is to discover and disseminate effective use of assessment data to strengthen undergraduate education and support institutions in their assessment efforts.

SURVEYS ● WEB SCANS ● CASE STUDIES ● FOCUS GROUPS ● OCCASIONAL PAPERS ● WEBSITE ● RESOURCES ● NEWSLETTER ● PRESENTATIONS ● TRANSPARENCY FRAMEWORK ● FEATURED WEBSITES ● ACCREDITATION RESOURCES ● ASSESSMENT EVENT CALENDAR ● ASSESSMENT NEWS ● MEASURING QUALITY INVENTORY ● POLICY ANALYSIS ● ENVIRONMENTAL SCAN ● DEGREE QUALIFICATIONS PROFILE ● TUNING ● LEARNING SYSTEM

www.learningoutcomesassessment.org

SLIDE 3

SLIDE 4

Live Polling

At which levels do you use assessment results most often?

  • 1. Institution-level
  • 2. Department/Program-level
  • 3. Course-level
SLIDE 5

Use of Assessment Results

Mean reported use of assessment results, by level (response options: Very Much, Quite a Bit, Some, Not at all):

Curricular requirements/courses: 1.98
Department/program: 1.97
School/college: 1.63
Institution: 1.60
Co-curriculum: 1.17

SLIDE 6

Write Outcomes → Identify Assessments → Gather Results → Package Results → Submit Reports

ACCREDITATION / PROGRAM REVIEW

SLIDE 7

SLIDE 8

SLIDE 9

Principles of Local Practice

1. Develop specific, actionable learning outcomes statements.
2. Connect learning outcomes with actual student demonstrations of their learning.
3. Collaborate with relevant stakeholders, beginning with the faculty.
4. Design assessment approaches that generate actionable evidence about student learning that key stakeholders can understand and use to improve student and institutional performance.
5. Focus on improvement, and compliance will take care of itself.

SLIDE 10

But…

What are we really trying to do?

SLIDE 11

Institutional or Program Improvement

SLIDE 12

Learning Improvement

SLIDE 13

Name Expectations for Learning → Communicate Expectations to Learners → Collect Student Work → Determine Extent of Learning → Strategize New Student Success Plans

IMPROVEMENT

SLIDE 14

SLIDE 15

What does good assessment look like for us?

Why do we think that what we are doing, for these students, at this time, will lead to enhanced learning?

What processes and structures would we need in place to achieve learning improvement?

What type of data would we need to collect that we aren’t collecting right now?

SLIDE 16

Standard 8: Educational Effectiveness

The institution demonstrates its effectiveness by ensuring satisfactory levels of student achievement on mission-appropriate student outcomes. Based on verifiable information, the institution understands what its students have gained as a result of their education and has useful evidence about the success of its recent graduates. This information is used for planning and improvement, resource allocation, and to inform the public about the institution. Student achievement is at a level appropriate for the degree awarded.

8.8 The results of assessment and quantitative measures of student success are a demonstrable factor in the institution’s efforts to improve the learning opportunities and results for students.

SLIDE 17

Group Discussion

At your tables, please share with the group your program-level assessment efforts and one challenge related to using results that you are currently experiencing.

SLIDE 18

NEASC ANNUAL MEETING December 12-15, 2017 Boston, MA Higher Education Assessment Workshop B: Robust Results

Christopher Hourigan, Director of Institutional Research & Planning

SLIDE 19

RHODE ISLAND COLLEGE

– Rhode Island’s only public master’s-level, four-year institution; part of a three-institution system that includes the Community College of Rhode Island (CCRI) and the University of Rhode Island (URI)
– Located in Providence, the state capital and main urban center
– Serves approximately 8,200 students at the undergraduate and graduate levels
– 45% of undergraduates are first-generation students, and 35% identify as non-White
– Programs in Arts & Sciences, Business, Education, Nursing, and Social Work
– Major changes in college leadership over the past two years: new President, Provost, and VP for Finance & Administration; new divisions of Student Success (with a VP) and Community, Diversity, and Equity (with an Associate VP)
– Major upgrades to the physical plant, including new academic spaces
– Last five-year report submitted in 2016; last 10-year reaccreditation received in 2012
– Next review: Fall 2020

SLIDE 20

ASSESSMENT AT RIC

  • 1. Academic Assessment
– Program/Major Outcomes
– General Education Outcomes
  • 2. Institution-level Assessment
– Student Engagement
– Student Satisfaction
– Student Success: Retention, Persistence, Graduation
SLIDE 21

INFRASTRUCTURE FOR ASSESSMENT

  • 1. Committee on Assessment of Student Outcomes (CASO)
– Led by a faculty member; consists of faculty, the Provost, and the Director of Institutional Research and Planning
– Coordinates program-level assessment
– Works jointly with the Committee on General Education (COGE) to assess general education
– Supports institutional-level assessment
– Sponsors a semiannual Assessment Colloquium to share assessment projects on campus
  • 2. Institutional Research & Planning (IRP)
– Conducts surveys used for institution-wide assessment
– Prepares reports on retention, graduation, and other measures of institutional effectiveness
– Shares results with the campus community
  • 3. Enrollment Management Team
– New group developed and led by the VP for Student Success
– Charged with increasing enrollment, persistence, and graduation
– Developing metrics of success and using real-time data for interventions
SLIDE 22

USE OF ASSESSMENT RESULTS—ACADEMIC ASSESSMENT IN PROGRAMS

  • Accredited/licensed and non-accredited/licensed programs at RIC
  • Programs are asked to prepare reports on their assessment activities each academic year
  • For most programs, assessment results are used to add courses, change the emphasis of courses, or alter the assessment process

SLIDE 23

ACADEMIC ASSESSMENT EXAMPLES—BIOLOGY AND ENGLISH

Biology
  Outcome measured: Knowledge in the field of biology
  Assessment tool: ETS Major Field Test (MFT) in Biology (given to every senior in the Biology capstone course)
  Findings:
    1. In the 50th percentile nationally on Organismal Biology, compared to the 80th–90th percentile on other test subscores.
    2. Grades in the introductory biology courses covering these topics (111 and 112) do not correlate well with the MFT subscore on Organismal Biology.
    3. Content of those introductory courses has grown, allowing less time to focus on organismal biology.
  Intervention: Introductory biology course sequence expanded with an additional course to ensure adequate coverage of organismal biology in the introductory courses.
  Results:
    1. Results not yet available, as the change was recently implemented.
    2. Effectiveness will be determined via MFT scores when the affected students are seniors.

English
  Outcome measured: Writing competency
  Assessment tool: Final paper in the senior seminar
  Findings:
    1. Students taking English as the content major in Elementary Education did not perform as well as other English majors on:
       – Use of MLA citation format and incorporation of secondary sources
       – Analysis of literature in historical and cultural contexts
    2. Findings were attributed to the fact that ELED students took fewer courses emphasizing these competencies than other majors.
  Intervention: Changed requirements for the Elementary Education track, adding two courses that better aligned the curriculum with state requirements for teacher certification, and dropped the requirement for the senior seminar.
  Results: Ratings of final papers in the new courses demonstrated improvement, in the second competency in particular.

SLIDE 24

USE OF ASSESSMENT RESULTS—GENERAL EDUCATION

  • Current general education program began in 2012 and was designed to meet eleven learning outcomes
  • Assessment has begun with three of these outcomes:
– Written communication
– Research fluency
– Critical/creative thinking
  • A team of faculty members from across the college evaluated course artifacts to determine whether students are meeting outcomes
  • At present, results of assessment efforts have been used mostly to adjust the assessment process itself, rather than to make changes to the curriculum

SLIDE 25

USE OF ASSESSMENT RESULTS—INSTITUTIONAL ASSESSMENT, STUDENT ENGAGEMENT & SATISFACTION

  • 1. Participated in NSSE for multiple years and used results to:
– Modify the advisement process
– Create professional development opportunities for faculty to promote greater interdisciplinary learning
– Support the president’s “learning innovation” initiatives through items related to high-impact practices
  • 2. Homegrown Student Census survey administered every other year
– Captures student satisfaction with various aspects of their education at RIC
– Results shared with groups of academic and student life leaders
– Led to efforts to improve the campus environment and customer service in various offices
– Improvements noted in some offices/services and campus issues, such as Records & Registration, Safety & Security, and Classrooms

SLIDE 26

USE OF ASSESSMENT RESULTS—INSTITUTIONAL ASSESSMENT, STUDENT ENGAGEMENT & SATISFACTION

Examples of Improvements on RIC Student Census Survey Items

Student Satisfaction with Offices/Services
(Scale: 1 = Very Dissatisfied, 2 = Dissatisfied, 3 = Neither Satisfied nor Dissatisfied, 4 = Satisfied, 5 = Very Satisfied)

                    2012   2014   2016
Safety & Security   3.53   3.69   3.70
Records Office      3.28   3.30   3.34

Student Rating of Aspects of RIC Campus
(Scale: 1 = Poor, 2 = Fair, 3 = Good, 4 = Excellent)

             2010   2012   2014   2016
Classrooms   2.08   2.45   2.60   2.62

SLIDE 27

USE OF ASSESSMENT RESULTS—INSTITUTIONAL ASSESSMENT, RETENTION, PERSISTENCE & GRADUATION

RIC Retention and Graduation Rates: Current and Goals

                           Current   Goal
One-Year Retention Rate     74.6%    80%
Six-Year Graduation Rate    46.2%    50%
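Rates like these are shares of a defined entering cohort. A minimal sketch of the calculation, using hypothetical student records (not RIC’s data or exact methodology):

```python
# Cohort-based rates: each is the percentage of an entering class
# satisfying some condition. All records below are hypothetical.

def rate(cohort, predicate):
    """Share of the cohort for which predicate holds, as a percentage."""
    return 100.0 * sum(1 for s in cohort if predicate(s)) / len(cohort)

# Each record: (enrolled_again_next_fall, graduated_within_6_years)
cohort = [(True, True), (True, False), (False, False), (True, True)]

one_year_retention = rate(cohort, lambda s: s[0])   # retained students / cohort
six_year_graduation = rate(cohort, lambda s: s[1])  # graduates / cohort
```

Disaggregating (by race/ethnicity, gender, residency, as in the IRP reports) is the same calculation applied to sub-cohorts.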

SLIDE 28

USE OF ASSESSMENT RESULTS—INSTITUTIONAL ASSESSMENT, RETENTION, PERSISTENCE & GRADUATION

  • 1. Annual IRP Reports
– Retention & Graduation Rate report disaggregated by various factors, including race/ethnicity, gender, and residency
– Shared with senior leadership and used to monitor progress, but has not directly led to interventions
  • 2. Survey of Non-Returning Students
– Piloted in Spring 2017 and designed to determine why students leave the college
– Provided insight on different kinds of non-returning students
– Key reasons for leaving were a mix of personal and institutional factors, such as family responsibilities, finances, and course availability
– Will be shared with the Enrollment Management team and used to support the need for student services and potential changes in course scheduling and other programming

SLIDE 29

USE OF ASSESSMENT RESULTS—INSTITUTIONAL ASSESSMENT, RETENTION, PERSISTENCE & GRADUATION

  • 1. Report on “Student Flow” by major and school to provide more granular data on retention, persistence, and graduation for interventions at the department and school level
  • 2. Greater use of point-in-time data by enrollment management to spur more immediate interventions that encourage students to re-enroll and stay on track to graduate
– Report on non-enrolling students eligible to re-enroll and with holds preventing re-enrollment
– Campaign to reach out to students to clear holds and encourage registration in the upcoming semester
– Monitor year-to-year progress through enrollment and persistence metrics
– Compile and share data on course subscriptions and waitlists to begin conversations about the relationship between course offerings and student demand
SLIDE 30

LESSONS LEARNED

  • Do not try to force standardization
  • Emphasize that the point of assessment is to spur improvement, not to punish low-performing programs or offices
  • Level of commitment is higher when program assessment belongs to faculty and they are encouraged to see that assessment is FOR them
  • Do not make the assessment process too complex
  • Need a mix of high-level metrics and disaggregated data points for assessment to be impactful
  • Administration must show that it takes results of assessment seriously and is willing to provide necessary resources (financial and otherwise)
  • Find ways to share successful use of assessment results with the broader campus community

SLIDE 31

Group Discussion

At your tables, what lessons can you take from this example? What questions do you still have?

SLIDE 32

NORWALK COMMUNITY COLLEGE

  • Open-admission, two-year public community college in southwestern Fairfield County, Connecticut
  • Draws from two mid-sized cities and eight suburban towns
  • Serves approximately 6,000 credit students, with annual enrollment of more than 13,000 including non-credit students
  • 59% of undergraduates are first-generation students, and 62% identify as non-White
  • Major change in statewide configuration currently being proposed: the 12 community colleges would merge into one college, with one accreditation
  • $2.3 million Title V grant received in 2016 to double graduation rates

SLIDE 33

USE OF ASSESSMENT RESULTS

  • 1. NEASC Graduation Rate Information Project (GRIP)
– Response to the Council of Regional Accrediting Commissions (C-RAC) initiative for colleges with an IPEDS-defined graduation rate below 15%
– Report delved into student success data to improve curriculum, pedagogy, and student services
– Related to standards on Educational Effectiveness (8.6-8.8)

SLIDE 34

NEASC Graduation Rate Information Project (GRIP)

  • Service Area
– Examined challenges including wide economic disparities, a rapidly growing immigrant population, and a highly educated population as context
– Used CHUMRA, Census data, and CCSSE to tell this story while allowing the information to inform programmatic and advising decisions moving forward
  • Five-year graduation rates
– Examining students at the five-year mark yields a 24.6% graduation rate, versus 9.3% under the standard IPEDS window
– Encouraged participation in the Voluntary Framework of Accountability as an extra measure
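The jump from 9.3% to 24.6% comes purely from widening the measurement window over the same entering cohort. A minimal sketch of that calculation, using hypothetical student records (the data and the three-year horizon shown here are illustrative assumptions, not NCC’s figures):

```python
# Graduation rate of one entering cohort at different time horizons.
# years_to_degree: years from entry to credential; None = no credential earned.
years_to_degree = [2, 3, 5, None, 4, None, 3, 5, None, 6]

def grad_rate(records, horizon_years):
    """Percent of the cohort earning a credential within horizon_years."""
    done = sum(1 for y in records if y is not None and y <= horizon_years)
    return 100.0 * done / len(records)

three_year = grad_rate(years_to_degree, 3)  # 150% of normal time for a 2-year program
five_year = grad_rate(years_to_degree, 5)   # wider window, same students
```

The same students counted as non-completers at the shorter horizon can appear as graduates at the longer one, which is the GRIP report’s point.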

SLIDE 35

NEASC GRADUATION RATE INFORMATION PROJECT (GRIP)

  • Student-Level Information
– Longitudinal datasets on developmental placement, GPA, PT/FT attendance, momentum, retention, and graduation were used to inform future strategies to improve graduation rates, including:
– Reforming remediation
– Creating a first-year experience program
– Leveraging scholarships to motivate students
– Student success coaching
SLIDE 36

USE OF ASSESSMENT RESULTS

  • 2. Title V Grant: The Student Success Collaborative: Transforming Student Pathways to Credentials and Beyond
  • Maintaining Momentum and Beginning in Developmental Education
– Remedial reforms, both through PA 12-40 and based on campus best practices
– College-level courses concurrent with a remedial workshop
– Transitional-strategy free courses
– SOAR
  • Signal Vine Texting for Scholarships and Enrollment Nudging
  • Start2Finish
  • Gateway Courses
– Supplemental Instruction
– Math Infusion into Freshman Seminar Courses
  • Learning Commons
– Virtual resources
– Collaborative workspaces
SLIDE 37

SIGNAL VINE- TEXT FOR SUCCESS

  • Castleman & Page (2014)

Task Completion Change (Citizen):

                                 Eng. Plcmt   Math Plcmt   FAFSA   Transcripts   Immunizations   Registration
Batch 1 Pilot (94 students)          5%           9%        40%        17%            34%             49%
Batch 1 Control (70 students)        6%           7%        30%         9%            14%             29%
Batch 2 Pilot (189 students)        18%          23%        21%        11%            35%             47%
Batch 2 Control (137 students)      18%          19%        23%         8%            22%             40%

SLIDE 38

LESSONS LEARNED

  • Control the pieces you can
  • Be open-minded that the questions being asked may not be the right questions and that the assessment process can be redesigned
  • Liberate the data!
  • Make sure results are being shared and publicized
  • As systems change, make sure databases reflect these changes
  • Write down data definitions
SLIDE 39

E-Series Forms: Making Assessment More Explicit
Option E1: Part a. Inventory of Educational Effectiveness Indicators

CATEGORY
(1) Where are the learning outcomes for this level/program published? (Please specify; include URLs where appropriate.)
(2) Other than GPA, what data/evidence is used to determine that graduates have achieved the stated outcomes for the degree? (e.g., capstone course, portfolio review, licensure examination)
(3) Who interprets the evidence? What is the process? (e.g., annually by the curriculum committee)
(4) What changes have been made as a result of using the data/evidence?
(5) Date of most recent program review (for general education and each degree program)

Rows of the form: At the institutional level; For general education, if an undergraduate institution; List each degree program: 1. 2. 3. 4. 5. 6.

Institutions selecting E1a should also include E1b.

Note: Please see the Statement on Student Achievement and Success Data Forms (available on the CIHE website: https://cihe.neasc.org) for more information about completing these forms.

When engaging with E-series forms, ask yourself: who has access to what sorts of data on learning, where was it collected, and where does it go? What are the data flows within your organization?
SLIDE 40

Alignment and Mapping

How do courses build towards mastery through repetition and increasing expectations for particular outcomes?

How do assignments and activities elicit student demonstrations of a specific learning outcome?

How do individual faculty contribute to this collective work in their courses?

SLIDE 41

Resources

SLIDE 42

Live Polling

How confident do you feel about your ability to communicate your assessment results effectively to a variety of audiences?

  • 1. Very confident
  • 2. Confident
  • 3. Not confident
SLIDE 43

Evidence-based Storytelling

Evidence of student learning is used to support claims or arguments about improvement and accountability, told through stories that persuade a specific audience. We need to tell our story and help students tell theirs.

SLIDE 44

Excellence in Assessment Designations

  • National recognition program for campus assessment leaders at two levels
  • Evaluation based on the National Institute for Learning Outcomes Assessment (NILOA) Transparency Framework
  • Focus on campus-wide assessment, including student affairs & external stakeholders
  • Joint project of the VSA, NILOA, and the Association of American Colleges & Universities (AAC&U)

SLIDE 45

Why did we create the EIA Designations?

Faculty & instructional staff
Student Affairs staff
Accreditors, governing boards
Employers, subsequent institutions
Students & Alumni

SLIDE 46

SLIDE 47

SLIDE 48

The Learning Systems Paradigm

SLIDE 49

Action Plan

Please take a moment to write down two next steps you can take when you return to your campus to advance conversations on robust results. With whom might you connect that you haven’t? What resources do you need to move forward?

SLIDE 50

Thank You!!

SLIDE 51

Causal Statements

The ability to make causal claims about our impact on students and their learning.

Institutional structures and support + student = enhanced learning

SLIDE 52

Difficulty of Causal Statements

  • Mobility of students
  • Untracked changes
  • Changes in courses add up to program-level change
  • Lack of clarity on what even counts as a program
  • Life
  • Levels at which use occurs
  • Longer-than-a-year cycles
  • Loosely coupled relationships

SLIDE 53

But…

Toulmin (2003): evidence supports a claim by way of a warrant; warrants are what turn evidence into arguments.

SLIDE 54

Theories of Change

Why do we think the changes we make will lead to better outcomes?

What is assumed in the changes we select as it relates to how students understand and navigate higher education?

SLIDE 55

For instance…

  • Coverage and content
  • Opportunities and support
  • Intentional, coherent, aligned pathways

Within each of these is a belief about root causes – why students were not learning or not meeting the outcome – and about the mechanism by which the institution can help them succeed.