How Computer Skills Testing Improved Our Student Success Rates



SLIDE 1

Gateway Technical College: How Computer Skills Testing Improved Our Student Success Rates

Presenters: Raymond Koukari, Jr., Gateway Technical College Jodi Noll, Labyrinth Learning

SLIDE 2

AGENDA

  • Our Problem/Background – Driving Issue
  • Initial Steps
  • Refining the Solution
  • Testing
  • Creation of multiple courses
  • Implementation
  • Process
  • Challenges
  • Outcomes
  • Recommendations
SLIDE 3

Driving Issue – Core Ability

“All students will demonstrate essential computer skills by graduation.”

Accreditation: the HLC and ACBSP processes showed that they want proof. Are you doing what you say?

SLIDE 4

Driving Issue – High Course Failure Rate

21.5% of students were not completing our PC Basics course

Why place students into a course they aren't ready for, where the odds are against their succeeding?

– Wendy Revolinski
SLIDE 5

What did we do?

Instructor Meetings:

Why the high failure rate?

SLIDE 6

Conclusion

Unprepared/unskilled students

Needs Identified:

  • New, developmental course
  • Method to determine student placement

Wide range of skills left some students unprepared, confused, or bored.

SLIDE 7

What did we do?

Determined needs for a tool to test students for course placement

√ Agile
√ Office Suite
√ Gmail
√ Versions
√ Mirror campus environment

SLIDE 8

What did we do?

Consulted the experts

√ What should be tested
√ Assessment design
√ Technical solution
√ Testing process

SLIDE 9

Test Design

Started with recommendations from our vendor partner (Labyrinth). Three rounds of feedback on the assessment design:

  • Internal review
  • Student testing: on-campus beta
    – Summer course
    – 9 sections
    – All students
  • Administrative assistants

Custom URL that has all the details for them to check out later.

What should we test?

SLIDE 10

Test Timing & Location

When and Where to Take the Test?

  • Location selected: Testing Center
  • Coordinated with Compass testing
  • Following the Compass test, students click a link and are taken to the digital literacy test
  • Started testing April 2013

Concern: Extended testing might be too long.
Results: Most students tested in just 20 minutes. A few tested up to an hour.

SLIDE 11

Solution Details

  • All students in technical and associate degree programs tested
  • 2 new courses:
    – Computers for Professionals (3 CR)
    – Basic Computing (1 CR, NEW)
  • 3 placement options:

    Score      Placement
    0–49       Basic Computing
    50–89      Computers for Professionals
    90–100     Credit for Prior Learning
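The placement bands above amount to a simple score-to-course mapping, sketched below as a minimal illustration of the rule only (the function name is ours, not part of Gateway's actual registration system):

```python
def placement(score: int) -> str:
    """Map a digital literacy test score (0-100) to a placement.

    Bands follow the slide: 0-49 -> Basic Computing,
    50-89 -> Computers for Professionals, 90-100 -> Credit for Prior Learning.
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score <= 49:
        return "Basic Computing"              # 1 CR developmental course
    if score <= 89:
        return "Computers for Professionals"  # 3 CR course
    return "Credit for Prior Learning"        # test-out credit
```

When wiring a rule like this into a registration system, the boundary scores (49/50 and 89/90) are the cases worth checking explicitly.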

SLIDE 12

Gaining Institutional Support

  • Issue has impact across all divisions
  • Failure in early classes = higher dropout rates
  • Broader problem involving all students
  • GOAL: Help students be more successful, not determine admission

PROVOST SUPPORT RECEIVED

Stakeholders: Faculty, Librarians, Administrators, Admissions Counselors, Assessment Center, Students, Provost

SLIDE 13

Determining Funding

Dean of Enrollment Services

  • Responsible for all testing centers

Plan A: Increase in fee to cover cost

Compass testing – reading, writing, math:

  • $20 fee
  • $5 wiggle room between actual cost and fee
  • Computer assessment = $3 (fits within the $5 margin)

Plan B: No increase in fee for students

SLIDE 14

Our Data

  • First Trials – Fall 2013 Semester

    Course                                 # Students   # Sect
    103-142  Basic Computing                    30        1.67
    103-143  Computing for Professionals       134        7.44
    CPFL     Test-out Credit                    13
    Total # Scores                             177

SLIDE 15

Our Data

  • Fall & Spring Semesters

    Fall 2013 Semester

    Course                                 # Students   # Sect   % Overall
    103-142  Basic Computing                   206       11.44      15%
    103-143  Computing for Professionals      1044       58.00      78%
    CFPL     Test-out Credit                    85                   6%
    Total # Scores                            1335

    Spring 2014 Semester

    Course                                 # Students   # Sect   % Overall
    103-142  Basic Computing                   271       15.06      15%
    103-143  Computing for Professionals      1420       78.89      78%
    CFPL     Test-out Credit                   119                   7%
    Total # Scores                            1810

SLIDE 16

Critical Considerations/Recommendations

  • Early involvement of all divisions
  • Share vision & implementation plan
  • Include others early in the planning phase
  • Pass along information to all
  • Be clear on goals and plan
  • Engage a wide audience to gather support
  • Challenge – Business & IT get it, but not the whole college
  • Try to get support from the broader audience
  • Gain commitment to change
    – Faculty
    – Senior management
    – Provost has to be on board
  • Data is key to gaining commitment

SLIDE 17

Critical Considerations / Recommendations

  • Get a sponsor in the test area
  • Seek IT support
  • Click another link, and go into the computer assessment
  • Put another “test field” in the system
  • Use an API to automate the integration process (15 min)
  • Conduct a pilot
  • Test the process
  • Test student results
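As a rough illustration of that automation step, mapping a raw assessment result onto the SIS “test field” update might look like the sketch below. This is hypothetical only: the slides do not document the real API, and the field and course-code names here are illustrative, not Labyrinth's or Gateway's actual interface.

```python
# Hypothetical sketch: translate an assessment-platform result into the
# SIS "test field" payload described on the slide. Field names are ours.

def to_sis_update(result: dict) -> dict:
    """Turn an assessment result record into an SIS test-field payload."""
    score = int(result["score"])
    return {
        "student_id": result["student_id"],
        "test_field": "DIGITAL_LITERACY",  # the extra "test field" added to the system
        "score": score,
        # Placement per the 0-49 / 50-89 / 90-100 bands shown earlier
        "placement": ("103-142" if score <= 49
                      else "103-143" if score <= 89
                      else "CPFL"),
    }
```

The point of the sketch is the design choice on the slide: once the score lands in a dedicated test field, placement follows mechanically, which is what made a 15-minute automated integration plausible.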
SLIDE 18

Sound Bites from Instructors

Students feel more comfortable coming into Computers for Professionals. As an instructor, providing that basic class has made my starting days so much easier in Computers for Professionals. Most of the students come in with the basics, and I can start right in teaching what is supposed to be taught in that class.

– C. Wooster

It allows me to give extra attention to the students who are just beginning and I no longer feel like I'm neglecting the students who need more help in getting started.

– A. Perkins

I think the computer skills assessment has greatly helped Computers for Professionals. I wish it would have been implemented years ago. It was very difficult to teach this course when you have people who have no computer skills at all – yes, they are out there.

– W. Revolinski

SLIDE 19

Testing Platform – Labyrinth Learning eLab SET

  • Over 2,000 questions covering the Internet, email, computer concepts, Microsoft Office
  • Realistic simulation environment mimics the real world
  • Select questions and create your own
  • Flexible test implementation and student registration
  • Track and analyze scores by student or group
  • Affordable license keys
  • Unlimited educator access
  • Responsive customer support
SLIDE 20

Testing Platform - Labyrinth Learning

DEMO

SLIDE 21

Outcomes

Direct Results

  • 90% of instructors said the students were better prepared
  • Measured key “core ability” statement
  • Reduced course failure rates
  • Achieved higher retention rate

Student Feedback

  • Most students were not surprised by results
  • They had a sense of relief that they would be placed in a course that was right for them

SLIDE 22

Outcomes – Prove it!

  • Source of Actionable Data
  • Accrediting agencies
  • Placement
  • Course improvements
SLIDE 23

Justification – Interface Grant

SLIDE 24

Ongoing Process

  • Future
  • 18-month post-test (11/14)
  • Comparison of students pre-test and post-test
  • Refine assessment with survey feedback