Student Assessment and Placement Systems Using Multiple Measures (Elisabeth Barnett, CCRC)


  1. Student Assessment and Placement Systems Using Multiple Measures. Elisabeth Barnett, CCRC. SUNY CAO Meeting, October 2018

  2. Agenda
     • Why use multiple measures for placement
     • Selection of a multiple measures system
     • Results of the SUNY research
     • Discussion

  3. Students needing 1+ developmental education course (NCES, 2013). [Bar chart: 68% at community colleges; 40% at open-access 4-year colleges.]

  4. Community college 8-year graduation rates (Attewell, Lavin, Domina, & Levey, 2006). [Bar chart: 28% for students needing remediation; 43% for students not needing remediation.]

  5. Under-placement and Over-placement. Placement according to exam vs. actual student ability:
     • Over-placed (ability at developmental level, placed into college-level): English 5%; Math 6%
     • Under-placed (ability at college level, placed into developmental): English 29%; Math 18%
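Rates like these are tallied by comparing each student's exam-based placement with the level their measured ability supports. A minimal Python sketch, with hypothetical data and column names (not from the presentation):

```python
import pandas as pd

# Hypothetical records: exam-based placement vs. ability-based "true" level.
students = pd.DataFrame({
    "exam_placement": ["dev", "college", "dev", "dev"],
    "true_level":     ["college", "college", "dev", "college"],
})

# Under-placed: ability supports college level, but exam placed them lower.
under = (students.exam_placement == "dev") & (students.true_level == "college")
# Over-placed: ability is at developmental level, but exam placed them higher.
over = (students.exam_placement == "college") & (students.true_level == "dev")

print(f"Under-placed: {under.mean():.0%}")
print(f"Over-placed:  {over.mean():.0%}")
```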

  6. [Bar charts: College 2, English and Math. R-squared for four models (GPA only, Test only, GPA and test, Full model); values shown range from 1.0% to 14.5%.]

  7. Model R-Squared Statistics: English (graphical representation). [Bar chart: R-squared for GPA, ACCUPLACER, GPA + ACCUPLACER, and Full Model across Colleges 1–7.] Slides available at: bit.ly/capr_ashe16

  8. Model R-Squared Statistics: Math (graphical representation). [Bar chart: R-squared for GPA, ACCUPLACER, GPA + ACCUPLACER, and Full Model across Colleges 1–7.] Slides available at: bit.ly/capr_ashe16
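The comparison behind these charts amounts to fitting the same outcome on nested predictor sets and comparing R-squared. A sketch under simulated data and hypothetical column names, illustrating the method rather than the CAPR results:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "hs_gpa": rng.uniform(1.5, 4.0, n),
    "accuplacer": rng.uniform(20, 120, n),
    "age": rng.integers(17, 40, n),
})
# Simulated course outcome, loosely tied to GPA (illustration only).
df["course_grade"] = 0.8 * df.hs_gpa + rng.normal(0, 1, n)

models = {
    "GPA only": ["hs_gpa"],
    "Test only": ["accuplacer"],
    "GPA and test": ["hs_gpa", "accuplacer"],
    "Full model": ["hs_gpa", "accuplacer", "age"],
}
for name, cols in models.items():
    fit = LinearRegression().fit(df[cols], df.course_grade)
    print(f"{name:13s} R^2 = {fit.score(df[cols], df.course_grade):.3f}")
```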

  9. Conclusions so far
     • Students placed into developmental education are less likely to complete.
     • Better assessment systems are needed.
     • HS GPA is the best predictor of success in college math and English.

  10. Multiple Measures Assessment

  11. Why Use Multiple Measures
      • Existing placement tests are not good predictors of success in college courses.
      • More information improves most predictions.
      • Different measures may be needed to best place specific student groups.

  12. Multiple Measures Options
      MEASURES (administered by college):
      1. Traditional or alternative placement tests
      2. Non-cognitive assessments
      3. Computer skills or career inventory
      4. Writing assessments
      5. Questionnaire items
      MEASURES (obtained from elsewhere):
      1. High school GPA
      2. Other HS transcript information (courses taken, course grades)
      3. Standardized test results (e.g., ACT, SAT, Smarter Balanced)
      SYSTEMS OR APPROACHES: waiver system; decision bands; placement formula (algorithm); decision rules; directed self-placement
      PLACEMENTS: placement into traditional courses; placement into alternative coursework; placement into support services

  13. Possible Measures. Which would you want to use? Why or why not?
      • Placement test: Accuplacer; ALEKS
      • High school GPA, course grades, test scores: self-report; from transcript
      • Non-cognitive assessments: GRIT Questionnaire; SuccessNavigator or Engage
      • Career inventory, computer skills: Kuder Career Assessment; home-grown computer skills test
      • Writing examples: faculty-assessed portfolio; home-grown writing assessment

  14. Sources of HS transcript data
      • The student brings a transcript.
      • The high school sends it.
      • Obtained from state data files.
      • Self-report.
      Self-report research:
      • UC admissions uses self-report but verifies after admission. In 2008, across 9 campuses and 60,000 students, no campus had more than 5 discrepancies between reported grades and student transcripts (Hetts, 2016).
      • College Board (Shaw & Mattern, 2009): "Students are quite accurate in reporting their HSGPA," r = .73.
      • ACT research often uses self-reported GPA and generally finds it highly correlated with students' actual GPA (ACT, 2013: r = .84).
      Note: Consider using the 11th grade GPA.

  15. Non-cognitive assessments
      Development of non-cognitive skills promotes students' ability to think cogently about information, manage their time, get along with peers and instructors, persist through difficulties, and navigate the landscape of college… (Conley, 2010).
      Non-cognitive assessments may be of particular value for:
      • Nontraditional (older) students
      • Students without a high school record
      • Students close to the cut-off on a test

  16. NC 1: SuccessNavigator
      • Domains: academic discipline, commitment, self-management, social support
      • Academic Success Index includes: projected 1st-year GPA; probability of returning next semester
      • Also, Course Acceleration Indicator: recommendation for math or English acceleration
      NC 2: Engage
      • Domains: motivation and skills, social engagement, self-regulation
      • Advisor report also has: Academic Success Index; Retention Index
      • Correlation with GPA and retention, especially the Motivation scale

  17. NC 3: Grit Scale
      • Domains: grit and self-control
      • Provides a score of 1–5 on level of grit, with 5 as the maximum (extremely gritty) and 1 as the lowest (not at all gritty)
      • Correlation with GPA and conscientiousness
      NC 4: Learning and Study Strategies Inventory (LASSI)
      • Domains: anxiety, attitude, concentration, information processing, motivation, selecting main ideas, self-testing, test strategies, time management, using academic resources
      • Correlation with GPA and retention

  18. Concerns about the HS GPA (with thanks to John Hetts, 2016)
      • Our test is different/better/more awesome.
      • Students really need developmental education.
      • High school GPA is only predictive for recent graduates.
      • Different high schools grade differently.

  19. Our test is different/better/more awesome. [Bar charts: NC English and NC Math.] From Bostian (2016), North Carolina Waves GPA Wand, Students Magically College Ready; adapted from research of Belfield & Crosta (2012); see also Table 1.

  20. Students would be better off going through developmental education. Developmental education student outcomes (results from 8 studies, CCRC analysis, 2015). [Bar chart: counts of positive, null, and negative findings for higher-level and lower-level students.]

  21. HS GPA remains a better predictor than test results long after high school (from Hetts, 2016). [Chart: MMAP (in preparation), correlations between each predictor and success (C or better) in transfer-level courses, by number of semesters since high school.]

  22. For the most part, college grades stay parallel with feeder high school grades. (Bostian, 2016)

  23. Ways to Combine Measures
      • Algorithms: placement determined by predictive model
      • Decision rules: new exemptions, cutoffs
      • Decision bands: "bumping up" those in a test score range
      • Directed self-placement: provide students with information; let them decide where they fit

  24. Algorithm Example. [Flowchart:] Student applies → exemptions? If yes: college-level placement. If no: HS record, Accuplacer, and non-cognitive data are fed into the algorithm, yielding a probability of success. High probability → college-level placement; low probability → remedial-level placement.
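As an illustration of this algorithm approach, the sketch below fits a model on prior-cohort data and places students by predicted probability of success. The features, simulated data, and 0.5 cutoff are all hypothetical, not the study's actual formula:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Prior cohorts: columns stand in for [hs_gpa, accuplacer, non_cog] (standardized).
X_prev = rng.normal(size=(400, 3))
# Outcome: 1 = passed the college-level course (simulated).
y_prev = (X_prev[:, 0] + rng.normal(size=400)) > 0

model = LogisticRegression().fit(X_prev, y_prev)

def place(student_features, cutoff=0.5):
    """Place a student based on the predicted probability of success."""
    p = model.predict_proba([student_features])[0, 1]
    return "college-level" if p >= cutoff else "remedial"

print(place([0.8, 1.1, -0.2]))
```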

  25. Decision-Rule Example. [Flowchart:] Student applies → exemptions? If yes: college-level placement. If no: performance on HS record and/or Accuplacer and/or non-cognitive tests? High → college-level placement; low → remedial-level placement.
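A decision-rule system can be expressed as a short ordered list of exemptions and cutoffs. The thresholds below are hypothetical, for illustration only:

```python
def place_by_rules(exempt, hs_gpa, accuplacer):
    """Apply exemptions first, then GPA and test cutoffs in order."""
    if exempt:                 # e.g., qualifying SAT/ACT score or prior credit
        return "college-level"
    if hs_gpa >= 3.0:          # GPA rule can override the placement test
        return "college-level"
    if accuplacer >= 80:       # test-score cutoff
        return "college-level"
    return "remedial"

print(place_by_rules(exempt=False, hs_gpa=3.2, accuplacer=60))  # college-level
```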

  26. Decision-Band Example. [Flowchart:] Student applies → exemptions? If yes: college-level placement. If no: performance on HS record, Accuplacer, and/or non-cognitive tests relative to a decision band. Above the band → college-level placement; below the band → remedial-level placement; within the band → other measures decide.
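A decision band adds a middle zone where a second measure decides. In the hypothetical sketch below, test scores inside the band are "bumped up" when HS GPA supports it; the band edges and GPA threshold are illustrative:

```python
BAND_LOW, BAND_HIGH = 70, 85  # hypothetical decision-band edges

def place_by_band(accuplacer, hs_gpa):
    if accuplacer >= BAND_HIGH:
        return "college-level"     # clearly above the band
    if accuplacer < BAND_LOW:
        return "remedial"          # clearly below the band
    # Inside the band: a second measure makes the call.
    return "college-level" if hs_gpa >= 2.8 else "remedial"

print(place_by_band(accuplacer=75, hs_gpa=3.0))  # bumped up to college-level
```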

  27. The CAPR Assessment Study

  28. Organization of CAPR (MDRC and CCRC)
      • Descriptive study of developmental education
      • Evaluation of the New Mathways Project (RCT in TX)
      • Evaluation of new assessment practices (RCT in NY)
      • Supplemental studies

  29. Research on Alternative Placement Systems (RAPS)
      • 5-year project; 7 SUNY community colleges
      • Evaluation of the use of predictive analytics in student placement decisions
      • Random assignment / implementation / cost study
      • Current status: beginning to look at impact

  30. Research Questions (Summary)
      1. Do student outcomes improve when students are placed using predictive analytics?
      2. How does each college adopt/adapt and implement such a system?

  31. SUNY Partner Sites
      A – CAPR/CCRC/MDRC
      B – Cayuga CC
      C – Jefferson CC
      D – Niagara County CC
      E – Onondaga CC
      F – Rockland CC
      G – Schenectady County CC
      H – Westchester CC
      Slides available at: bit.ly/capr_ashe16

  32. How Does the Predictive Analytics Placement Work? Use data from previous cohorts → develop a formula to predict student performance → set cut scores → use the formula to place the entering cohort of students.
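Those four steps can be sketched end to end. Everything below (simulated data, model choice, the 0.6 cut score) is hypothetical; it illustrates the workflow, not the actual RAPS formulas or cut scores:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# 1. Use data from previous cohorts: predictors and pass/fail outcomes.
X_prev = rng.normal(size=(1000, 2))                    # [hs_gpa_z, test_z]
y_prev = (X_prev @ [1.0, 0.4] + rng.normal(size=1000)) > 0

# 2. Develop the formula to predict student performance.
formula = LogisticRegression().fit(X_prev, y_prev)

# 3. Set a cut score on the predicted probability (chosen for illustration).
cut = 0.6

# 4. Use the formula to place the entering cohort.
X_new = rng.normal(size=(5, 2))
probs = formula.predict_proba(X_new)[:, 1]
placements = np.where(probs >= cut, "college-level", "remedial")
print(list(zip(probs.round(2), placements)))
```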

  33. Early Findings, Fall 2017
