Madison Metropolitan School District: Debunking Urban Education Myths


  1. Madison Metropolitan School District: Debunking Urban Education Myths
     Bo McCready, Quantitative Research Supervisor
     Beth Vaade, Qualitative Research Supervisor
     Madison Metropolitan School District, Madison, WI

  2. Introduction
     Five Myths:
     1. Student demographics have been constant over time
     2. Teachers and school-based staff cannot work with large and complex datasets
     3. MMSD schools are in fine condition…or are falling apart
     4. State accountability systems account for differences in student populations
     5. Test score distribution patterns vary across demographic groups

  3. About 27,500 students
     43% white, 21% Hispanic/Latino, 18% Black or African-American, 9% multiracial, 9% Asian
     49% low-income, 27% English Language Learners, 14% Students with Disabilities
     50 schools and approximately 5,000 staff
     The City of Madison is home to the University of Wisconsin, state government, and large insurance and healthcare industries
     Large disparities between demographic groups across measures (standardized tests, graduation, attendance, behavior)

  4. Not our primary tool for on-demand data (an alternate tool exists for staff that aligns with the state DPI’s primary tool)
     Used by the Research & Program Evaluation Office for specific cases, including:
     Making specific arguments visually (today’s focus)
     Data disaggregation for the public
     Producing visuals for reports and presentations

  5. What was our myth? Madison schools serve the same students they have always served.
     Why did it matter?
     Our veteran staff speak to new challenges: teachers across the district talk anecdotally about the changes they have observed, and old instructional approaches may not work as well as populations change.
     Key strategic planning efforts are underway: there have been significant future planning activities since the leadership change in 2013, and understanding where we have been enhances understanding of where we are going.

  6. This visualization helped us see how much (or how little) our schools changed across 25 years. We showed that MMSD has changed quite a bit over time, with students of color and low-income students at least doubling.
     [Graphs show percent students of color (vertical) against percent low-income (horizontal), plus total enrollment color-coded by race/ethnicity.]
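To make the encoding concrete, here is a minimal sketch of the slide-6 view in Python with matplotlib. This is not the deck's actual workbook: the school names, years, and percentages are invented purely to illustrate plotting each school's percent-students-of-color (vertical) versus percent-low-income (horizontal) trajectory over time.

```python
# Minimal sketch of the slide-6 idea: plot each school as a point at
# (% low-income, % students of color) and trace its position across years.
# School names, years, and percentages are invented, not MMSD data.
import matplotlib.pyplot as plt

schools = {
    "School A": [(20, 25), (35, 40), (55, 60)],  # (pct_low_income, pct_students_of_color)
    "School B": [(10, 15), (12, 18), (15, 20)],
}
years = [1990, 2003, 2015]

fig, ax = plt.subplots()
for name, points in schools.items():
    xs, ys = zip(*points)
    ax.plot(xs, ys, marker="o", label=name)  # trajectory across the 25 years
    for (x, y), yr in zip(points, years):
        ax.annotate(str(yr), (x, y), fontsize=8)
ax.set_xlabel("% low-income (horizontal)")
ax.set_ylabel("% students of color (vertical)")
ax.set_title("How each school's population shifted over time (illustrative)")
ax.legend()
plt.show()
```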

  7. We showed that some of our schools, like Franklin Elementary, changed little. We also showed that other schools, like Wright Middle, are very different.

  8. What happened next?
     This knowledge helped us reflect on staffing, instruction, support services, and our family engagement efforts.
     It also provided quick, on-demand context for new administrative hires unfamiliar with the landscape and development of MMSD.
     We also used this information as context for strategic planning efforts, including Vision 2030 and the Long-Range Facilities Plan.

  9. What was our myth? Teachers and school-based staff cannot work with large and complex datasets.
     Why did it matter?
     We administer a Climate Survey: about 21,000 total respondents, with dozens of questions on each survey. It is our chief way of measuring success on Goal #3 of our Strategic Framework.
     We had one hour with schools to help them process their results: district leadership institutes include a data review where schools use data to inform their planning, so we needed something simple, accessible, and robust.

  10. These scorecards became the basis for our climate survey data review.
      [Scorecard shows positive responses by dimension and response distributions by question.]

  11. Scorecards were:
      printable (8.5" x 11")
      filterable by school, race, ELL status, gender, income, disability, Advanced Learner status, and level
      accessible to the public (mmsd.org/climate)
      In that hour and beyond, we saw teachers and school leaders working with the data, digging into it at a level they had not done previously.
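As a rough illustration of the scorecard logic, the sketch below computes percent-positive responses by dimension with an optional demographic filter, using pandas. The column names and response coding (1-5, with 4-5 counted as positive) are assumptions for illustration, not the actual MMSD survey schema.

```python
# Hypothetical sketch of the scorecard aggregation: percent-positive
# responses by climate-survey dimension, filterable by a demographic column.
# Columns (dimension, group, response) and the 1-5 coding are assumptions.
from typing import Optional

import pandas as pd

responses = pd.DataFrame({
    "dimension": ["Belonging", "Belonging", "Safety", "Safety"],
    "group":     ["ELL", "Not ELL", "ELL", "Not ELL"],
    "response":  [5, 3, 4, 2],
})

def percent_positive(df: pd.DataFrame, group_filter: Optional[str] = None) -> pd.Series:
    """Share of responses rated 4 or 5, by dimension, optionally filtered."""
    if group_filter is not None:
        df = df[df["group"] == group_filter]
    return df.assign(positive=df["response"] >= 4).groupby("dimension")["positive"].mean()

print(percent_positive(responses))         # districtwide view
print(percent_positive(responses, "ELL"))  # filtered, as the scorecards allowed
```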

  12. What happened next?
      Climate survey data is now an integral part of our accountability systems districtwide: dense data, but accessible to a wide audience, and it feels actionable.
      This data is used to:
      track progress on our Strategic Framework, School Improvement Plans, and Central Office Measures of Performance
      monitor school success on key projects
      inform professional learning and leadership development activities

  13. What was our myth? Schools in MMSD don’t need any upgrades, or schools in MMSD need massive updates.
      Why did it matter?
      Many schools expressed problems with their facilities: 14 of 50 schools were built in 1940 or earlier, while others were relatively new or had recent additions. Several high-demand schools consistently operated above 100% of capacity. Even new buildings and buildings with recent renovations still had needs.
      We went to referendum for critical updates in April 2015: we were behind on major facilities investments, including accessibility and secure entrances, and we needed multiple compelling ways to communicate our needs.

  14. We created a cell for each school (actually a Dashboard) indicating key details about the school that would inform its relative need for construction projects. Each component is a distinct table arranged into a Dashboard.

  15. The end product was our “Periodic Table of Building Condition,” organized by attendance area and level, which provided a quick visual overview of building condition districtwide. We showed that many schools were in excellent condition, but others faced critical needs.
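The "periodic table" layout itself is simple to prototype. Below is a hypothetical matplotlib sketch: one colored cell per school, rows as attendance areas, columns as levels, and color mapped to an invented 0-1 condition score. The deck does not show the real dashboard's fields or scoring, so every value here is illustrative.

```python
# Rough sketch of a "periodic table" of building condition: one colored
# cell per school, arranged by attendance area (row) and level (column).
# Schools and condition scores are invented examples.
import matplotlib.pyplot as plt
import matplotlib.patches as patches

# (row = attendance area, col = level, label, condition 0=critical .. 1=excellent)
cells = [
    (0, 0, "Elem A", 0.9), (0, 1, "Middle A", 0.4), (0, 2, "High A", 0.7),
    (1, 0, "Elem B", 0.2), (1, 1, "Middle B", 0.8), (1, 2, "High B", 0.5),
]

fig, ax = plt.subplots()
for row, col, label, score in cells:
    # Color each cell on a red-to-green scale by condition score.
    ax.add_patch(patches.Rectangle((col, -row), 0.95, 0.95, color=plt.cm.RdYlGn(score)))
    ax.text(col + 0.47, -row + 0.47, label, ha="center", va="center", fontsize=8)
ax.set_xlim(-0.1, 3.1)
ax.set_ylim(-1.2, 1.1)
ax.axis("off")
ax.set_title("Building condition by attendance area and level (illustrative)")
plt.show()
```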

  16. What happened next?
      Our referendum passed in April 2015 with 82% support.
      Our work in 2016-17 includes support of a long-range facilities plan for the district, and this tool will be updated to support planning and communication around that work.

  17. What was our myth? Accountability ratings from the state’s DPI accounted for demographic differences between schools, so MMSD’s scores were directly comparable to other districts’.
      Why did it matter?
      Wisconsin rates all schools and districts using a report card: schools and districts receive grades and ratings out of 100, and the large majority of the rating is based on standardized test scores.
      MMSD’s ratings were not as favorable as surrounding districts’: suburban districts had higher ratings, and our schools ranged from among the highest-rated to among the lowest-rated in the state.

  18. Filters on enrollment, locale type (e.g., suburban, rural), and CESA (geographic organizing area) allowed different ways to put scores into perspective. This visualization helped us see clear linear trends between income/language/disability and overall scores. We showed that the state’s weighting scheme was not accounting for demographic differences as intended.
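The linear-trend reading of slide 18 can be reproduced in a few lines. The sketch below fits and plots a line through fabricated district points (report-card score versus percent low-income); the actual DPI report-card data is not included here.

```python
# Minimal sketch of the slide-18 analysis: district report-card score
# versus percent low-income students, with a fitted line showing the
# linear trend. All data points are fabricated for illustration.
import numpy as np
import matplotlib.pyplot as plt

pct_low_income = np.array([5, 15, 25, 40, 55, 70, 85])
score = np.array([92, 88, 80, 74, 68, 62, 55])  # report-card score out of 100

# Ordinary least-squares line through the points.
slope, intercept = np.polyfit(pct_low_income, score, 1)
xs = np.linspace(0, 100, 50)

plt.scatter(pct_low_income, score)
plt.plot(xs, slope * xs + intercept, linestyle="--",
         label=f"fit: {slope:.2f} points per % low-income")
plt.xlabel("% low-income students")
plt.ylabel("Report card score (0-100)")
plt.legend()
plt.show()
```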

  19. No district with more than 20% low-income students scored “Significantly Exceeds Expectations,” and no district with less than 50% low-income students scored “Meets Few Expectations.”

  20. What happened next?
      These observations helped our Board and community interpret our scores in context.
      New accountability report cards issued by Wisconsin’s DPI in the fall of 2016 include new weighting schemes based on low-income student ratios. Will these trends change?

  21. What was our myth? Test score distributions in MMSD are irregular, and the MAP assessment has a ceiling and can’t measure advanced learners effectively.
      Why did it matter?
      These perceptions were held by influential figures in the district: Board of Education members believed the distributions were irregular, and community advocates believed the assessment had a ceiling.
      We use MAP to report key milestones of district progress: we set goals to increase proficiency rates but don’t always know how much change is realistic, and distributions provide a different perspective on gaps.

  22. We created score distributions across grades, filterable by year, subject, semester, and demographic group.
      Red = minimal, yellow = basic, green = proficient, blue = advanced

  23. Grade 3 Reading Scores: this helped us see that distributions were roughly normal across demographic groups (Overall, African-American, Low-income, White); the difference was where the normal distribution was centered.

  24. What happened next?
      We learned that increasing proficiency doesn’t happen by focusing on students near cutoff points.
      We also learned that differences in simple averages and proficiency rates across groups actually reflect differences across full distributions.
      We can set realistic goals by observing where students lie along the achievement spectrum.
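A small simulation makes the distribution argument of slides 22-24 concrete: if each group's scores are roughly normal with a shared spread and only the center differs, then the proficiency rate is simply the share of the full distribution above the cutoff, so shifting the whole distribution matters more than nudging students near the cut point. The means, spread, and cutoff below are invented, not actual MAP parameters.

```python
# Sketch of the slide-22/23 point: same-shaped normal distributions with
# different centers, and the proficiency rate as the share above a cutoff.
# Group means, SD, and cutoff are invented, not actual MAP values.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

groups = {"Overall": 200, "Group A": 190, "Group B": 210}  # hypothetical means
sd, cutoff = 15, 205                                       # shared spread, proficiency cut

xs = np.linspace(150, 260, 400)
for name, mean in groups.items():
    pct_proficient = 1 - norm.cdf(cutoff, mean, sd)  # area above the cutoff
    plt.plot(xs, norm.pdf(xs, mean, sd),
             label=f"{name} ({pct_proficient:.0%} proficient)")
plt.axvline(cutoff, linestyle="--", color="gray")
plt.xlabel("Scale score")
plt.legend()
plt.show()
```

Note how a modest shift in the center (190 versus 210 here) produces a large change in the share above the cutoff, which is the sense in which average and proficiency-rate gaps reflect differences across full distributions.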

  25. Visually appealing: brings in a different kind of learner than numbers alone, and looks good in the process
      Intuitive: takes no data training to consume and explore, even with new types of visuals, assuming we do our job right
      Empowers the user: allows people to own the experience, exploring and internalizing at their own pace and asking their own questions
      Fights the “data is boring” narrative: debunks our own myth that data has to be dry and tough to read

  26. Expanding the variety of resources available on our website
      Moving into new data sources
      Iterative printing and animations
      Project management and planning support

  27. Please complete the session survey from the Session Details screen in your TC16 app

  28. bmccready@madison.k12.wi.us envaade@madison.k12.wi.us
