  1. Lessons Learned in Developing and Implementing a Quality Rating System in New York State Medicaid Managed Care
     Lindsay Cogan, PhD, MS
     Director, Division of Quality Measurement
     Office of Quality and Patient Safety
     New York State Department of Health
     AcademyHealth Annual Research Meeting, State Health Policy Interest Group

  2. Overview
     • New York State
     • Quality Rating System
     • Key Decision Points
     • Challenges
     • Lessons Learned

  3. NYS Medicaid Managed Care
     • Public Health Law, Article 29-D, Section 2995: mandated HMO reporting
     • 1994-2000
       – CMS approved the 1115 waiver for New York; the Partnership Plan begins to roll out
       – Mandatory Medicaid managed care
       – First Quality Performance Report
     • 2001-2011
       – Uses of quality data expand
       – Consumer guides (pre-Quality Rating System)
       – Major growth in Medicaid managed care
     • 2012-now
       – Benefit package changes
       – Population changes
       – Marketplace enrollment
       – Managed Care Final Rule

  4. Managed Care Final Rule
     • CMS to establish a common framework for all states to use in implementing a Quality Rating System (QRS)
     • A public engagement process to develop a proposed QRS framework and methodology
     • States will have the flexibility to adopt an alternative QRS
     • States will have three years after the final guidance from CMS to begin rating managed care plans

  5. Quality Rating System
     Purpose:
     • Increase transparency
     • Allow plan comparison
     • Provide comparable information across different types of insurance

  6. Key Decision Points
     • QRS framework
     • Measure selection
     • Grouping measures
     • Scoring methodology
     • Reference group

  7. QRS Framework
     Goals:
     • Transparency
     • Plan Comparison
     • Performance Information
     Elements:
     • Measure Selection
     • Grouping Measures
     • Rating Methodology (Scoring, Reference Group)
     Source: Federal Register, Vol. 78, No. 223, https://www.gpo.gov/fdsys/pkg/FR-2013-11-19/pdf/2013-27649.pdf

  8. Measure Selection
     • Review existing health plan measures
     • Measure type (clinical quality, satisfaction)
     • Alignment across state and federal reporting requirements, QRS measure set
       – Important to consider special populations in Medicaid
     Selection criteria (from figure): National Quality Forum, Feasibility, Performance Gap, Importance, Reliability, Alignment

  9. Grouping Measures (1)
     Rating hierarchy (from figure): Global Rating → Summary Indicators → Domains → Composites → Measures
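
The slide shows this roll-up only as a diagram. As a minimal sketch, the same structure could be represented as nested mappings; every name below is a hypothetical placeholder, not one of NYS's actual groupings:

```python
# Hypothetical sketch of the grouping hierarchy from the figure:
# Global Rating -> Summary Indicators -> Domains -> Composites -> Measures.
# All keys are illustrative placeholders, not NYS's actual groupings.
qrs_hierarchy = {
    "Global Rating": {
        "Summary Indicator: Prevention": {
            "Domain: Child Health": {
                "Composite: Immunizations": [
                    "Childhood Immunization Status",   # measures sit at the leaves
                    "Immunizations for Adolescents",
                ],
            },
        },
    },
}
```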

  10. Grouping Measures (2)
      • Weighting: explicit vs. implicit
      • Measures shown (from figure): Well Child 15m/3-6, Breast Cancer Screening, Annual Dental Visit, Childhood Immunization, Cervical Cancer Screening, Pharyngitis, URI, Chlamydia Screening, Weight Assessment and Counseling, Immunizations for Adolescents, HPV Immunization
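
The explicit-vs.-implicit distinction lends itself to a small worked example. In the sketch below (hypothetical scores and weights, not NYS's actual choices), explicit weighting states each measure's weight up front, while a plain average still weights implicitly, giving every measure 1/N:

```python
# Hypothetical measure scores for one plan within a composite.
scores = {
    "Breast Cancer Screening": 0.82,
    "Cervical Cancer Screening": 0.75,
    "Chlamydia Screening": 0.61,
}

# Explicit weighting: weights are declared and sum to 1.
weights = {
    "Breast Cancer Screening": 0.5,
    "Cervical Cancer Screening": 0.3,
    "Chlamydia Screening": 0.2,
}
explicit = sum(scores[m] * weights[m] for m in scores)

# Implicit weighting: an "unweighted" average still assigns
# each measure an implicit weight of 1/N (here 1/3).
implicit = sum(scores.values()) / len(scores)

print(round(explicit, 3), round(implicit, 3))  # 0.757 0.727
```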

  11. Scoring Methodology (1)
      • Process used to convert measure data to a common numeric scale
        – Standardized score (z-score, ANOP, t statistic)
      • Aggregate scores at different levels
        – Average measure scores to a domain score
      • Convert scores to something meaningful and useful
        – The t statistic of a domain score is calculated and converted to a star rating by the percentile rank inferred from Student's t distribution
        – The overall rating is the average of the domain star ratings; its t statistic is calculated and converted to the 5-star scale by the percentile rank inferred from Student's t distribution
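
To make the convert-then-aggregate steps concrete, here is a minimal sketch of the z-score path with hypothetical rates (the slide lists z-score, ANOP, and t statistic as options; this shows only the first):

```python
import statistics

def z_scores(rates):
    """Standardize raw measure rates to a common scale (mean 0, SD 1)."""
    mean = statistics.mean(rates)
    sd = statistics.stdev(rates)
    return [(r - mean) / sd for r in rates]

# Hypothetical rates for one measure across five plans.
print(z_scores([0.62, 0.71, 0.58, 0.80, 0.66]))

# Aggregation step: a plan's domain score is the average of its
# standardized measure scores within that domain.
plan_measure_scores = [0.4, -0.2, 1.1]   # hypothetical z-scores
domain_score = statistics.mean(plan_measure_scores)
```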

  12. Scoring Methodology (2)
      • When the number of plans is less than 30, the t-statistic is an appropriate standardized score: it measures each plan's distance from the statewide average in units of standard error.
      • The percentiles of the t-distribution can measure a plan's true relative performance given the small number of plans being compared.

      Percentile of t-statistic      Rating
      0 <= score value < 10          1 star
      10 <= score value < 30         2 stars
      30 <= score value < 70         3 stars
      70 <= score value < 90         4 stars
      90 <= score value              5 stars
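
A minimal sketch of the percentile-to-star mapping, using the cutoffs in the table above. It assumes degrees of freedom of n_plans - 1, which the slide does not specify, and uses SciPy's Student's t CDF for the percentile rank:

```python
from scipy import stats

def stars_from_t(t_stat, n_plans):
    """Convert a plan's t-statistic to a 1-5 star rating via its
    percentile rank in Student's t distribution (df assumed n - 1)."""
    pct = stats.t.cdf(t_stat, df=n_plans - 1) * 100
    if pct < 10:
        return 1
    if pct < 30:
        return 2
    if pct < 70:
        return 3
    if pct < 90:
        return 4
    return 5

# A plan 1.2 standard errors above the statewide average, among 20 plans:
print(stars_from_t(1.2, 20))  # -> 4 stars
```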

  13. Reference Group
      • A population of reporting units defined by a geographic region and/or time period
      • Examples include:
        – All health plans nationally
        – All health plans statewide
        – Regional groupings of health plans

  14. Challenges
      • Technologically complex
      • Reflective of stakeholders
        – Consumers
        – Managed care plans
        – Advocates
      • Useful to its audiences
      • Easily understood by everyone (i.e., consumers, managed care plans)
      • Summarized, yet still allowing drill-down into specific measurement areas

  15. Lessons Learned
      • Creating a state-specific QRS lets state priorities and needs drive design decisions.
      • States should understand how each QRS design decision may affect their health plans' ratings.

  16. Questions
      Thank you to my co-authors Anne Schettine and Patrick Roohan.
      Contact information: Lindsay.Cogan@health.ny.gov
