SLIDE 1

Lessons Learned in Developing and Implementing a Quality Rating System in New York State Medicaid Managed Care

Lindsay Cogan, PhD, MS
Director, Division of Quality Measurement
Office of Quality and Patient Safety
New York State Department of Health

AcademyHealth Annual Research Meeting, State Health Policy Interest Group

SLIDE 2

Overview

  • New York State
  • Quality Rating System
  • Key Decision Points
  • Challenges
  • Lessons Learned
SLIDE 3

NYS Medicaid Managed Care

  • Public Health Law, Article 29-D, Section 2995: mandated HMO reporting
  • 1994-2000
    – CMS approved the 1115 waiver for New York; the Partnership Plan begins to roll out
    – Mandatory Medicaid managed care
    – First Quality Performance Report
  • 2001-2011
    – Uses of quality data expand
    – Consumer guides (pre-Quality Rating System)
    – Major growth in Medicaid managed care
  • 2012-now
    – Benefit package changes
    – Population changes
    – Marketplace enrollment
    – Managed Care Final Rule

SLIDE 4

Managed Care Final Rule

  • CMS to establish a common framework for all states to use in implementing a Quality Rating System (QRS)
  • A public engagement process to develop a proposed QRS framework and methodology
  • States will have the flexibility to adopt an alternative QRS
  • States will have three years after the final guidance from CMS to begin rating managed care plans

SLIDE 5

Quality Rating System

Purpose:

  • Increase transparency
  • Allow plan comparison
  • Provide comparable information across different types of insurance

SLIDE 6

Key Decision Points

  • QRS framework
  • Measure selection
  • Grouping measures
  • Scoring methodology
  • Reference group
SLIDE 7

QRS Framework

[Framework diagram] The QRS framework relates three levels:

  • Components: Performance Information, Scoring, Reference Group
  • Elements: Measure Selection, Grouping Measures, Rating Methodology
  • Goals: Transparency, Plan Comparison, Comparable Information

Federal Register/Vol. 78, No. 223: https://www.gpo.gov/fdsys/pkg/FR-2013-11-19/pdf/2013-27649.pdf

SLIDE 8

June 24, 2017

Measure Selection

  • Review existing health plan measures
  • Measure type (clinical quality, satisfaction)
  • Alignment across state and Federal reporting requirements and the QRS measure set
    – Important to consider special populations in Medicaid

[Diagram] National Quality Forum measure-evaluation criteria: Importance, Performance Gap, Reliability, Feasibility, Alignment

SLIDE 9

Grouping Measures (1)

[Hierarchy diagram] Measures roll up into composites, composites into domains, domains into summary indicators, and summary indicators into a single global rating (Measure → Composite → Domain → Summary Indicator → Global Rating).
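The roll-up on this slide can be sketched as a nested structure. This is illustrative only: the summary indicator, domain, composite, and measure names below are hypothetical examples, not the actual NYS groupings.

```python
# Sketch of the Measure -> Composite -> Domain -> Summary Indicator ->
# Global Rating roll-up; all names below are hypothetical.
hierarchy = {
    "Global Rating": {
        "Summary Indicator: Clinical Quality": {
            "Domain: Preventive Care": {
                "Composite: Women's Health": [
                    "Breast Cancer Screening",
                    "Cervical Cancer Screening",
                ],
                "Composite: Child Health": [
                    "Annual Dental Visit",
                    "Childhood Immunization",
                ],
            },
        },
    },
}

def leaf_measures(node):
    """Walk the hierarchy and collect the measures at the bottom level."""
    if isinstance(node, list):  # a composite's list of measures
        return list(node)
    return [m for child in node.values() for m in leaf_measures(child)]

print(leaf_measures(hierarchy))
```

Aggregation then runs bottom-up: score the leaf measures, average within each composite, then within each domain, and so on up to the global rating.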

SLIDE 10

Grouping Measures (2)

  • Weighting: explicit vs. implicit
  • Example measure groupings:
    – Well-Child Visits (15 months / 3-6 years), Annual Dental Visit, Childhood Immunization, Pharyngitis, URI, Weight Assessment and Counseling, Immunizations for Adolescents, HPV Immunization
    – Breast Cancer Screening, Cervical Cancer Screening, Chlamydia Screening

SLIDE 11

Scoring Methodology (1)

  • Process used to convert measure data to a common numeric scale
    – Standardized score (z-score, ANOP, t statistic)
  • Aggregate scores by different levels
    – Average measure scores to a domain score
  • Convert scores to something meaningful and useful
    – The t statistic of a domain score is calculated and converted to a star rating by the percentile rank inferred from Student's t distribution
    – The overall rating is the average of the domain star ratings; its t statistic is calculated and converted to the 5-star scale by the percentile rank inferred from Student's t distribution
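As a rough sketch of the standardize-then-aggregate steps above (using made-up plan rates, not NYS data), a domain score can be formed by averaging a plan's standardized measure scores:

```python
from statistics import mean, pstdev

def z_score(value, population):
    """Standardize one plan's rate against the statewide distribution."""
    return (value - mean(population)) / pstdev(population)

# Made-up rates: every plan's rate on each of three measures in a domain.
statewide = {
    "breast_cancer_screening":   [0.60, 0.72, 0.65, 0.58, 0.70],
    "cervical_cancer_screening": [0.55, 0.61, 0.59, 0.66, 0.63],
    "chlamydia_screening":       [0.48, 0.52, 0.50, 0.57, 0.49],
}
# One plan's rates on the same measures.
plan = {
    "breast_cancer_screening": 0.72,
    "cervical_cancer_screening": 0.61,
    "chlamydia_screening": 0.52,
}

# Domain score = average of the plan's standardized measure scores.
domain_score = mean(z_score(plan[m], statewide[m]) for m in plan)
print(round(domain_score, 3))  # positive: plan is above the statewide average
```

The same averaging step repeats one level up to combine domain results into the overall rating.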

SLIDE 12

Scoring Methodology (2)

  • When the number of plans is less than 30, the t-statistic is an appropriate standardized score: it measures the distance of each plan from the statewide average in terms of standard error.
  • Percentiles of the t-distribution can then measure a plan's true relative performance given the small number of plans being compared.

Percentile of t-statistic      Rating
 0 <= score < 10               1 star
10 <= score < 30               2 stars
30 <= score < 70               3 stars
70 <= score < 90               4 stars
90 <= score                    5 stars
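The cutoffs on this slide translate directly into code; a minimal sketch (the function name is mine, not from the deck):

```python
def stars_from_percentile(pct):
    """Map a t-statistic percentile (0-100) to the slide's 5-star cutoffs."""
    if pct >= 90:
        return 5
    if pct >= 70:
        return 4
    if pct >= 30:
        return 3
    if pct >= 10:
        return 2
    return 1

# A plan at the 75th percentile of the t-distribution earns 4 stars.
print(stars_from_percentile(75))  # 4
```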

SLIDE 13

Reference Group

  • A population of reporting units defined based on specification of a geographical region and/or time period.
  • Examples include:
    – All health plans nationally
    – All health plans statewide
    – Regional groupings of health plans

SLIDE 14

Challenges

  • Technologically complex
  • Reflective of stakeholders
    – Consumers
    – Managed care plans
    – Advocates
  • Useful to people
  • Easily understood by everyone (i.e., consumers, managed care plans)
  • Summarized, yet still allowing drill-in to specific measurement areas

SLIDE 15

Lessons Learned

  • Creating your own state QRS allows state priorities and needs to be incorporated into design decisions.
  • It is important that states understand the impact these QRS design decisions may have on their state health plan ratings.

SLIDE 16

Questions

Thank you to my co-authors, Anne Schettine and Patrick Roohan.

Contact information:

  • Lindsay.Cogan@health.ny.gov