SLIDE 1

Metrics & Scoring Committee

Retreat, December 2, 2016

SLIDE 2

Consent Agenda

  • Welcome and introductions
  • Review October minutes
    – Approve at December 16th meeting
  • Agenda review

SLIDE 3

Committee Vision

Continue to lead on and expand influence of incentive measures to improve the health of Oregonians, through Health System Transformation and cross-system collaboration

  • Metrics & Scoring Committee Retreat, Oct 2015

SLIDE 4

Measure Alignment

Part 1

SLIDE 5

CCO Measure Selection: Current State

Metrics & Scoring Committee
  • Metrics Technical Advisory Workgroup
  • Public Testimony: advocates, organizations, CCOs, providers
  • Stakeholder Input: Providers, CAPs, CACs, community

SLIDE 6

CCO Measure Selection: Future State

Metrics & Scoring Committee
  • Metrics Technical Advisory Workgroup
  • Public Testimony: advocates, organizations, CCOs, providers
  • Stakeholder Input: Providers, CAPs, CACs, community

Health Plan Quality Metrics Committee
  • Public Testimony

SLIDE 7

Measure Alignment

Health Plan Quality Metrics Committee
  • Develops menu of aligned measures

Measure-selecting bodies: Metrics & Scoring Committee, PEBB Board, OEBB Board, DCBS
Covered plans: CCOs, PEBB Carriers, OEBB Carriers, Health Plans on the Exchange
Providers

SLIDE 8

Establishing Legislation (SB 440, 2015)

“The committee shall use a public process that includes an opportunity for public comment to identify health outcome and quality measures that may be applied to services provided by CCOs or paid for by health benefit plans sold through the health insurance exchange or offered by OEBB or PEBB. OHA, DCBS, OEBB, and PEBB are not required to adopt all of the measures identified by the committee, but may not adopt any measures that are different from the measures identified by the committee. The measures must take into account the recommendations of the metrics and scoring subcommittee and the differences in the populations served by CCOs and by commercial insurers.”

SLIDE 9

Measure Alignment: The National Landscape

Oregon CCO Metrics and Scoring Committee

December 2, 2016

Michael Bailit

SLIDE 10

Historical Context: 2013 Study

  • RWJF-funded study identified and collected 48 measure sets used by 25 states for a range of purposes
    – Focus was on ambulatory care measures, i.e., no hospital, dental or LTSS measures
    – Analysis included Oregon’s CCO and PCPCH measures
  • Purpose of the study was to determine to what extent measure sets were aligned across the country and within individual states

Bazinsky K and Bailit M. The Significant Lack of Alignment Across State and Regional Health Measure Sets. Bailit Health Purchasing, LLC, September 10, 2013. See http://www.buyingvalue.org/.

SLIDE 11

2013 Study: How many measures?

  • The study identified 1,367 measures across the 48 sets, of which 509 (37%) were distinct.
  • Of the 509 distinct measures, only 20% were used by more than one program.

SLIDE 12

How often were the “shared measures” shared?

Not that often. Most measures were not shared:
  • Not shared: 80%
  • Shared by 2 sets: 5% (28 measures)
  • Shared by 3-5 sets: 4% (20 measures)
  • Shared by 6-10 sets: 4% (21 measures)
  • Shared by 11-15 sets: 3% (14 measures)
  • Shared by 16-30 sets: 4% (19 measures)

Only 19 measures were shared by at least one-third (16+) of the measure sets.

SLIDE 13

Non-alignment persisted despite a preference for standard measures

Measures by measure type (n = 1,367):
  • Standard: 59%
  • Modified: 17%
  • Homegrown: 15%
  • Undetermined: 6%
  • Other: 3%

Defining terms:
  • Standard: measures from a national source (e.g., NCQA, AHRQ)
  • Modified: standard measures with a change to the traditional specifications
  • Homegrown: measures indicated on the source document as having been created by the developer of the measure set
  • Undetermined: measures not indicated as “homegrown,” but for which the source could not be identified
  • Other: a measure bundle or composite

SLIDE 14

Programs were selecting different subsets of standard measures

[Figure: overlap of measure selection across five programs, Program A through Program E]

  • While the programs may have been primarily using standard, NQF-endorsed measures, they were not selecting the same standard measures
  • Not one measure was used by every program
    – Breast Cancer Screening was the most frequently used measure, and it was used by only 30 of the programs (63%)

SLIDE 15

This phenomenon emerged relatively quickly

  • In 1989, large employers were complaining that they had no data to assess the value generated by their health plans.
  • In response, a group of health plans decided to work with a small group of large employers to identify standard measures that could be used to demonstrate value.
  • The result of that effort, HEDIS “1.0,” was released in 1991 with a limited number of quality measures, most of which were focused on prevention.

SLIDE 16

And now here we are today

  • 25 years later, we are awash in measures!
  • A 2016 cataloguing of federal measure sets and two selected other national measure sets yielded 470 distinct measures in use, not including the MIPS specialist measures.
  • There have been 650 measures endorsed by NQF.

SLIDE 17

Many National Measure Sets…

Federal Sets:

  • CMMI Comprehensive Primary Care Plus (CPC+)
  • CMMI SIM Recommended Model Performance Metrics
  • CMS Core Set of Children’s Health Care Quality Measures for Medicaid and CHIP (Child Core Set)
  • CMS Core Set of Health Care Quality Measures for Adults Enrolled in Medicaid (Medicaid Adult Core Set)
  • CMS Core Quality Measures Collaborative - ACO and PCMH/Primary Care Measures
  • CMS Health Home Measure Set
  • CMS Hospital Value-Based Purchasing
  • CMS Medicare Hospital Compare
  • CMS Medicare-Medicaid Plans (MMPs) Capitated Financial Alignment Model (Duals Demonstrations)
  • CMS Medicare Part C & D Star Ratings Measures
  • CMS Medicare Shared Savings Program (MSSP) ACO
  • CMS Merit-based Incentive Payment System (MIPS) - General Practice/Family Medicine, Internal Medicine, and Pediatrics only
  • CMS Physician Quality Reporting System (PQRS), CMS EP EHR Incentive Clinical Quality Measures (eCQMs), CMS Cross Cutting Measures

Non-Federal National Sets:

  • Catalyst for Payment Reform Employer-Purchaser Measure Set
  • Joint Commission Accountability Measure List

SLIDE 18

The impact of non-alignment has been negative

  • Providers can’t respond to the large number of measures they are being asked to improve.
  • The measure-related requirements on providers for coding claims and medical records are a significant contributor to primary care burnout.
  • Why? Because a very high percentage of performance measures are directed to primary care, not specialty care.
  • There is lots of frustration with “checking the boxes.”
  • The constant change in measure set composition only worsens the problem for providers.

SLIDE 19

The response…measure set alignment

  • Troubled efforts at the national level:
    – AHIP/CMS Core Measure Set: developed with NQF with no provider, state, consumer or employer involvement
    – CPR Employer-Purchaser Measure Set: developed for purchasers
    – IOM’s Core Metrics: a conceptual framework more than a measure set
    – Many Medicare measures: MACRA/MIPS, MSSP, Stars, etc. create an impetus for multi-payer use, but they don’t fit Medicaid and commercial populations very well
  • In the midst of this mess, states have been trying to find a way to create measure rationality.

SLIDE 20

State Measure Set Alignment: Washington

  • 2014 state legislation requiring development of a statewide core measure set led to creation of the Washington State Common Measure Set for Health Care Quality and Cost.
  • “Starter measure set” of 52 measures developed in six months (modified in 2016 to 55 with added behavioral health measures).
  • Coordinating committee overseeing three technical work groups: acute care, chronic care and hospital care.
  • Broad scope - interested in measures for payment, monitoring, public health use...and more.
  • Final measure groupings: population, clinical and cost.

SLIDE 21

Washington State’s Process

1. Review aligned measures already commonly used in Washington State and in national measure sets.
2. Agree upon key topic areas (domains) to organize those and other measures for review.
3. Review the entire list, by domain, and discuss whether to include each measure (yes/maybe/no).
4. Take a second pass through the “yes” and “maybe” lists.
5. Review additional measures recommended by group members and non-group members and determine whether to consider them.
6. Review the entire list and narrow recommended measures to a targeted number of measures.
7. Put out for public comment and then make final decisions.

SLIDE 22

State Measure Set Alignment: Rhode Island

  • Work initiated in 2015 at the direction of the state’s SIM Steering Committee.
  • Focus on adopting common commercial and Medicaid measure sets to be used for financial consequences in value-based contracts.
  • Over nine months, developed three measure sets:
    – ACO, primary care and hospital
    – ACO measure set drawn from the primary care and hospital measure sets
  • During 2016, two additional specialty measure sets:
    – Maternity care
    – Behavioral health

SLIDE 23

Rhode Island’s Process

  • Very similar to that of Washington, but with no overseeing coordinating committee or public comment process. Also, one work group developed the ACO, primary care and hospital measure sets.
  • Because Rhode Island was only interested in measures for payment, the process was somewhat simpler (also fewer actors in a much smaller state).
  • Consensus across commercial and Medicaid was not difficult; a few Medicaid-only measures were adopted.

SLIDE 24

Alignment Lessons Learned

1. There needs to be a clear understanding of the intended use(s) of the measure set.
2. While employers sometimes feel that they have special interests, in the end there are few health concerns specific to Medicaid and commercial populations.
3. SB 440 creates special challenges and opportunities because the Metrics and Scoring Work Group and OHA have been so thoughtful about the CCO Incentive Measure Set.
  • It will be essential to convey the rationale for past decisions.
  • Having an existing measure set will be helpful to the new work group.

SLIDE 25

Questions and Discussion

SLIDE 26

Contact Information

Michael Bailit

781-453-1166
mbailit@bailit-health.com
www.bailit-health.com/

SLIDE 27

Discussion: Question #1

What should the role(s) of the Metrics & Scoring Committee be relative to the Health Plan Quality Metrics Committee?
  a) Consultant
  b) Reactor
  c) Delegate
  d) Other

SLIDE 28

Discussion: Question #2

Should the Metrics & Scoring Committee’s relationship with the Health Plan Quality Metrics Committee be:
  a) Proactive
  b) Reactive
  c) Both

SLIDE 29

Discussion: Question #3

What are possible activities that the Metrics & Scoring Committee could conduct in support of the Health Plan Quality Metrics Committee’s charge?

  • Identifying areas for improvement, or areas of high performance
  • Identifying opportunities across the state (beyond Medicaid)
  • New measure “R&D”
  • Monitoring changes in national measure sets and/or clinical guidelines
  • Others?

SLIDE 30

Discussion: Question #4

How should the Metrics & Scoring Committee interact with the Health Plan Quality Metrics Committee?

  • How often should the Committees interact?
  • Through what modalities?
  • Liaison role?
  • Other?

SLIDE 31

Discussion: Question #5

What information and advice should the Metrics & Scoring Committee share with the Health Plan Quality Metrics Committee to help inform their work, based on the past four years of experience?

  • Lessons learned in doing this work?
  • Process recommendations?
  • What do you wish you had known or done differently?

SLIDE 32

SLIDE 33

MEASURE ALIGNMENT

Part 2

SLIDE 34

SLIDE 35


2018+ PROGRAM STRUCTURE

SLIDE 36

2017-2022 Waiver Application

Oregon proposes continuing the CCO incentive program, under the guidance of the legislatively-established committees. Future work:

  • Use the CCO incentive measure program structure to further health system transformation by developing and adopting more transformational and outcome-based measures, rather than traditional process measures.
  • Explore changes to the payment structure to better support priority areas.

http://www.oregon.gov/oha/hpa/Medicaid-1115-Waiver/Documents/Waiver-Renewal-Appendices.pdf

SLIDE 37

2017-2022 Waiver Application: areas for measurement

  • Quality improvement focus areas
  • Quality
  • Access
  • Population health and health outcomes
  • Integration
  • Behavioral health
  • Oral health
  • Social determinants of health
  • Collaboration with other systems, esp. early learning / housing

SLIDE 38

New Incentive Program Structure: Measure Set

At their 2015 retreat, the Committee discussed moving to a core and menu measure set structure.

  • Core measures: would be incentivized for all CCOs
  • Menu measures: CCOs could select additional incentive measures from the menu based on their needs and local priorities.

SLIDE 39

Measure Set

In January, the Committee discussed how the structure of the CCO incentive program might be modified under the new waiver (2018+).

Discussion included:

  • Measurement fatigue and potentially reducing the total number of measures
  • Measure alignment across multiple payers
  • Desire to support local priorities and needs
  • Which measures are most meaningful for a transformed system of care
  • Striking a balance between global outcomes and ability to affect the measure
  • Balance between nationally-endorsed measures and measures that address transformation and integration
  • Avoiding a competition between interests to ensure vulnerable populations receive services

SLIDE 40

Measure Set: Stakeholder Survey

If the Committee moves to a core / menu measure set, which model is most appealing?

  • More core measures + fewer menu measures: 35.3%
  • Fewer core measures + more menu measures: 35.3%
  • Equal numbers of core + menu measures: 29.4%

N = 51

SLIDE 41

Measure Set: Stakeholder Survey

Criteria for deciding which measures are core vs. menu?

Core Measures:
  • Address population health / outcomes
  • Greatest impact / most vulnerable populations
  • Where progress needs to be made / trending in wrong direction
  • Have actionable data / monitored during measurement year
  • Have larger denominators / more representative of population
  • Have high clinical value

Menu Measures:
  • Local priorities
  • Process measures
  • Affect specific / smaller populations (e.g., children in foster care)
  • Historically challenging to improve on

SLIDE 42

Measure Set: Concerns

  • CCOs would select menu measures that are most obtainable, rather than those best supporting health.
  • CCOs would not include community or stakeholders in the menu measure selection process.

SLIDE 43

Discussion: Question #1

What are the benefits of implementing a core and menu measure set?

SLIDE 44

Discussion: Question #2

What are some of the key considerations for operationalizing a core and menu measure set?

  • When should core / menu be implemented?
  • What is the right balance of measures in each set?
  • What criteria should the Committee use in identifying core measures?
  • What criteria should CCOs use in selecting menu measures?

SLIDE 45

Benchmark Structure

Since 2013, CCOs have been eligible to earn incentive payments if they meet the benchmark or improvement target on at least 75% of the incentive measures.

  • Benchmark = absolute, same for all CCOs. National comparator if available.
  • Improvement target = custom for each CCO; it reduces the gap between the CCO’s prior-year performance and the benchmark (see the sketch below). This approach was deliberately selected to ensure that CCOs far from the benchmark would not be dis-incentivized from participating in improvement activities.
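A minimal sketch (Python) of how this benchmark-or-improvement-target check could be expressed. The 10% gap-reduction fraction in `improvement_target` is an illustrative assumption, not the Committee's published specification; the 75% threshold comes from this slide, and all function names are hypothetical.

```python
# Sketch of the benchmark-or-improvement-target test described above.
# gap_fraction (10%) is an assumption for illustration; required_share (75%)
# comes from the slide. Rates are expressed as proportions (0.0 - 1.0).

def improvement_target(prior_year_rate: float, benchmark: float,
                       gap_fraction: float = 0.10) -> float:
    """Target that closes a fraction of the gap between the prior-year rate and the benchmark."""
    return prior_year_rate + gap_fraction * max(benchmark - prior_year_rate, 0.0)

def measure_met(current: float, prior: float, benchmark: float) -> bool:
    """A measure counts if the CCO hits either the absolute benchmark or its own improvement target."""
    return current >= benchmark or current >= improvement_target(prior, benchmark)

def earns_payment(results, required_share: float = 0.75) -> bool:
    """results: list of (current, prior, benchmark) tuples, one per incentive measure."""
    met = sum(measure_met(c, p, b) for c, p, b in results)
    return met >= required_share * len(results)

# Example: 3 of 4 measures met -> 75% -> eligible
print(earns_payment([(0.62, 0.55, 0.60), (0.48, 0.40, 0.65),
                     (0.71, 0.70, 0.68), (0.30, 0.42, 0.55)]))
```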

SLIDE 46

Benchmark Structure: Challenge #1

The highest performance across all CCOs does not automatically result in payment if the improvement target or benchmark is not met. Example: 2014 Adolescent Well Care Visits.

SLIDE 47

Benchmark Structure: Challenge #2

A CCO’s performance can decline from the prior year, but if it remains above the benchmark, the CCO will still earn payment. Example: 2015 Developmental Screening.

SLIDE 48

Discussion: Question #3

What are the benefits of retaining the existing benchmark + improvement target structure?

SLIDE 49

Alternate Structures

SLIDE 50

Massachusetts Methodology (current)

Massachusetts BCBS Alternate Quality Contract:

  – Continuum of performance targets for each measure (“good to great”)
  – No improvement target; the score depends on where performance falls between Gates 1 and 5 (a scoring sketch follows below).

http://content.healthaffairs.org/content/30/1/51.full
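A rough sketch of how a “good to great” gate continuum could be scored. The gate thresholds, the 1-5 score range, and the linear interpolation between gates are assumptions for illustration; the actual AQC scoring rules are defined in the BCBSMA contract.

```python
# Illustrative gate scoring: 1 at/below Gate 1, 5 at/above Gate 5,
# linear interpolation between adjacent gates (an assumed scoring rule).

def gate_score(rate: float, gates: list[float]) -> float:
    if rate <= gates[0]:
        return 1.0
    if rate >= gates[-1]:
        return 5.0
    for i in range(len(gates) - 1):
        lo, hi = gates[i], gates[i + 1]
        if lo <= rate <= hi:
            return (i + 1) + (rate - lo) / (hi - lo)
    return 1.0

# Hypothetical gates for one measure (Gate 1 through Gate 5)
print(gate_score(0.74, [0.60, 0.68, 0.75, 0.82, 0.90]))  # ~2.86
```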

SLIDE 51

Maryland Methodology (current)

1. Targets for the current performance year are based on the enrollment-weighted performance average of all managed care organizations from two years prior (the base year). The enrollment weight assigned to each managed care organization is the 12-month average enrollment of the base year.
2. The midpoint of the incentive and disincentive benchmarks for each measure is the sum of the weighted average of managed care organization performance on that measure in the base year and 15% of the difference between that number and 100%.

SLIDE 52

Maryland Methodology (current)

3. The incentive benchmark is the sum of the midpoint and 10% of the difference between the midpoint and 100%.*
4. The disincentive benchmark is equal to the midpoint minus 10% of the difference between the midpoint and 100%.
5. If the difference between the incentive threshold and disincentive threshold is less than 4 percentage points, then the incentive and disincentive thresholds will be the midpoint +/- 2 percentage points.


*Incentives and disincentives are rounded to the nearest 1/100th (e.g., .81253=81%).

SLIDE 53

Maryland Methodology (current)

6. Financially, one point is equal to up to 1/13 of 1 percent of the total capitation amount paid to the MCO during the same measurement year.

Example of Benchmark Calculations (reproduced in the code sketch below)
  • Member-weighted MCO average from base year: X = 75%
  • New midpoint: Y = X + ((100 - X) * 0.15) = 78.75%
  • Incentive benchmark: I = Y + ((100 - Y) * 0.10) = 81%
  • Disincentive benchmark: D = Y - ((100 - Y) * 0.10) = 77%
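A short sketch that reproduces the Maryland benchmark arithmetic from steps 2-5 above; the function name is illustrative, and rounding to whole percentages follows the footnote on the previous slide.

```python
# Sketch of the Maryland benchmark arithmetic (steps 2-5 above).
# Values are expressed as percentages; the function name is illustrative.

def maryland_benchmarks(weighted_avg: float) -> tuple[int, int]:
    """Return (incentive, disincentive) benchmarks, rounded to the nearest whole percent."""
    midpoint = weighted_avg + 0.15 * (100 - weighted_avg)
    incentive = midpoint + 0.10 * (100 - midpoint)
    disincentive = midpoint - 0.10 * (100 - midpoint)
    # Step 5: keep the two thresholds at least 4 points apart
    if incentive - disincentive < 4:
        incentive, disincentive = midpoint + 2, midpoint - 2
    return round(incentive), round(disincentive)

print(maryland_benchmarks(75.0))  # (81, 77), matching the example above
```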

SLIDE 54

Washington State Methodology (new in 2017)

  • The Health Care Authority withholds 1% of the medical portion of the monthly premium payment (see the sketch below).
    – Up to 12.5% of the 1% may be earned back by making qualifying provider incentive payments
    – Up to 12.5% of the 1% may be earned by having value-based purchasing arrangements
    – Up to 75% of the 1% may be earned by achieving quality improvement targets
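A toy illustration of the withhold arithmetic described above; the premium figure is hypothetical, while the 1% withhold and the 12.5% / 12.5% / 75% earn-back splits come from the slide.

```python
# Toy illustration of the 1% withhold and its earn-back buckets.
# The premium amount is hypothetical.

medical_premium = 10_000_000        # hypothetical annual medical premium ($)
withhold = 0.01 * medical_premium   # 1% withheld = $100,000

earn_back = {
    "provider_incentive_payments": 0.125 * withhold,  # up to $12,500
    "value_based_purchasing":      0.125 * withhold,  # up to $12,500
    "quality_improvement":         0.75  * withhold,  # up to $75,000
}
print(withhold, earn_back)
```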

SLIDE 55

Washington State Methodology (new in 2017)

  • Provider incentive payments (both earn-back checks are sketched below)
    – At least 0.75% of the medical portion of the annual premium payment received by the contractor under this contract from January through December 2017 must be paid exclusively for provider incentive payments in a value-based payment arrangement, as defined by Category 2C or higher of the HCP-LAN framework.
  • Value-based purchasing arrangements
    – At least 30% of provider payments are paid to network providers in the form of value-based payments, as in contracts signed with an effective date in January 2018 that meet Category 2C or higher as defined by the Health Care Payment Learning & Action Network (HCP-LAN) Alternative Payment Model (APM) Framework.
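A sketch of the two qualification checks just described; the threshold values (0.75% of premium, 30% of provider payments) come from the slide, while the dollar amounts and function names are hypothetical.

```python
# Sketch of the two earn-back qualification checks described above.
# Dollar figures are hypothetical; thresholds come from the slide.

def qualifies_provider_incentives(incentive_payments: float, annual_medical_premium: float) -> bool:
    """At least 0.75% of the medical premium must go to qualifying provider incentive payments."""
    return incentive_payments >= 0.0075 * annual_medical_premium

def qualifies_vbp(value_based_payments: float, total_provider_payments: float) -> bool:
    """At least 30% of provider payments must be value-based (HCP-LAN Category 2C or higher)."""
    return value_based_payments >= 0.30 * total_provider_payments

print(qualifies_provider_incentives(80_000, 10_000_000))   # True: 0.8% >= 0.75%
print(qualifies_vbp(2_500_000, 9_000_000))                 # False: ~27.8% < 30%
```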

SLIDE 56

Washington State Methodology (new in 2017)

  • Quality improvement targets
    – “On an annual basis, HCA shall determine the percentage of the withhold earned back by the Contractor based on the Contractor’s achieving Quality Improvement Score (QIS) targets as described in this section. Each Quality Measure will be calculated for the Performance Year, V_i(y), and for the year prior to the Performance Year, V_i(y-1). The Quality Improvement Score QIS(i) will be calculated as specified in this Section…”

SLIDE 57

Washington State Methodology (new in 2017)

[Chart: “Total (Improvement plus Quality Score)” with series Total, Improvement, and Quality; scores range from -200% to 200% across rates of 0% to 100%]

SLIDE 58

Washington State Methodology (new in 2017)

[Table: Quality Improvement Score lookup values (ranging from 0.11 to 2.00), indexed by two parameters each running from approximately 0.01 to 0.19]

SLIDE 59

Iowa Methodology (formerly)

  • Certain performance indicators, considered by the Departments to be especially significant to either the quality of treatment or the efficiency of administration of the Iowa Plan, will be selected annually as indicators to which either incentives and/or disincentives are attached.
    – Incentive measures
    – Disincentive measures
    – Incentive and disincentive measures

SLIDE 60

Other Potential Changes

  • Benchmark-only approach
  • “Must pass” measures
  • No tiered distribution, e.g., move from “75% of measures = 100% of dollars” to “each measure = x dollars” (see the sketch below)
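To illustrate the last bullet, a toy contrast between the current tiered rule (meeting 75% of measures earns the full quality pool) and a flat per-measure payout; the pool size and results are hypothetical, and the tiered rule shown is deliberately simplified.

```python
# Toy contrast: tiered rule (75% of measures -> 100% of the pool, simplified)
# versus a per-measure payout ("each measure = x $"). Figures are hypothetical.

pool = 1_000_000
results = [True, True, False, True, False, True, False, False]  # measures met
n, met = len(results), sum(results)

tiered_payout = pool if met >= 0.75 * n else 0   # simplified all-or-nothing version
per_measure_payout = met * (pool / n)            # each measure worth pool / n

print(tiered_payout, per_measure_payout)         # 0 vs 500000.0
```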

SLIDE 61

Discussion

SLIDE 62

SLIDE 63


EQUITY MEASURE

SLIDE 64

Next Meeting: December 16th, 9 am – noon, Wilsonville
Agenda: dental measures, equity measure
