Metrics & Scoring Committee
Retreat, December 2, 2016
Consent Agenda
– Welcome and introductions
– Review October minutes (approve at December 16th meeting)
– Agenda review
Committee Vision
Continue to lead on and expand the influence of incentive measures to improve the health of Oregonians, through Health System Transformation and cross-system collaboration
Part 1
Metrics & Scoring Committee receives input from:
– Metrics Technical Advisory Workgroup
– Public Testimony: advocates, organizations, CCOs, providers
– Stakeholder Input: providers, CAPs, CACs, community
Health Plan Quality Metrics Committee receives input from:
– Public Testimony
Health Plan Quality Metrics Committee
Develops a menu of aligned measures for the Metrics & Scoring Committee, PEBB Board, OEBB Board, and DCBS, whose measures in turn apply to CCOs, PEBB carriers, OEBB carriers, and health plans on the Exchange – and ultimately to providers.
“The committee shall use a public process that includes an opportunity for public comment to identify health outcome and quality measures that may be applied to services provided by CCOs or paid for by health benefit plans sold through the health insurance exchange or offered by OEBB or PEBB. OHA, DCBS, OEBB, and PEBB are not required to adopt all of the measures identified by the committee, but may not adopt any measures that are different from the measures identified by the committee. The measures must take into account the recommendations of the metrics and scoring subcommittee and the differences in the populations served by CCOs and by commercial insurers.”
Oregon CCO Metrics and Scoring Committee
December 2, 2016
Michael Bailit
The Buying Value study examined 48 measure sets used by 25 states for a range of purposes
– Focus was on ambulatory care measures, i.e., no hospital, dental or LTSS measures
– Analysis included Oregon's CCO and PCPCH measures
– Assessed to what extent measure sets were aligned across the country and within individual states
Bazinsky K, Bailit M. The Significant Lack of Alignment Across State and Regional Health Measure Sets. Bailit Health Purchasing, LLC; September 10, 2013. See http://www.buyingvalue.org/.
There were 1,367 measures across the 48 sets, of which a smaller number were distinct. Of the distinct measures, only 20% were used by more than one program.
Measures not shared: 80%
Shared measures: 20%
– 2 sets: 5% (28 measures)
– 3–5 sets: 4% (20 measures)
– 6–10 sets: 4% (21 measures)
– 11–15 sets: 3% (14 measures)
– 16–30 sets: 4% (19 measures)
How often is the same measure used across sets? Not that often: most measures are not shared, and only 19 measures were shared by at least one-third (16+) of the measure sets.
Measures by measure type (n = 1,367)
– Standard: 59%
– Modified: 17%
– Homegrown: 15%
– Undetermined: 6%
– Other: 3%
Defining Terms
– Standard: measures from a national source (e.g., NCQA, AHRQ)
– Modified: standard measures with a change to the traditional specifications
– Homegrown: measures indicated on the source document as having been created by the developer of the measure set
– Undetermined: measures not indicated as "homegrown," but for which the source could not be identified
– Other: a measure bundle or composite
Even when programs selected standard, NQF-endorsed measures, they were not selecting the same standard measures
– Breast Cancer Screening was the most frequently used measure, and it was used by only 30 of the programs (63%)
Why do standard measures exist? Employers found that they had no data to assess the value generated by their health plans. This led to work with a small group of large employers to identify standard measures that could be used to demonstrate value. The resulting measure set, HEDIS, was released in 1991 with a limited number of quality measures – most of which were focused on prevention.
Yet today there are many, many measures! A scan of state measure sets and selected national measure sets yielded 470 distinct measures in use – not including the MIPS specialist measures. Not all of these measures are endorsed by NQF.
Measure sets in use include Federal Sets (e.g., electronic clinical quality measures (eCQMs) and CMS Cross Cutting Measures, some applicable to selected specialties only) and Non-Federal National Sets.
– Providers often do not know the measures they are being asked to improve.
– Documentation required for coding claims and medical records is a significant contributor to primary care burnout.
– Most measures are directed to primary and not specialty care.
National alignment efforts each have limitations:
– AHIP/CMS Core Measure Set: developed with NQF with no provider, state, consumer or employer involvement
– CPR Employer-Purchaser Measure Set: developed for purchasers
– IOM's Core Metrics: a conceptual framework more than a measure set
– Many Medicare measures: MACRA/MIPS, MSSP, Stars, etc. create an impetus for multi-payer use, but they don't fit Medicaid and commercial populations very well
States are left to find a way to create measure rationality.
Washington State: a legislative directive for a statewide core measure set led to creation of the Washington State Common Measure Set for Health Care Quality and Cost
– The initial set was developed within months (modified in 2016 to 55 measures with added behavioral health measures)
– Measures were developed by work groups: acute care, chronic care and hospital care
– Intended uses include performance monitoring, public health use...and more
The Washington selection process:
– Compiled measures in use in Washington State and in national measure sets
– Nominated these and other measures for review
– Work group members voted on whether to include the measure (yes/maybe/no)
– Solicited suggestions from group members and non-group members and determined whether to consider them
– Narrowed the nominated measures to a targeted number of measures
– Decisions were ratified by a Steering Committee.
– Developed aligned measure sets to be used for financial consequences in value-based contracts:
– ACO, primary care and hospital (the ACO measure set drawn from the primary care and hospital measure sets)
– Maternity care
– Behavioral health
– There was no coordinating committee or public comment process. Also, one work group developed the ACO, primary care and hospital measure sets.
– This made the process somewhat simpler (also fewer actors in a much smaller state).
– Aligning across payer types was difficult – a few Medicaid-only measures were adopted.
Lessons learned:
– Be clear about the intended use(s) of the measure set.
– While payers have differing interests, in the end there are few health concerns specific to Medicaid and commercial populations.
– Oregon is well positioned because the Metrics and Scoring Work Group and OHA have been so thoughtful about the CCO Incentive Measure Set.
– Consider the role of a technical work group.
Michael Bailit
781-453-1166
mbailit@bailit-health.com
www.bailit-health.com/
What should the role(s) of the Metrics & Scoring Committee be relative to the Health Plan Quality Metrics Committee? a) Consultant b) Reactor c) Delegate d) Other
Should the Metrics & Scoring Committee’s relationship with the Health Plan Quality Metrics Committee be: a) Proactive b) Reactive c) Both
What are possible activities that the Metrics & Scoring Committee could conduct in support of the Health Plan Quality Metrics Committee’s charge?
– guidelines
How should the Metrics & Scoring Committee interact with the Health Plan Quality Metrics Committee?
What information and advice should the Metrics & Scoring Committee share with the Health Plan Quality Metrics Committee to help inform their work, based on the past four years of experience?
Part 2
Oregon proposes continuing the CCO incentive program, under the guidance of the legislatively established committees. Future work:
– Advance system transformation by developing and adopting more transformational and outcome-based measures, rather than traditional process measures
– Continue work in key focus areas
http://www.oregon.gov/oha/hpa/Medicaid-1115-Waiver/Documents/Waiver-Renewal-Appendices.pdf
At their 2015 retreat, the Committee discussed moving to a core and menu measure set structure: all CCOs would report the core measures, while each CCO would also select measures from a menu based on their needs and local priorities.
In January, the Committee discussed how the structure of the CCO incentive program might be modified under the new waiver (2018+). Discussion included:
– measures that address transformation and integration
– where members receive services
If the Committee moves to a core / menu measure set, which model is most appealing? (N=51)
– More core measures + fewer menu measures: 35.3%
– Fewer core measures + more menu measures: 35.3%
– Equal numbers of core + menu measures: 29.4%
Core Measures
– …populations
– …made / trending in wrong direction
– …during measurement year
– …representative of population
Menu Measures
– …populations (e.g., children in foster care)
Criteria for deciding which measures are core vs menu?
– Risk that measures could be selected because they are easiest to achieve, rather than those best supporting health.
– Implications for the measure selection process.
What are the benefits of implementing a core and menu measure set?
What are some of the key considerations for selecting core measures? For selecting menu measures?
Since 2013, CCOs have been eligible to earn incentive payments if they meet the benchmark or improvement target on at least 75% of the incentive measures.
– Benchmarks are set using national data where available.
– Improvement targets are based on closing a portion of the gap between the CCO's prior-year performance and the benchmark. This approach was deliberately selected to ensure that CCOs far from the benchmark would not be dis-incentivized from participating in improvement activities.
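The payment logic above can be sketched in code. This is an illustrative reconstruction, not OHA's official algorithm; in particular, the 10% gap-closure factor used for the improvement target is an assumption for the example.

```python
def improvement_target(prior, benchmark, gap_closure=0.10):
    """Target = prior-year rate plus a share of the gap to the benchmark.
    The 10% gap-closure factor is illustrative, not the official value."""
    return prior + gap_closure * max(benchmark - prior, 0.0)

def measure_met(current, prior, benchmark):
    """A measure counts if the CCO meets the benchmark OR its improvement target."""
    return current >= benchmark or current >= improvement_target(prior, benchmark)

def eligible_for_payment(results, threshold=0.75):
    """results: list of (current, prior, benchmark) tuples, one per measure.
    A CCO is eligible when it meets at least 75% of the incentive measures."""
    met = sum(measure_met(c, p, b) for (c, p, b) in results)
    return met >= threshold * len(results)
```

Note that under this structure a CCO already above the benchmark is credited even if its rate declined from the prior year.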
The highest performance across all CCOs does not automatically result in payment if the improvement target or benchmark is not met. Example: 2014 Adolescent Well-Care Visits.
CCO performance can decline from the prior year, but if the CCO is above the benchmark, it will still earn payment. Example: 2015 Developmental Screening.
What are the benefits of retaining the existing benchmark + improvement target structure?
Massachusetts BCBS Alternative Quality Contract:
– Continuum of performance targets for each measure ("good to great")
– No improvement target; instead, a score depends on where performance falls between Gates 1 and 5
http://content.healthaffairs.org/content/30/1/51.full
– The base statistic for each measure is the enrollment-weighted performance average of all managed care organizations from two years prior (the base year). The enrollment weight assigned to each managed care organization is the 12-month average enrollment of the base year.
– The benchmark for each measure is the sum of the weighted average of managed care organization performance on that measure in the base year and 15% of the difference between that number and 100%.
– The incentive threshold is the midpoint (i.e., the benchmark) plus 10% of the difference between the midpoint and 100%.*
– The disincentive threshold is the midpoint minus 10% of the difference between the midpoint and 100%.
– If the incentive threshold minus the disincentive threshold is less than 4 percentage points, then the incentive and disincentive thresholds will be the midpoint +/- 2 percentage points.
*Incentives and disincentives are rounded to the nearest 1/100th (e.g., .81253 = 81%). Incentive and disincentive amounts are expressed as a percent of the total capitation amount paid to the MCO during the same measurement year.
Example of Benchmark Calculations
– Weighted average (X): 75%
– Benchmark (Y): 78.75% (= 75% + 15% of 25%)
– Incentive threshold (I): 81% (= 78.75% + 10% of 21.25%, rounded)
– Disincentive threshold (D): 77% (= 78.75% - 10% of 21.25%, rounded)
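The example arithmetic can be checked with a short sketch. This is an illustration of the formulas as described on the slides, not the contract's official implementation; values are fractions, so 0.75 means 75%.

```python
def thresholds(weighted_avg):
    """Benchmark and incentive/disincentive thresholds for one measure,
    following the formulas described on the preceding slides."""
    # Benchmark: weighted average plus 15% of the gap to 100%
    benchmark = weighted_avg + 0.15 * (1.0 - weighted_avg)
    # Incentive/disincentive thresholds: benchmark +/- 10% of its gap to
    # 100%, rounded to the nearest 1/100th (whole percentage point)
    incentive = round(benchmark + 0.10 * (1.0 - benchmark), 2)
    disincentive = round(benchmark - 0.10 * (1.0 - benchmark), 2)
    # Guard: if the band is narrower than 4 percentage points,
    # widen it to the midpoint +/- 2 points
    if incentive - disincentive < 0.04:
        incentive = round(benchmark + 0.02, 2)
        disincentive = round(benchmark - 0.02, 2)
    return benchmark, incentive, disincentive
```

For a weighted average of 75%, this reproduces the slide's example: benchmark 78.75%, incentive threshold 81%, disincentive threshold 77%.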
A portion (1%) of the monthly premium payment is withheld and can be earned back:
– Up to 12.5% of the 1% may be earned back by making qualifying provider incentive payments
– Up to 12.5% of the 1% may be earned by having value-based purchasing arrangements
– Up to 75% of the 1% may be earned by achieving quality improvement targets
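A minimal sketch of the earn-back arithmetic above, assuming (hypothetically) that the quality-target portion is credited in proportion to the share of targets achieved; the actual contract may score that portion differently.

```python
WITHHOLD = 0.01  # 1% of the premium payment is withheld

def earn_back_fraction(provider_incentives_paid, has_vbp_arrangements, qi_targets_share):
    """Fraction of the premium earned back from the 1% withhold.
    qi_targets_share: share of quality improvement targets achieved (0..1).
    Proportional credit for the quality portion is an illustrative assumption."""
    share = 0.0
    if provider_incentives_paid:         # up to 12.5% of the 1%
        share += 0.125
    if has_vbp_arrangements:             # up to 12.5% of the 1%
        share += 0.125
    share += 0.75 * qi_targets_share     # up to 75% of the 1%
    return share * WITHHOLD
```

A contractor meeting all three conditions earns back the full 1%.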
– A specified percentage of the payment received by the contractor under this contract from January through December 2017 must be paid exclusively for provider incentive payments in a value-based payment arrangement, as defined by Category 2C or higher of the HCP-LAN framework.
– Funds must reach providers in the form of value-based payments in contracts signed with an effective date in January 2018 that meet Category 2C or higher as defined by the Health Care Payment Learning & Action Network (HCP-LAN) Alternative Payment Model (APM) Framework.
– “On an annual basis, HCA shall determine the percentage of the withhold earned back by the Contractor based on the Contractor’s achieving Quality Improvement Score (QIS) targets as described in this section.
– Each Quality Measure will be calculated for the Performance Year, VI(y), and for the year prior to the Performance Year, VI(y-1). The Quality Improvement Score QIS(i) will be calculated as specified in this Section…”
[Figure and table: Quality Improvement Score illustration – Improvement, Quality, and Total score components across performance levels, with a lookup table of total scores capped at 2.00.]
Iowa Plan: measures determined by the Departments to be especially significant to either the quality of treatment or efficiency of administration of the Iowa Plan will be selected annually as indicators to which either incentives and/or disincentives are attached.
– Incentive measures
– Disincentive measures
– Incentive and disincentive measures
100% of the incentive dollars are divided across the selected measures (each measure = x $).