
Metrics & Scoring Committee Retreat, December 2, 2016



  1. Metrics & Scoring Committee Retreat, December 2, 2016

  2. Consent Agenda
     - Welcome and introductions
     - Review October minutes (approve at the December 16th meeting)
     - Agenda review

  3. Committee Vision: Continue to lead on and expand the influence of incentive measures to improve the health of Oregonians, through Health System Transformation and cross-system collaboration. (Metrics & Scoring Committee Retreat, Oct 2015)

  4. Measure Alignment, Part 1

  5. CCO Measure Selection: Current State
     - Metrics & Scoring Committee
     - Public Testimony: advocates, organizations, CCOs, providers
     - Stakeholder Input: Metrics Technical Advisory Workgroup; providers, CAPs, CACs, community

  6. CCO Measure Selection: Future State
     - Health Plan Quality Metrics Committee, with Public Testimony
     - Metrics & Scoring Committee, with Public Testimony: advocates, organizations, CCOs, providers
     - Stakeholder Input: Metrics Technical Advisory Workgroup; providers, CAPs, CACs, community

  7. Measure Alignment
     - Health Plan Quality Metrics Committee develops a menu of aligned measures
     - Adopting bodies and their plans: Metrics & Scoring Committee (CCOs), DCBS (health plans on the Exchange), PEBB Board (PEBB carriers), OEBB Board (OEBB carriers)
     - Measures flow down to providers

  8. Establishing Legislation (SB 440, 2015): “The committee shall use a public process that includes an opportunity for public comment to identify health outcome and quality measures that may be applied to services provided by CCOs or paid for by health benefit plans sold through the health insurance exchange or offered by OEBB or PEBB. OHA, DCBS, OEBB, and PEBB are not required to adopt all of the measures identified by the committee, but may not adopt any measures that are different from the measures identified by the committee. The measures must take into account the recommendations of the metrics and scoring subcommittee and the differences in the populations served by CCOs and by commercial insurers.”

  9. Measure Alignment: The National Landscape (Michael Bailit, presented to the Oregon CCO Metrics and Scoring Committee, December 2, 2016)

  10. Historical Context: 2013 Study
     - RWJF-funded study identified and collected 48 measure sets used by 25 states for a range of purposes
       - Focus was on ambulatory care measures, i.e., no hospital, dental, or LTSS measures
       - Analysis included Oregon's CCO and PCPCH measures
     - Purpose of the study was to determine to what extent measure sets were aligned across the country and within individual states
     Bazinsky K and Bailit M. The Significant Lack of Alignment Across State and Regional Health Measure Sets. Bailit Health Purchasing, LLC, September 10, 2013. See http://www.buyingvalue.org/.

  11. 2013 Study: How many measures?
     - The study identified 1,367 measures across the 48 sets, of which 509 (37%) were distinct.
     - Of the 509 distinct measures, only 20% were used by more than one program.

  12. How often were the “shared measures” shared? Not that often.
     [Pie chart, percentages of the 509 distinct measures]
     - Measures not shared: 80%
     - Shared measures: 20%, broken down by how many sets used them:
       - 2 sets: 5% (28 measures)
       - 3-5 sets: 4% (20 measures)
       - 6-10 sets: 4% (21 measures)
       - 11-15 sets: 3% (14 measures)
       - 16-30 sets: 4% (19 measures)
     - Only 19 measures were shared by at least 1/3 (16+) of the measure sets; most measures are not shared.
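The slide's figures are internally consistent, which a quick arithmetic sketch can confirm (all counts taken from the slides; percentages are of the 509 distinct measures):

```python
# Sanity-check of the 2013 study's shared-measure arithmetic (figures from the slides).
total_measures = 1367      # total measures identified across the 48 sets
distinct_measures = 509    # distinct measures among them

# Shared-measure breakdown: number of sets a measure appeared in -> measure count
shared = {"2 sets": 28, "3-5 sets": 20, "6-10 sets": 21,
          "11-15 sets": 14, "16-30 sets": 19}

shared_total = sum(shared.values())
print(f"distinct share of all measures: {distinct_measures / total_measures:.0%}")  # 37%
print(f"shared measures: {shared_total} ({shared_total / distinct_measures:.0%})")  # 102 (20%)
```

The 102 shared measures are 20% of the 509 distinct measures, matching the slide's 80%/20% split.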

  13. Non-alignment persisted despite a preference for standard measures
     [Pie chart: measures by measure type, n = 1,367] Standard 59%, Modified 17%, Homegrown 15%, Undetermined 6%, Other 3%
     Defining terms:
     - Standard: measures from a nat'l source (e.g., NCQA, AHRQ)
     - Modified: standard measures with a change to the traditional specifications
     - Homegrown: measures indicated on the source document as having been created by the developer of the measure set
     - Undetermined: measures not indicated as “homegrown”, but for which the source could not be identified
     - Other: a measure bundle or composite

  14. Programs were selecting different subsets of standard measures
     - While the programs may have been primarily using standard, NQF-endorsed measures, they were not selecting the same standard measures
     - Not one measure was used by every program
       - Breast Cancer Screening was the most frequently used measure, and it was used by only 30 of the programs (63%)
     [Venn diagram of overlapping Programs A-E]

  15. This phenomenon emerged relatively quickly
     - In 1989, large employers were complaining that they had no data to assess the value generated by their health plans.
     - In response, a group of health plans decided to work with a small group of large employers to identify standard measures that could be used to demonstrate value.
     - The result of that effort, HEDIS “1.0”, was released in 1991 with a limited number of quality measures, most of which were focused on prevention.

  16. And now here we are today
     - 25 years later, we are awash in measures!
     - A 2016 cataloguing of federal measure sets and two selected other national measure sets yielded 470 distinct measures in use, not including the MIPS specialist measures.
     - 650 measures have been endorsed by NQF.

  17. Many National Measure Sets…
     Federal sets:
     - CMMI Comprehensive Primary Care Plus (CPC+)
     - CMMI SIM Recommended Model Performance Metrics
     - CMS Core Set of Children's Health Care Quality Measures for Medicaid and CHIP (Child Core Set)
     - CMS Core Set of Health Care Quality Measures for Adults Enrolled in Medicaid (Medicaid Adult Core Set)
     - CMS Core Quality Measures Collaborative - ACO and PCMH/Primary Care Measures
     - CMS Health Home Measure Set
     - CMS Hospital Value-Based Purchasing
     - CMS Medicare Hospital Compare
     - CMS Medicare-Medicaid Plans (MMPs) Capitated Financial Alignment Model (Duals Demonstrations)
     - CMS Medicare Part C & D Star Ratings Measures
     - CMS Medicare Shared Savings Program (MSSP) ACO
     - CMS Merit-based Incentive Payment System (MIPS) - General Practice/Family Medicine, Internal Medicine, and Pediatrics only
     - CMS Physician Quality Reporting System (PQRS), CMS EP EHR Incentive Clinical Quality Measures (eCQMs), CMS Cross-Cutting Measures
     Non-federal national sets:
     - Catalyst for Payment Reform Employer-Purchaser Measure Set
     - Joint Commission Accountability Measure List

  18. The impact of non-alignment has been negative
     - Providers can't respond to the large number of measures they are being asked to improve.
     - The measure-related requirements on providers for coding claims and medical records are a significant contributor to primary care burnout. Why? Because a very high percentage of performance measures are directed to primary care, not specialty care.
     - There is lots of frustration with “checking the boxes.”
     - The constant change in measure set composition only worsens the problem for providers.

  19. The response: measure set alignment
     - Troubled efforts at the national level:
       - AHIP/CMS Core Measure Set: developed with NQF, with no provider, state, consumer, or employer involvement
       - CPR Employer-Purchaser Measure Set: developed for purchasers
       - IOM's Core Metrics: a conceptual framework more than a measure set
       - Many Medicare measures: MACRA/MIPS, MSSP, Stars, etc. create an impetus for multi-payer use, but they don't fit Medicaid and commercial populations very well
     - In the midst of this mess, states have been trying to find a way to create measure rationality.

  20. State Measure Set Alignment: Washington
     - 2014 state legislation requiring development of a statewide core measure set led to creation of the Washington State Common Measure Set for Health Care Quality and Cost.
     - A “starter measure set” of 52 measures was developed in six months (expanded in 2016 to 55 with added behavioral health measures).
     - A coordinating committee oversees three technical work groups: acute care, chronic care, and hospital care.
     - Broad scope: interested in measures for payment, monitoring, public health use, and more.
     - Final measure groupings: population, clinical, and cost.

  21. Washington State's Process
     1. Review aligned measures already commonly used in Washington State and in national measure sets.
     2. Agree upon key topic areas (domains) to organize those and other measures for review.
     3. Review the entire list, by domain, and discuss whether to include each measure (yes/maybe/no).
     4. Take a second pass through the “yes” and “maybe” lists.
     5. Review additional measures recommended by group members and non-group members and determine whether to consider them.
     6. Review the entire list and narrow recommended measures to a targeted number of measures.
     7. Put out for public comment and then make final decisions.

  22. State Measure Set Alignment: Rhode Island
     - Work initiated in 2015 at the direction of the state's SIM Steering Committee.
     - Focus on adopting common commercial and Medicaid measure sets to be used for financial consequences in value-based contracts.
     - Over nine months, three measure sets were developed:
       - ACO, primary care, and hospital
       - The ACO measure set was drawn from the primary care and hospital measure sets
     - During 2016, two additional specialty measure sets:
       - Maternity care
       - Behavioral health
