How Should CMMI Evaluations Attempt to Account for Model Overlap? - PowerPoint PPT Presentation



SLIDE 1

How Should CMMI Evaluations Attempt to Account for Model Overlap?

Workgroup members: Gregory Boyer, Susannah Cafardi, Philip Cotterill, Tim Day, Franklin Hendrick, Jennifer Lloyd, Patricia Markovich, Kelsey Weaver

SLIDE 2

Objective

Develop a strategy to identify and understand the impact of CMMI model tests in a constantly changing landscape that includes rule changes and co-occurring models and initiatives.

SLIDE 3

Evaluation

Goal:

  • Obtain an accurate and unbiased estimate of the impact of the model test on measures related to:

– Quality of care
– Spending
– Health outcomes

Quantitative methods:

  • Require a comparison group for credibility

– Compare estimates from the intervention group to those of a comparison group
– The comparison group serves as the counterfactual of the intervention group
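The comparison-group logic above is, in effect, a difference-in-differences estimate: the comparison group supplies the counterfactual trend, and the model's impact is the intervention group's change net of that trend. A minimal sketch on synthetic numbers (all values are illustrative, not from any CMMI evaluation):

```python
# Difference-in-differences: the comparison group supplies the counterfactual
# trend, so the model's impact is the intervention group's change minus the
# comparison group's change. All numbers below are synthetic and illustrative.

# Mean annual spending per beneficiary (hypothetical values)
spend = {
    ("intervention", "baseline"): 10_000,
    ("intervention", "performance"): 10_300,
    ("comparison", "baseline"): 10_050,
    ("comparison", "performance"): 10_650,
}

def did_estimate(outcomes):
    """Change in the intervention group minus change in the comparison group."""
    delta_int = (outcomes[("intervention", "performance")]
                 - outcomes[("intervention", "baseline")])
    delta_cmp = (outcomes[("comparison", "performance")]
                 - outcomes[("comparison", "baseline")])
    return delta_int - delta_cmp

print(did_estimate(spend))  # -300: spending grew $300 less than the counterfactual
```

The estimate is credible only insofar as the comparison group's trend is what the intervention group would have experienced without the model, which is exactly why the design choices on the following slides matter.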

SLIDE 4

Counterfactual: What would have happened without the intervention?

SLIDE 5

Considerations for Addressing Overlap in the Model Design Phase

SLIDE 6

The Impact of Model Design Decisions

  • Need to know up front what questions we want answered, and set up the design to answer those questions

– Beneficiary Engagement and Incentives (BEI) Model

  • May require development of a model that is larger in scope; other factors may influence whether the sample required to examine interactions is feasible or desired

– Comprehensive Primary Care Plus (CPC+)

  • Must also consider the implications of model design choices on existing models

SLIDE 7

The Impact of Model Design Decisions

  • Design decisions can substantially impact the ability of the comparison group to serve as a true counterfactual
  • Isolating the intervention's impact requires comparison group(s) of non-participants similar to intervention participants
  • Restricting eligibility for the model complicates comparison group construction:

– Allowing the comparison group to participate in the prohibited overlapping initiative may turn the evaluation into a comparison of the model vs. another intervention
– However, applying the same restrictions may eliminate too much of the similar population from comparison eligibility

SLIDE 8

To Randomize or Not to Randomize? That is the question.

Advantages:

  • Improves comparability of intervention and comparison groups on factors not matched on (not available in claims)
  • Generates an appropriate counterfactual
  • Ensures intervention status is independent of participation in other initiatives at baseline

– Helpful for those initiatives not in CMMI data

Potential challenges:

  • Not always a viable option
  • May not fully address differences between intervention and comparison groups
  • Will require an increased number of participants to reach the model's target
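The baseline-independence advantage can be sketched in a few lines. The beneficiaries, the `social_need` flag, and all rates below are hypothetical stand-ins for a factor not observable in claims, not CMMI data:

```python
import random

# Hypothetical beneficiaries; 'social_need' stands in for a characteristic
# that is not available in claims data. Purely illustrative, not CMMI data.
random.seed(7)
beneficiaries = [
    {"id": i, "social_need": random.random() < 0.3} for i in range(10_000)
]

# Simple randomization: each beneficiary is independently assigned an arm,
# so assignment is independent of social_need (and of participation in
# other initiatives) by construction -- no matching data required.
for b in beneficiaries:
    b["arm"] = random.choice(["intervention", "comparison"])

def need_rate(arm):
    """Share of an arm's beneficiaries with the unobserved characteristic."""
    group = [b for b in beneficiaries if b["arm"] == arm]
    return sum(b["social_need"] for b in group) / len(group)

# With 10,000 beneficiaries, the two arms' rates should be nearly equal,
# even though no one ever observed social_need when forming the groups.
print(round(need_rate("intervention"), 3), round(need_rate("comparison"), 3))
```

This is why randomization "improves comparability on factors not matched on": balance arises from the assignment mechanism itself, not from any data the evaluator holds.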

SLIDE 9

Randomization: Example

Identification of an Appropriate Comparison Group – Accountable Health Communities (AHC)

Advantages:

  • Intervention and comparison beneficiaries are more likely to be similar

– Reduces the need for matching data

  • Improves the ability to isolate the impact of the model, with an increased level of certainty and fewer caveats
  • Allows us to obtain data on comparison beneficiaries beyond what is readily available in claims

Challenges:

  • Impossible to ascertain social needs data from claims
  • Difficult for providers to identify a social need in comparison beneficiaries without addressing it
  • Variation in how randomization is implemented
SLIDE 10

Non-Randomization: Example

Identification of an Appropriate Comparison Group – Comprehensive Primary Care Plus (CPC+)

                 Pre-Intervention Landscape    Performance Years
  Intervention   CPC, OCM, MSSP                CPC+, OCM, MSSP
  Comparison     CPC, OCM, MSSP                OCM, MSSP

What are the implications of MSSP inclusion?
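Without randomization, a comparison group is typically constructed by matching non-participants to participants on observable characteristics. A minimal greedy nearest-neighbor sketch; the practices, field names, and distance weights are all invented for illustration, not CPC+ methodology:

```python
# Greedy 1:1 nearest-neighbor matching on observable characteristics, one
# common way to build a comparison group when randomization is not an option.
# All practices, fields, and weights below are synthetic and illustrative.

participants = [
    {"id": "P1", "beneficiaries": 1200, "prior_spend": 9_800},
    {"id": "P2", "beneficiaries": 450,  "prior_spend": 11_200},
]
candidates = [
    {"id": "C1", "beneficiaries": 1150, "prior_spend": 10_000},
    {"id": "C2", "beneficiaries": 470,  "prior_spend": 11_050},
    {"id": "C3", "beneficiaries": 3000, "prior_spend": 8_000},
]

def distance(a, b):
    # Scale each characteristic so a unit of distance is comparable across fields.
    return (abs(a["beneficiaries"] - b["beneficiaries"]) / 100
            + abs(a["prior_spend"] - b["prior_spend"]) / 500)

def match(participants, candidates):
    """Match each participant to its nearest unused candidate, without replacement."""
    pool = list(candidates)
    pairs = {}
    for p in participants:
        best = min(pool, key=lambda c: distance(p, c))
        pairs[p["id"]] = best["id"]
        pool.remove(best)
    return pairs

print(match(participants, candidates))  # {'P1': 'C1', 'P2': 'C2'}
```

The sketch also shows the limitation the slides raise: matching can only balance what is observable, so overlapping-initiative participation (like MSSP membership above) must be handled explicitly as a matching variable or an eligibility rule.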

SLIDE 11

Generalizability

When model participants are not reflective of the larger healthcare landscape (e.g., they are high performers or early adopters), this limits our ability to generalize to other populations and properly assess the scalability of the model.

SLIDE 12

Considerations for Addressing Overlap in the Evaluation Design Phase

SLIDE 13

Similarity of the Initiatives and Expected Impacts on Outcomes

SLIDE 14

The Counterfactual is not Static

SLIDE 15

The Counterfactual is not Static

  • CMMI implements models in an ever-changing healthcare landscape

– Drivers of change include:

  • Natural evolution of clinical practice
  • New models
  • Changing MA penetration rates
  • Medicare programs
  • State health initiatives

  • If well matched at baseline, changes observed in the comparison group should reflect what would have happened to the intervention group in the absence of the intervention

– That said, these changes have the potential to substantially impact the interpretation of the evaluation results

  • Qualitative data and thoughtful supplemental analyses are essential for understanding the context within which the model is operating

SLIDE 16

Dynamic Counterfactual: Example

Importance of the Counterfactual – Health Quality Partners, Medicare Coordinated Care Demonstration (MCCD) Site

  • Randomized trial

– Half of enrollees received coordinated care management
– Half received usual care

  • 2002-2010

– Individuals in the program had significantly lower hospitalizations and spending
– Program was viewed as very successful and was continued

  • 2010-2014

– Program did not reduce hospitalizations or generate Medicare savings

The differences disappeared in the later period because "usual care" changed: the ACA's Hospital Readmissions Reduction Program took effect and hospitals introduced their own care coordination programs.

SLIDE 17

Telling the Story: The Role of Qualitative Data

A strong qualitative approach can help inform and interpret the quantitative analyses:

– What else is happening? What does the landscape look like?

  • How significantly are co-occurring initiatives impacting the model and its outcomes?

– What is changing over time?
– Do the results make sense given the timeframe of the intervention?
– Could external factors be driving the results?

SLIDE 18

Conclusions

  • We must thoughtfully consider the role of co-occurring initiatives throughout the design of the model and the evaluation
  • Making informed decisions and understanding their impact on evaluation results requires:

– That we ask the right questions
– That we are able to determine what to do with the answers

  • The development and evaluation of each model face a unique set of challenges related to co-occurring initiatives
  • Interpreting model results in the context of similar models with similar goals often requires more framing and additional caveats

– That is: a given model's effects are measured compared to what, exactly?