

SLIDE 1
Evaluation Café: A Study of Three Developmental Evaluations

This presentation is made possible by the support of the American People through the United States Agency for International Development (USAID). The contents of this presentation are the sole responsibility of the DEPA-MERL Consortium and do not necessarily reflect the views of USAID or the United States Government.

March 2020

SLIDE 2

Who we are

SLIDE 3

Before we begin,

  • How familiar are you with Developmental Evaluation (DE)?
  • How familiar are you with Outcome Harvesting?
  • How many of you are interested in international evaluation work?
SLIDE 4

Our agenda today,

  • Introduction to DEPA-MERL
  • Overview of research questions & methods
  • Key lessons learned
  • Audience questions

SLIDE 5

Introduction to DEPA-MERL

  • WDI has been working with USAID since 2015
  • MERLIN Program:
    – Innovates on traditional approaches to monitoring, evaluation, research, and learning (MERL)
  • DEPA-MERL is a consortium under MERLIN that implements developmental evaluations in the USAID context

SLIDE 6

What is developmental evaluation?

  • Supports the continuous adaptation of development interventions in complex environments
  • Developmental Evaluators are “embedded” within the program
    – Use a variety of M&E approaches

SLIDE 7

How we applied DE

  • Three DE pilots were conducted:
    – Family Care First in Cambodia
    – Sustained Uptake DE, the US Global Development Lab (Washington, DC)
    – Bureau for Food Security (Washington, DC)
  • Each DE pilot included:
    – 1 full-time, external Developmental Evaluator
    – A “DE Administrator” from DEPA-MERL

SLIDE 8

Our research questions

RQ1: How does DE capture, promote, and enable the utilization of emergent learnings in support of ongoing programming in a complex system, in the USAID context?
  Methods: Outcome Harvesting (qualitative)
  Data sources:
  • Monthly reflection interviews with two Developmental Evaluators (n=35)
  • Substantiation interviews with key stakeholders at endline of two DEs (n=26)
  • Document reviews, as required

RQ2: What are the barriers and enablers to implementation of DE in the USAID context?
  Methods: Semi-structured interviews (qualitative)
  Data sources:
  • Monthly reflection interviews with two Developmental Evaluators (n=35)
  • Substantiation interviews with key stakeholders at endline of two DEs (n=26)

RQ3: What is the perceived value of conducting a DE, especially versus a traditional evaluation approach?
  Methods: Survey (quantitative and qualitative)
  Data sources:
  • Value of Developmental Evaluation Survey with DE stakeholders at endline (n=30)

SLIDE 9

Outcome harvesting approach

SLIDE 10

What was the most surprising finding from our research on DE?


SLIDE 11

What is the biggest difference between DE in theory and DE in practice?

SLIDE 12

If you were approached by a new DE implementer, what advice would you give them about how to successfully launch and oversee a DE?

https://bit.ly/32kJGOA

SLIDE 13

What is the most controversial idea or topic in DE right now?


SLIDE 14

Resources!

https://bit.ly/2JXCUru https://bit.ly/2VkTbgO

SLIDE 15

Questions?

SLIDE 16

Contact us!

USAID-MERLIN (DEPA-MERL): Shannon Griswold, sgriswold@usaid.gov
WDI: WDI-performancemeasurement@umich.edu
WDI website: https://bit.ly/2NtuVU7

SLIDE 17

Annex

SLIDE 18

What is developmental evaluation?

  • Developmental Evaluation (DE) is an approach to evaluation that supports the continuous adaptation of development interventions.
  • DE provides evaluative thinking and timely feedback to inform ongoing adaptation as needs, findings, and insights emerge in complex, dynamic situations.
  • The DE helps facilitate the move from findings to action through a collaborative process with the DE stakeholders.

SLIDE 19

What outcomes can you expect from DE?

[Diagram: Intended Outcomes, Implemented Outcomes, Unrealized Outcomes, Emergent Outcomes, and Realized Outcomes]

Source: Henry Mintzberg, Sumantra Ghoshal, and James B. Quinn, The Strategy Process, Prentice Hall, 1998.

SLIDE 20

How is DE different?

Purpose
  Traditional Evaluation: Supports improvement, summative tests, and accountability
  Developmental Evaluation: Supports development of innovation and adaptation in dynamic environments

Standards
  Traditional Evaluation: Methodological competence and commitment to rigor; independence; credibility with external authorities
  Developmental Evaluation: Methodological flexibility and adaptability; systems thinking; creative and critical thinking balanced; high tolerance for ambiguity; able to facilitate rigorous, evidence-based perspectives

Options
  Traditional Evaluation: Traditional research and disciplinary standards of quality dominate options
  Developmental Evaluation: Utilization focused: options are chosen in service to developmental use

Evaluation Results
  Traditional Evaluation: Detailed formal reports; validated best practices, generalizable across time and space
  Developmental Evaluation: Rapid, real-time feedback; diverse, user-friendly forms of feedback; evaluation aims to nurture learning

SLIDE 21
What is a developmental evaluator?

  • The DE evaluator works collaboratively with implementing teams to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development.
  • The DE evaluator thinks and engages evaluatively; questions assumptions; applies evaluation logic; uses appropriate methods; and stays empirically grounded: that is, rigorously gathers, interprets, and reports data.

SLIDE 22

Analysis Categories for the Harvested Outcomes (RQ1)

SLIDE 23

Barriers and Enablers (RQ2)

Factor (% of all enablers* / % of all barriers*)
  Skills of the Developmental Evaluator: 17% / 8%
  Data collection and sharing: 16% / 10%
  Data utilization: 12% / 8%
  Integration of the Developmental Evaluator: 11% / 10%
  Leadership (of program being evaluated): 11% / 15%
  Stakeholder relationships: 9% / 11%
  DE readiness: 8% / 8%
  USAID dynamics: 7% / 14%
  Funding dynamics: 2% / 6%
  Local and international dynamics: 2% / 4%

* Percentages do not total 100% because only 10 of the 13 factors coded across the DE pilots are shown.

SLIDE 24

Value of Developmental Evaluation Survey (RQ3) (1/2)

Takeaway: In all areas except DE cost and time savings, survey respondents said the DEs were better than traditional evaluation (n=29)

SLIDE 25

Value of Developmental Evaluation Survey (RQ3) (2/2)

Takeaway: The overwhelming majority of respondents reported positive interactions with their Developmental Evaluators.

SLIDE 26

Description of codes (1/2)

SLIDE 27

Description of codes (2/2)

SLIDE 28

The DEPA-MERL DE pilots