Webinar on Meta-evaluation Approaches to Improve Evaluation Practice - - PowerPoint PPT Presentation



SLIDE 1

Webinar on Meta-evaluation Approaches to Improve Evaluation Practice

25 October 2019


Mónica Lomeña-Gelis, Principal Evaluation Officer at Independent Development Evaluation of the African Development Bank Group, Abidjan (Ivory Coast)

María Bustelo Ruesta, Director of the Master on Evaluation of Programmes and Public Policies and Professor of Political Science and Public Administration, Universidad Complutense de Madrid, Spain

SLIDE 2

Meta-evaluation: the concept

• Michael Scriven, "Evaluation Thesaurus":

"The evaluation of evaluations - indirectly, the evaluation of evaluators - represents both an ethical and a scientific obligation when the wellbeing of others is at stake."

• Joint Committee on Standards for Educational Evaluation:

  • 1994. Standard A12 Metaevaluation: "The evaluation itself should be formatively and summatively evaluated against these and other pertinent standards, so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses."

  • 2011. Standards E2 Internal Metaevaluation and E3 External Metaevaluation

• Michael Q. Patton:

"The evaluation of the evaluation based on a series of norms and professional principles."

• Cooksy & Caracelli:

"Systematic reviews of evaluations to determine the quality of their processes and results."

SLIDE 3

There has been more focus on evaluation synthesis methodologies around evaluation results (Olsen & O’Reilly, 2011).

What are other meta-evaluative approaches?

Source: modified from Olsen & O'Reilly (2011).

Evaluation synthesis (synthesis evaluation): Summarizing evaluation results

Narrative/Research review: Descriptive account for summarizing findings

Meta-analysis: Statistical procedure for comparing findings of quantitative evaluations

Systematic review: Use of a rigorous peer-review protocol to summarize evidence around a research question

Meta-evaluation: Evaluation of evaluations (their designs, processes, results and utilization)

SLIDE 4

It is important to distinguish between two very different exercises:

EVALUATION SYNTHESIS: synthesizing evaluation RESULTS (of which meta-analysis is a type). The focus is on the interventions and policies.

METAEVALUATION: evaluation of evaluation PROCESSES (how evaluation is conceived, done and used). The focus is on the evaluation of those interventions and policies.

Source: Bustelo, M. (2002) Meta-evaluation as a tool for the improvement and development of the evaluation function in public administrations, Paper presented at the European Evaluation Society Biennial Conference at Sevilla, Spain, October 2002. https://evaluationcanada.ca/distribution/20021010_bustelo_maria.pdf

SLIDE 5

MEv functions

• 1. Quality control of evaluations

"Who evaluates the evaluator?" (Scriven). It relates to controlling the bias of the evaluator and to ensuring the credibility of evaluations.

• 2. Comparative analysis of the evaluation function in various countries (Rist, 1990; Ballart, 1993; Derlien, 1998)

Rather than focusing on the quality of the studied evaluations, it focuses on their contribution to the development of that evaluation function in a policy field, an organization or institution, or a political system.


SLIDE 6

MEv functions (II)

• 3. Choice of which evaluation results can be synthesized

The knowledge about the quality of evaluations that MEv generates can be used to support decisions about which studies to include in an evaluation synthesis.

• 4. Identification of evaluation training needs

The MEv of multiple studies helps to identify the strengths and weaknesses of evaluative practice in order to develop evaluation capacity programmes.

SLIDE 7

Types of MEv

Source: adapted from Bustelo, 2001; Bustelo, 2002; Stufflebeam, 1974 & 2001; Cooksy & Caracelli, 2005; Scriven, 2011; Yarbrough et al., 2011.

SLIDE 8

Types of MEv

Source: adapted from Bustelo, 2001; Bustelo, 2002; Stufflebeam, 1974 & 2001; Cooksy & Caracelli, 2005; Scriven, 2011; Yarbrough et al., 2011.

SLIDE 9

Types of MEv (II)

Source: adapted from Bustelo, 2001; Bustelo, 2002; Stufflebeam, 1974 & 2001; Cooksy & Caracelli, 2005; Scriven, 2011; Yarbrough et al., 2011.

SLIDE 10

First example: MEv of gender policies in Spain

• Unit of analysis: eleven gender equality plans (evaluated or not), plus a discourse analysis about evaluation in the national agencies executing the gender plans.

Meta-evaluation criteria (analysis dimensions):

• 1. Evaluation planning and evaluative strategies
  - Responsiveness to their context
  - Clarity of the evaluation objectives
  - Institutional structures for the evaluation
  - Different types of evaluations used
  - Resources used in evaluations

• 2. Key elements of the evaluations
  - Stakeholders involved in the evaluation processes
  - Moment and timing of the evaluation
  - Evaluation criteria and indicators
  - Procedures and tools for data collection and analysis

• 3. Utilization and impact of evaluations
  - Adequacy and usefulness of the produced information
  - Communication and dissemination of evaluation results
  - Impact of the evaluation on policies and organizations

SLIDE 11

The logic of those evaluation questions, and for judging the evaluation processes, was built around six main criteria:

1. The centrality of the evaluation process in the institution conducting the evaluation;
2. Responsiveness of the evaluation to the plan or policy context, and clarity (explicitness) of the evaluation purposes;
3. Clarity and centrality of the evaluation criteria (of what is evaluated): the techniques for data collection and analysis should be chosen after the evaluation criteria are defined, and not vice versa;
4. Adequate management of evaluation resources, including (i) good use of the different types of evaluation, (ii) the existence of adequate coordination structures that allow reliable and collaborative information gathering, (iii) good management of times and timetables, and (iv) sufficient investment of resources in evaluation;
5. Sufficient elaboration of the information gathered during the evaluation processes (systematic judgement of the information in the light of the previously set evaluation criteria);
6. The existence of good processes for communicating and disseminating the evaluation results and reports.

SLIDE 12

Gender Responsive evaluation

It is necessary to distinguish between:

• Evaluation of gender policies

As a policy tool, evaluation might be especially fruitful for capturing the important changes and shifts in gender policies, for improving them, and for answering to what extent these policies are successful. As an integral part of the intervention, evaluation might guide developments, further needs and new areas for development.

• Evaluation from a gender perspective

As part of the policy-making process, and following the aim of the gender mainstreaming strategy, evaluation is itself an important element to be conducted from a gender perspective, with a gender lens.

SLIDE 13

Gender Responsive (Meta)evaluation

How is gender included in the different evaluation phases?

SLIDE 14

Second example: MEv of 40 evaluations in Senegal

Twelve MEv criteria covering evaluation design, process, results and utilization, with associated dimensions and rubrics.

SLIDE 15

Actual practice of MEv in development aid evaluation

SLIDE 16

Evaluation standards, checklists and guidelines used in the MEv

SLIDE 17
SLIDE 18
SLIDE 19
SLIDE 20

Conclusions about the usefulness of MEv

• MEv can be useful for the improvement and development of the evaluation function in many settings, especially in settings with a limited evaluation culture and a low level of evaluation institutionalization;

• The use of the standards, guidelines and professional competencies of the evaluation discipline can guide critical reflection about a set of real-world evaluations, surpassing a narrow conception of evaluation quality;

• The review of evaluation reports needs to be complemented with interviews in order to grasp dimensions related to evaluation utilization and to better understand the constraints of real-world evaluation processes (evaluation design vs. real delivery, responsiveness to the information needs of different audiences, etc.);

• Following the trends of evaluation professionalization, research whose object of study is the evaluation function can help improve its usefulness to public policy making and development effectiveness.

SLIDE 21

Some bibliographic resources

• AEA. (2004). American Evaluation Association Guiding Principles for Evaluators. Fairhaven, MA, USA: American Evaluation Association. www.eval.org

• AfrEA. (2007a). African Evaluation Guidelines: Standards and Norms. Niamey. www.afrea.org

• Baba, T. (2007). Meta-evaluation report of research studies, evaluations and reviews conducted by the UNICEF Pacific Office during programme cycle 2003-2007. Suva. http://www.unicef.org/pacificislands/resources_9975.html

• Baslé, M. (2013). Méta-évaluation des politiques publiques et qualité des évaluations. In Séminaire du Réseau des chercheures en évaluation des politiques publiques de la Société Française d'Evaluation.

• BMZ. (2009). Evaluation in German Development: A system's review. Bonn. www.bmz.de

• Bustelo, M. (2002). Metaevaluation as a tool for the improvement of the evaluation function in public administrations. In European Evaluation Society Conference (pp. 1-15). https://evaluationcanada.ca/distribution/20021010_bustelo_maria.pdf

• CES. (2010). Competencies for Canadian Evaluation Practice. Canadian Evaluation Society. www.evaluationcanada.ca

• DAC-OECD. (2010). DAC Quality Standards for Development Evaluation. http://doi.org/10.1787/9789264083905-en

• DANIDA. (2004). Meta-Evaluation: Private and business sector development interventions. www.evaluation.dk

• Eriksson, J. (2011). A Meta-Evaluation of USAID Foreign Assistance Evaluations. Washington DC. www.usaid.org

SLIDE 22

Some bibliographic resources (II)

• IDEAS. (2012). Competencies for Development Evaluation Evaluators, Managers, and Commissioners. International Development Evaluation Association. www.ideas-int.org

• Ingram, G., Fostved, N., & Lele, U. (2003a). The CGIAR at 31: An Independent Meta-Evaluation of the Consultative Group on International Agricultural Research. The CGIAR in Africa: Past, Present, and Future. www.cgiar.org

• Ingram, G., Fostved, N., & Lele, U. (2003b). The CGIAR at 31: An Independent Meta-Evaluation of the Consultative Group on International Agricultural Research. Vol. 1: Overview report. Washington DC. www.cgiar.org

• Lele, U., Barrett, C., Eicher, C. K., & Gardner, B. (2003). The CGIAR at 31: A Meta-Evaluation of the Consultative Group on International Agricultural Research. Volume 3: Annexes. www.cgiar.org

• Lomeña-Gelis, M. (2015). A Meta-evaluation of Sustainable Land Management Initiatives in Senegal. PhD thesis on Sustainability, University Research Institute for Sustainability Science and Technology, Universitat Politècnica de Catalunya (Spain). https://upcommons.upc.edu/handle/2117/95787

• Olsen, K., & O'Reilly, S. (2011). Evaluation Methodologies: A brief review of meta-evaluation, systematic review and synthesis evaluation methodologies and their applicability to complex evaluations within the context of international development (Vol. 44). www.iodparc.com

SLIDE 23

Some bibliographic resources (IV)

• Shah, F., & Patch, J. (2011). Meta-Review of AusAID Education Sector Evaluations, 2006-2011. http://auserf.com.au/wpcontent/files_mf/1367729332ERF10239_EducationMetaEvaluation.pdf

• Sheeran, A. (2008). UNICEF Child Protection Meta-Evaluation. Seattle. www.unicef

• Stufflebeam, D. L. (1999). Program Evaluations Metaevaluation Checklist (based on the Program Evaluation Standards). www.wmich.edu

• Stufflebeam, D. L. (2007). CIPP evaluation model checklist. Evaluation Checklists Project. www.wmich.edu/evalctr/checklists

• UNEG. (2005). Norms for Evaluation in the UN System. New York: United Nations Evaluation Group. www.uneval.org

• UNEG. (2010a). Quality Checklist for Evaluation Reports. New York: United Nations Evaluation Group. www.uneval.org

• UNEG. (2010b). Quality Checklist for Evaluation Terms of Reference and Inception Reports. New York: United Nations Evaluation Group. www.uneval.org

• UNIDO. (2010). Meta Evaluation UNIDO Integrated Programmes, 2007-2009. Vienna. www.unido.org

• Universalia. (2003). Meta-Evaluation: An analysis of IUCN Evaluations, 2000-2002. www.iucn.org

• Wingate, L. A. (2009). The Program Evaluation Standards applied for meta-evaluation purposes: Investigating interrater reliability and implications for use. Western Michigan University. www.wmich.edu

SLIDE 24

Some bibliographic resources (V)

• World Bank. (2011). Writing terms of reference for an evaluation: A how-to guide (IEG Blue Booklet Series). Washington DC. www.worldbank.org

• Wörlen, C. (2011). Meta-Evaluation of climate mitigation evaluations. Washington DC. www.thegef.org

• Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The Program Evaluation Standards: A guide for evaluators and evaluation users. Joint Committee on Standards for Educational Evaluation. Thousand Oaks: SAGE Publications. www.jcsee.org

• https://www.slideshare.net/DawitWolde/meta-evaluation-and-evaluation-disseminationmanual-3

• https://www.slideshare.net/davidpassmore/metaevaluation-theory

• https://unesdoc.unesco.org/ark:/48223/pf0000247262

• https://www.russellsage.org/publications/handbook-research-synthesis-and-meta-analysis-second-edition

• https://read.oecd-ilibrary.org/development/strengthening-accountability-in-aid-for-trade/the-oecd-meta-evaluation-overview-of-evaluations_9789264123212-8-en#page1

SLIDE 25

To know more about synthesis and meta-analysis in development evaluation: https://www.evalforward.org/events/webinar-use-synthesis- and-meta-analysis-development-evaluation

SLIDE 26

Thank you, merci, gracias!

Mónica Lomeña-Gelis, Principal Evaluation Officer, Independent Development Evaluation (IDEV), African Development Bank Group. m.lomena-gelis@afdb.org

María Bustelo, Director of the Master on Evaluation of Programmes and Public Policies, Universidad Complutense de Madrid. mbustelo@cps.ucm.es