Joint Scandinavian Evaluation of support to Capacity Development


SLIDE 1

Joint Scandinavian Evaluation

of support to

Capacity Development

Øyvind Eggen Peter Jul Larssen Lennart Peck

SLIDE 2

Scandinavian Joint Mission

SLIDE 3

The evaluation's design and timeline

Timeline (2013–2015): Initial Joint Planning → Pre-study (one per agency: Danida, Norad, Sida) → joint ToR → Main evaluation (one per agency) → Synthesis report → Dissemination

SLIDE 4

Organisation

  • A simple MoU
  • Each agency responsible for its own parts
  • No mixing of funds
  • A joint “management group”
  • Series of meetings at crucial points
  • External Advisor on CD
SLIDE 5

As compared to single donor evaluations

+ Stronger evidence base
+ Efficiency gains through division of labour
+ Additional perspectives – a richer process
+ Increased attention and status within agencies
+ Relations between evaluation units strengthened

– Requires compromise
– More coordination
SLIDE 6

As compared to “traditional” joint evaluation

+ A minimum of formalities
+ Work and responsibilities more equally divided
+ Flexibility to include agency-specific concerns
+ 3 stand-alone reports of relevance for each agency

– Not full use of the “scale”
– Less admin in some respects – more in others
– Potentially difficult to synthesize
SLIDE 7

Reflections

SLIDE 8

Joint Scandinavian Evaluation

of support to

Capacity Development

Conceptual approach and methodology

Øyvind Eggen Peter Jul Larsen Lennart Peck

SLIDE 9

Capacity is understood as the ability of people, organizations and society as a whole to manage their affairs successfully. ‘Capacity development’ is understood as the process whereby people, organizations and society as a whole unleash, strengthen, create, adapt and maintain capacity over time.

DAC (2006). The Challenge of Capacity Development: Working Towards Good Practice. Paris, OECD.

SLIDE 10

SLIDE 11

Key questions

  • Are we talking about the same thing when we talk about support to capacity development?

  • How to select cases/samples from an enormous, heterogeneous universe with porous boundaries (Norway: 1625 database entries)?

  • How to compare across institutions?
  • Is it at all possible to measure effectiveness?


SLIDE 12

Pre-studies

  • Literature review (Sida)
  • Methodology study (Norad)
  • Conceptual framework (Danida)
  • Concept paper
  • Specification of methodology
  • Joint ToR

Available at www.sida.se/joint-cd-evaluation


SLIDE 13

The Theory of Change hypothesis

  • Donor support to capacity development is (more) effective when it:
  i. fits the drivers for and constraints to change (“builds on what is there”);
  ii. is accepted as legitimate and appropriate;
  iii. uses results sensibly to measure progress, correct course and learn; and
  iv. looks beyond the “supply-side” to broader accountability relations.


SLIDE 14

Selection

  • Limitation to public sector
  • Joint selection criteria, different processes
  • Three sets of samples:

1. Portfolio screening of 100 interventions (Danida 30, Sida 29, Norad 41).
2. Desk-based reviews of 46 interventions (Danida 15, Sida 14, Norad 17).
3. Country visits to nine countries (Danida: Uganda, Tanzania and Nepal; Norad: Malawi, Mozambique, Vietnam; Sida: Kenya, Cambodia, Bosnia-Herzegovina).


SLIDE 15

Data collection

  • ‘Score cards’: approx. 20 standardised variables (quantitative + qualitative)
  • Desk-based reviews: documents + interviews
  • In-depth studies during country visits
SLIDE 16

Lessons

  • Joint conceptual and analytical framework – a challenge with different evaluation teams
  • Standardized variables across interventions and teams – could it be done?
  • Desk-based reviews need more time and effort
  • How to make it possible and meaningful to compare and synthesize experiences from CD across institutions?
