

slide-1
SLIDE 1

Developing Evaluations That Are Used

Wellington | New Zealand February 17, 2017

slide-2
SLIDE 2

9540-145 Street Edmonton, Alberta, CA T5N 2W8 P: 780-451-8984 F: 780-447-4246 E: Mark@here2there.ca

2

slide-3
SLIDE 3

A Little Context

slide-4
SLIDE 4

Agenda

Six Conversations

  • Getting Grounded
  • Evaluation 101
  • User Oriented Design
  • The Design Box
  • Troika Consulting
  • Emerging Issues
slide-5
SLIDE 5

Getting Grounded

Conversation #1

slide-6
SLIDE 6

What questions are you bringing to today’s session?

TRIADS

slide-7
SLIDE 7

The Challenge of Use

Opening Comments

slide-8
SLIDE 8

1960s

  • 1969: “A dearth of examples of where evaluation informed policy.”
  • 1970: “A general failure” to use evaluation.
  • 1975: The influence of social science research and evaluation on program decisions has been, “with few exceptions, nil.”
  • 1976: A celebrated educational researcher published an article, “Evaluation: Who Needs It? Who Cares?”

slide-9
SLIDE 9

The 21st Century

  • 2003: Quebec researchers found few government officials made regular use of social science research and evaluation.
  • 2004: Two senior administrators and policy advisors write, “Why Measure: Nonprofits use metrics to show that they are efficient, but what if donors don’t care?”
  • 2005: The International Evaluation Gap Working Group finds few rigorous evaluation studies of the thirty-four billion dollars spent on foreign assistance that year.
  • 2006: Researchers estimate ten percent of evaluations helped shape the decisions of their intended users.

slide-10
SLIDE 10

When decisions are made using evaluative judgments, evaluation results are combined with other considerations to support decision making. Politics, values, competing priorities, the state of knowledge about a problem, the scope of the problem, the history of the program, the availability of resources, public support, and managerial competence all come into play in program and policy decision processes. Evaluation findings, if used at all, are usually one piece of the decision making pie, not the whole pie. Rhetoric about "data-based decision making" and "evidence-based practice" can give the impression that one simply looks at evaluation results and a straightforward decision follows.

Erase that image from your mind. That is seldom, if ever, the case.

slide-11
SLIDE 11

Five Types of Use

  • Instrumental Use: informs decisions
  • Process Use: encourages evaluative thinking
  • Conceptual Use: offers new insights
  • Non-Use: findings are ignored
  • Mis-Use: findings are selectively or improperly used

BAD! GOOD! In pairs: share an example of each.

slide-12
SLIDE 12

Evaluation 101

Conversation #2

slide-13
SLIDE 13

Key Points

  • 1. We are all capable of evaluative thinking, though formal evaluation is a living, multi-disciplinary art & science.
  • 2. There are no universal recipes for methods and indicators in evaluation: there are only steps, principles & standards, and emerging practices that help create them.
  • 3. Though social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, as primary users of evaluation they must productively participate in all the key steps of an evaluation.

slide-14
SLIDE 14

#1 We are all capable of evaluative thinking, though formal evaluation is a living, multi-disciplinary art & science.

slide-15
SLIDE 15

The Genesis of Evaluation

In the beginning, God created the heaven and the earth. And God saw everything that he made, “Behold,” God said, “it is very good.” And the evening and the morning were the sixth day. And on the seventh day, God rested from all His work. His archangel came then unto Him asking, “God, how do you know that what you have created is “very good”? What are your criteria? On what data do you base your judgment? Just exactly what results were you expecting to attain? And, aren’t you a little close to the situation to make a fair and unbiased evaluation?” God thought about these questions all that day and His rest was greatly disturbed. On the eighth day God said, “Lucifer, go to hell.” Thus was evaluation born in a blaze of glory. From Halcolm’s The Real Story of Paradise Lost.

slide-16
SLIDE 16

Blink Versus Think

“Blink” (aka Rapid Cognition) versus “Think” (aka Evaluative Thinking)

slide-17
SLIDE 17

Some Important Parts of Evaluative Thinking

Key Tasks

  • 1. Clarifying the intervention & unit of analysis
  • 2. Developing purpose & questions
  • 3. Developing a design
  • 4. Gathering, analyzing data
  • 5. Interpreting data
  • 6. Drawing conclusions
  • 7. Making recommendations
  • 8. Informing decisions

The Core Challenge

  • How will we ‘measure’ change?
  • How will we know if our efforts contributed to the change?
  • How will we ‘judge’ the outcomes?
slide-18
SLIDE 18

A Brief History of Evaluation

  • First 12,000 Years: Trial & Error (e.g. brewing beer & making bread)
  • Renaissance: empiricism (e.g. sun around earth or vice versa)
  • Enlightenment: experimentalism (e.g. scurvy).
  • Industrial: measurement (e.g., factories) & action learning (e.g. gangs)
  • Early 20th c: program evaluation (e.g., agriculture, education, propaganda)
  • 1960s: centralized experimentalism 2.0 (e.g., War on Poverty)
  • 1970s: diversity of approaches & paradigms; low levels of use
  • 1980s: shift to accountability purpose, downloading of responsibility
  • 1990s: bottom-up efforts (e.g., logic models), business ideas (e.g., social return on investment) and new managerialism
  • 2000s: renewed emphasis on complexity, social innovation, learning and evaluation culture

  • 2010s: sophisticated field, culture wars, unclear expectations
slide-19
SLIDE 19

Evaluation Today

The Evaluation Field

  • An evolving, diverse and ‘contested’ multi-disciplinary practice.
  • Focus on (at least) six purposes: e.g., monitoring, developmental, accountability.
  • Professionalization of the field.
  • Increasingly agreed-upon principles, standards and competencies, with variations.

Social Innovators

  • Desire (and pressure) to use evaluation for multiple purposes.
  • Limited in-house expertise and/or resources for evaluation.
  • Funders and social innovators unclear about what is reasonable to expect re: role and quality of evaluation.
  • Low level of evaluation use (aka credentialization).

slide-20
SLIDE 20

Evaluation Purposes

slide-21
SLIDE 21

There are steps!

  • Better Evaluation website
  • Utilization-Focused Evaluation (book)
  • American Evaluation Association

slide-22
SLIDE 22

Evaluation Standards

Canadian

1. Utility – does the evaluation generate useful data for the users?
2. Feasibility – is the evaluation design efficient and doable within the financial, technical and timing constraints?
3. Propriety – is the evaluation done in a way that is proper, fair, legal, right and just?
4. Accuracy – to what extent are the evaluation representations, propositions, and findings, especially those that support interpretations and judgments about quality, truthful and dependable?
5. Accountability – to what extent are evaluators and evaluation users focusing on documenting and improving the evaluation processes?

Aotearoa New Zealand

  • Respectful, meaningful relationships
  • Ethic of care
  • Responsive methodologies and trustworthy results
  • Competence and usefulness

slide-23
SLIDE 23

Evaluator Competencies

slide-24
SLIDE 24

#2

There are no universal recipes for methods and indicators in evaluation: there are only steps, principles & standards, and emerging practices that help create them.

slide-25
SLIDE 25

The Challenge

Forget Standardized Recipes. Think “Chopped Canada”.

slide-26
SLIDE 26

Examples

  • Toronto Region Immigrant Employment Council
  • Native Counselling Services of Alberta
  • The Energy Futures Lab & Circular Economy Lab

slide-27
SLIDE 27

The Design Box

  • User Purpose, Questions, Preferences
  • Time & Resource Constraints
  • Evaluation Standards
  • Evaluation Expertise & Experience

slide-28
SLIDE 28

From One of the World’s Best

[Evaluation] isn’t some particular method or recipe-like steps to follow. It doesn’t offer a template of standard questions. It’s a mindset of inquiry into how to bring data to bear on what’s unfolding so as to guide and develop the unfolding. What that means and the timing of the inquiry will depend on the situation, context, people involved, and the fundamental principle of doing what makes sense for program development.

Michael Quinn Patton. Developmental Evaluation. 2010: pp.75-6.

slide-29
SLIDE 29

#3

Social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, but as primary users of evaluation, they must productively participate in all the phases of an evaluation.
slide-30
SLIDE 30

An Evaluative Culture

An organization with a strong evaluative culture:

  • engages in self-reflection and self-examination:
      • deliberately seeks evidence on what it is achieving, such as through monitoring and evaluation,
      • uses results information to challenge and support what it is doing, and
      • values candor, challenge and genuine dialogue;
  • embraces evidence-based learning:
      • makes time to learn in a structured fashion,
      • learns from mistakes and weak performance, and
      • encourages knowledge sharing;
  • encourages experimentation and change:
      • supports deliberate risk taking, and
      • seeks out new ways of doing business.
slide-31
SLIDE 31

Appropriate Roles for Evaluation Design

SCOPING (e.g., confirming the intervention, clarifying purpose and audience, initial questions and constraints such as timelines and budget)
  • Social Innovator: “Flush out” the initial elements of the scope of work.
  • Evaluator: Work with social innovators to ‘flesh out’ the details.

DESIGN (e.g., methods, metrics, logistics)
  • Social Innovator: Provide input on the preferred methods and metrics; be willing to ‘test’ things; adjust well-developed practices.
  • Evaluator: Lead on all aspects of design, drawing on experience and adhering to principles and standards.

IMPLEMENTATION (e.g., gathering, analyzing, summarizing data)
  • Social Innovator: Depends on the evaluation design.
  • Evaluator: Depends on the evaluation design.

SENSE-MAKING & USE (e.g., interpreting, drawing conclusions, making judgements)
  • Social Innovator: Commit to taking the findings seriously and embedding them in decision-making.
  • Evaluator: Facilitate the process wherever and however possible.

slide-32
SLIDE 32

Evaluation Purposes

slide-33
SLIDE 33

User Oriented Design

Conversation #3

slide-34
SLIDE 34

Small Group Activity

How would you evaluate this micro-lending program in New York City?

slide-35
SLIDE 35

Your initial response to someone’s sweeping request to ‘evaluate’ something.

slide-36
SLIDE 36

The Idea

Utilization-Focused Evaluation + Design Thinking

A User Oriented Approach to Designing Evaluations

slide-37
SLIDE 37
  • UTILIZATION-FOCUSED EVALUATION (UFE) is evaluation done for and with specific intended primary users for specific, intended uses.
  • UFE begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration for how everything that is done, from beginning to end, will affect use.
  • Use concerns how real people in the real world apply evaluation findings and experiences in the evaluation process.

Michael Quinn Patton

slide-38
SLIDE 38

UFE Resources

  • Utilization-Focused Evaluation. 4th Edition. Michael Quinn Patton
  • Essentials of Utilization-Focused Evaluation. 1st Edition. Michael Quinn Patton
  • Utilization-Focused Checklist (google it!). Michael Quinn Patton

slide-39
SLIDE 39

Simple Rules of Design

  • 1. Organize design around ‘users’
  • 2. Be multi-disciplinary
  • 3. Try early and often
  • 4. Seek outside help
  • 5. Blend big and small projects
  • 6. Anticipate evolution

Tim Brown, IDEO

slide-40
SLIDE 40

Sample Design Process

slide-41
SLIDE 41

Design Resources

  • Stanford Design School: http://dschool.stanford.edu/
  • IDEO (Design Thinking): http://designthinking.ideo.com/
  • Human Centred Design Institute: http://hcdi.brunel.ac.uk/

slide-42
SLIDE 42

Steps & Tools

slide-43
SLIDE 43
slide-44
SLIDE 44

Step 1: Identify Primary Users

slide-45
SLIDE 45

Challenge & Response

Challenge

  • The traditional practice of identifying ‘stakeholders’ or ‘audiences’ for the evaluation results in too broad an assessment to be useful.
  • Organizing evaluations around the needs of primary users – those who seek evaluative feedback to make decisions that will affect the intervention – is more productive.
  • Evaluators can then “repackage” evaluation results to keep secondary users informed.

Aides

  • User Identification Matrix

slide-46
SLIDE 46

Primary User

Will (likely) use evaluation results to make a decision that will affect the intervention.

Secondary User

May use evaluation results but not in a way that influences the intervention

Users

slide-47
SLIDE 47

[User Identification Matrix: quadrants labelled Primary User and Secondary User]

slide-48
SLIDE 48

Pairs

  • 1. Brainstorm 2-4 possible evaluation ‘users’ for your initiative.
  • 2. Map them against the matrix.
slide-49
SLIDE 49

Step 2: Develop User Profile/Cases

slide-50
SLIDE 50

Challenge & Aides

Challenge

  • In order to develop an evaluation design that will be useful for the primary user, the evaluator needs to be clear on what questions the user wants to explore and their preferences for how this is done.

Aides

  • User Profile Worksheet
  • User Profile Card
  • Use Cases
slide-51
SLIDE 51
slide-52
SLIDE 52
slide-53
SLIDE 53
slide-54
SLIDE 54

Pairs

Find a partner. Interview each other to complete a profile for at least one primary evaluation user in your work.

slide-55
SLIDE 55

Step 3: Organize & Prioritize Use Cases

slide-56
SLIDE 56

Challenge & Aides

Challenge

  • Users will typically diverge in their questions, uses and preferences.
  • Evaluators need to identify areas where these converge and diverge in order to surface the implications for evaluation design.

Aides

  • User Profile Worksheet
  • User Profile Card
  • Use Cases
slide-57
SLIDE 57

What are patterns and differences across user profiles?

slide-58
SLIDE 58

Possible Outcomes

  • One design fits all.
  • A common platform with a number of variations.
  • Several clusters of designs.
  • Each user requires a unique design.

slide-59
SLIDE 59

Example: Community School Wrap Around Model

  • Stewardship Stream: Collaborative Leadership Team (Developmental)
    What is our model? What parts are common? Which differ? How can we improve the overall model? How will we test/adapt the model in different contexts and scales? Is this model worth scaling? If so, what is the demand, readiness, and support for expanding it?

  • Accountability Stream: Three Current Funders (United Way, Foundation, City) (Formative & Summative)
    Are you doing what you said you were going to do? Are you making sufficient progress on issues and outcomes that we care about to continue financial support?

  • Delivery Stream: 12 Site-Based Partnerships (Developmental/Formative; TBD)
    What is our design and plan for this year? Next year? What can we improve or change?

slide-60
SLIDE 60

Step 4: Develop Scope of Work

slide-61
SLIDE 61

Challenge & Response

Challenge

  • Evaluators can focus the evaluation design work to come by capturing the proposed approach in a scope of work.
  • Getting all three to review, discuss, upgrade and sign off on the scope is an iterative process.

Aides

  • Evaluation Brief
  • Evaluation Scope of Work
  • Evaluation Statement of Work

slide-62
SLIDE 62

Resources

  • Preparing an Evaluation Scope of Work (USAID)
  • Evaluation Brief (Government of New South Wales)
  • Statement of Work (Better Evaluation)

slide-63
SLIDE 63

An Evaluation Scope of Work

  • identifies the intervention (i.e., activity, results program or strategy) to be evaluated
  • provides a brief background on implementation
  • identifies existing performance information sources
  • states the purpose, audience and use of the evaluation
  • clarifies the evaluation questions
  • identifies the evaluation method to answer the questions
  • discusses evaluation team composition and participation of customers and partners
  • covers procedures such as schedule and logistics
  • clarifies requirements for reporting and dissemination
  • includes a budget

SEE USAID: Preparing an Evaluation Scope of Work

slide-64
SLIDE 64

Step 5: Test Evaluation Design

slide-65
SLIDE 65

Challenge & Aides

Challenge

  • It’s important to develop and test prototypes of the evaluation process and findings before fuller implementation of the design.
  • These small tests may reveal important issues in primary users’ commitment to findings, or challenges in the evaluation design.

Aides

  • Walk-throughs/Storyboard
  • Data Simulation/Rehearsal
  • Pre-Mortem
slide-66
SLIDE 66

Prototype

What is a prototype?

  • A prototype is an early sample or model built to test a concept or process, or to act as a thing to be replicated or learned from.

Why prototype?

  • Inexpensive
  • Fast
  • Low Risk
  • Learning Rich

Outcomes

  • Insight into what may or may not work, and user commitment to using evaluation findings.
  • Decision to drop, upgrade, or pivot on the original idea and/or design.

slide-67
SLIDE 67

Evaluation Prototype Techniques

  • Walk-through/Storyboard
  • Data Simulation
  • Pre-Mortem

slide-68
SLIDE 68

Example: Social Return on Investment for Gang Prevention Project

slide-69
SLIDE 69

Prototyping Resources

  • NESTA (UK Innovation Charity): http://www.nesta.org.uk/
  • Human Centred Design: http://designthinking.ideo.com/
  • Service Design Tools: http://www.servicedesigntools.org/

slide-70
SLIDE 70

Share an example of when you feel your organization ‘should’ have used prototyping to develop an evaluation process. What evaluation method (or part of the evaluation process) would you like to prototype when you return home?

slide-71
SLIDE 71

Evaluation Resources

  • BetterEvaluation.org
  • Utilization-Focused Evaluation (Michael Quinn Patton)
  • Evaluation Associations

slide-72
SLIDE 72

The Design Box

Conversation #4

slide-73
SLIDE 73

Examples

  • Toronto Region Immigrant Employment Council
  • Native Counselling Services of Alberta
  • The Energy Futures Lab & Circular Economy Lab

slide-74
SLIDE 74

The Design Box

  • User Profiles (e.g., purpose, questions, preferences)
  • Time & Resource Constraints
  • Evaluation Standards
  • Evaluation Expertise & Experience

slide-75
SLIDE 75

#3

Social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, but as primary users of evaluation, they must productively participate in all the phases of an evaluation.
slide-76
SLIDE 76

An Evaluative Culture

An organization with a strong evaluative culture:

  • engages in self-reflection and self-examination:
      • deliberately seeks evidence on what it is achieving, such as through monitoring and evaluation,
      • uses results information to challenge and support what it is doing, and
      • values candor, challenge and genuine dialogue;
  • embraces evidence-based learning:
      • makes time to learn in a structured fashion,
      • learns from mistakes and weak performance, and
      • encourages knowledge sharing;
  • encourages experimentation and change:
      • supports deliberate risk taking, and
      • seeks out new ways of doing business.
slide-77
SLIDE 77

Appropriate Roles for Evaluation Design

SCOPING (e.g., confirming the intervention, clarifying purpose and audience, initial questions and constraints such as timelines and budget)
  • Social Innovator: “Flush out” the initial elements of the scope of work.
  • Evaluator: Work with social innovators to ‘flesh out’ the details.

DESIGN (e.g., methods, metrics, logistics)
  • Social Innovator: Provide input on the preferred methods and metrics; be willing to ‘test’ things; adjust well-developed practices.
  • Evaluator: Lead on all aspects of design, drawing on experience and adhering to principles and standards.

IMPLEMENTATION (e.g., gathering, analyzing, summarizing data)
  • Social Innovator: Depends on the evaluation design.
  • Evaluator: Depends on the evaluation design.

SENSE-MAKING & USE (e.g., interpreting, drawing conclusions, making judgements)
  • Social Innovator: Commit to taking the findings seriously and embedding them in decision-making.
  • Evaluator: Facilitate the process wherever and however possible.

slide-78
SLIDE 78

Troika Consulting

Conversation #5

slide-79
SLIDE 79

Bring your challenges.

slide-80
SLIDE 80

Troika Consulting

  • 1. The client has 3 minutes to share their evaluation challenge.
  • 2. The group has 5 minutes to ask questions about the challenge and question.
  • 3. The client ‘turns his/her back’ and listens (no speaking!).
  • 4. The group has 5 minutes to discuss the client’s question.
  • 5. The client turns back to the consultants and has 2 minutes to share reflections on the consultants’ discussion.

slide-81
SLIDE 81

Getting [people] to taste evaluation may, indeed, be a long shot in many situations. Many people have been made sick by past concoctions called evaluation. Each evaluation being a blend of unique ingredients, no standardized recipe can ensure the outcome. We have only principles, premises and utilization focused processes to guide us, and we have much to learn. But the potential benefits merit the efforts and risks involved. At stake is the vision of an Experimenting Society. The only way to find out is to try it – and evaluate the results.

Michael Quinn Patton. Utilization-Focused Evaluation, p. 385.

slide-82
SLIDE 82
  • 1. What is your level of support for using a design approach to developing evaluations?
  • 2. What questions emerge for you about how you might employ this approach in your work?
  • 3. What is one implication to emerge from this discussion for your own evaluation work?

Scale: 1 = Hate it, 2 = Don’t like it, 3 = Unsure, 4 = Like it, 5 = Love it