Developing Evaluations That Are Used
Wellington | New Zealand February 17, 2017
9540-145 Street Edmonton, Alberta, CA T5N 2W8 P: 780-451-8984 F: 780-447-4246 E: Mark@here2there.ca
Six Conversations
Conversation #1
What questions are you bringing to today’s session?
Opening Comments
1960s
… policy”.
… evaluation on program decisions has been “with few exceptions, nil”.
… article, “Evaluation: Who Needs It? Who Cares?”
The 21st Century
… made regular use of social science research and evaluation.
“Why Measure: Nonprofits use metrics to show that they are efficient, but what if donors don’t care?”
… rigorous evaluation studies of the thirty-four billion dollars spent on foreign assistance that year.
… shape the decisions of their intended users.
When decisions are made using evaluative judgments, evaluation results are combined with other considerations to support decision making. Politics, values, competing priorities, the state of knowledge about a problem, the scope of the problem, the history of the program, the availability of resources, public support, and managerial competence all come into play in program and policy decision processes. Evaluation findings, if used at all, are usually one piece of the decision making pie, not the whole pie. Rhetoric about "data-based decision making" and "evidence-based practice" can give the impression that one simply looks at evaluation results and a straightforward decision follows.
Erase that image from your mind. That is seldom, if ever, the case.
Instrumental Use: informs decisions
Process Use: encourages evaluative thinking
Conceptual Use: offers new insights
Non-Use: findings are ignored
Mis-Use: findings are selectively or improperly used
In Pairs: Each person shares an example of each type of use.
Conversation #2
Key Points
#1 We are all capable of evaluative thinking, though formal evaluation is a living multi-disciplinary art & science.
#2 There are no universal recipes for methods and indicators in evaluation: there are only steps, principles & standards and emerging practices that help create them.
#3 Social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, but as primary users of evaluation, they must productively participate in all the key steps of an evaluation.
#1 We are all capable of evaluative thinking, though formal evaluation is a living multi-disciplinary art & science.
In the beginning, God created the heaven and the earth. And God saw everything that he made. “Behold,” God said, “it is very good.” And the evening and the morning were the sixth day. And on the seventh day, God rested from all His work. His archangel came then unto Him asking, “God, how do you know that what you have created is ‘very good’? What are your criteria? On what data do you base your judgment? Just exactly what results were you expecting to attain? And, aren’t you a little close to the situation to make a fair and unbiased evaluation?” God thought about these questions all that day and His rest was greatly disturbed. On the eighth day God said, “Lucifer, go to hell.” Thus was evaluation born in a blaze of glory.
From Halcolm’s The Real Story of Paradise Lost.
“Blink” (aka Rapid Cognition)
“Think” (aka Evaluative Thinking)
Key Tasks
unit of analysis
questions
The Core Challenge
… change?
… efforts contributed to the change?
evaluation culture
The Evaluation Field
A ‘contested’ multi-disciplinary practice.
… e.g., monitoring, developmental, accountability.
… principles, standards and competencies, with variations.
Social Innovators
… evaluation for multiple purposes.
… resources for evaluation.
… unclear about what is reasonable to expect re: the role and quality of evaluation.
… credentialization.
Better Evaluation website
Utilization-Focused Evaluation book
American Evaluation Association
Canadian Evaluation Society
1. Utility – does the evaluation generate useful data for the users?
2. Feasibility – is the evaluation design efficient and doable within the financial, technical and timing constraints?
3. Propriety – is the evaluation done in a way that is proper, fair, legal, right and just?
4. Accuracy – to what extent are the evaluation representations, propositions, and findings, especially those that support interpretations and judgments about quality, truthful and dependable?
5. Accountability – to what extent are evaluators and evaluation users focusing on documenting and improving the evaluation processes?
Aotearoa New Zealand
… relationships
… and trustworthy results
… usefulness.
#2 There are no universal recipes for methods and indicators in evaluation: there are only steps, principles & standards and emerging practices that help create them.
The Challenge
Forget Standardized Recipes. Think “Chopped Canada”.
Examples
Toronto Region Immigrant Employment Council
Native Counseling Services of Alberta
The Energy Futures Lab & Circular Economy Lab
The Design Box
User Purpose, Questions, Preferences
Time & Resource Constraints
Evaluation Standards
Evaluation Expertise & Experience
From One of the World’s Best
[Evaluation] isn’t some particular method of recipe-like steps to follow. It doesn’t offer a template of standard …
… how to bring data to bear on what’s unfolding so as to guide and develop the …
… timing of the inquiry will depend on the situation, context, people involved, and the fundamental principle of doing what makes sense for program development.
Michael Quinn Patton, Developmental Evaluation (2010), pp. 75–76.
#3 Social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, but as primary users of evaluation, they must productively participate in all the phases of an evaluation.
An organization with a strong evaluative culture:
… such as through monitoring and evaluation,
… what it is doing, and …
Phase | Social Innovator | Evaluator
SCOPING (e.g., confirming the intervention, clarifying purpose and audience, initial questions, and constraints such as timelines and budget) | “Flush out” the initial elements of the scope of work. | Work with social innovators to “flesh out” the details.
DESIGN (e.g., methods, metrics, logistics) | Provide input on preferred methods and metrics; be willing to “test” things; adjust well-developed practices. | Lead on all aspects of design, drawing on experience and adhering to principles and standards.
IMPLEMENTATION (e.g., gathering, analyzing, summarizing data) | Depends on the evaluation design. | Depends on the evaluation design.
SENSE-MAKING & USE (e.g., interpreting, drawing conclusions, making judgements) | Commit to taking findings seriously and embedding them in decision-making. | Facilitate the process wherever and however possible.
Conversation #3
Small Group Activity
How would you evaluate this micro-lending program in New York City?
Your initial response to someone’s sweeping request to ‘evaluate’ something.
Utilization-Focused Evaluation
Design Thinking
Utilization-focused evaluation (UFE) is evaluation done for and with specific intended primary users for specific, intended uses.
UFE begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration for how everything that is done, from beginning to end, will affect use.
Use concerns how real people in the real world apply evaluation findings and experience the evaluation process.
Michael Quinn Patton
Michael Quinn Patton, Essentials of Utilization-Focused Evaluation (1st ed.)
Utilization-Focused Evaluation Checklist (Google it!), Michael Quinn Patton
‘users’
projects
Tim Brown, IDEO
Stanford Design School http://dschool.stanford.edu/
IDEO (Design Thinking) http://designthinking.ideo.com/
Human Centred Design Institute http://hcdi.brunel.ac.uk/
Step 1: Identify Primary Users
Challenge
… ‘stakeholders’ or ‘audiences’ for the evaluation results in too broad an assessment to be useful.
Focusing on the needs of primary users (those who seek evaluative feedback to make decisions that will affect the intervention) is more productive.
… evaluation results to keep secondary users informed.
Aids
Identification Matrix
Primary User
Will (likely) use evaluation results to make a decision that will affect the intervention.
Secondary User
May use evaluation results, but not in a way that influences the intervention.
Identify the ‘users’ for your initiative.
Step 2: Develop User Profiles/Cases
Challenge
To create an evaluation design that will be useful for the primary user, the evaluator needs to be clear on what questions the user wants to explore and their preferences for how this is done.
Aids
Pairs
Find a partner. Interview each other to complete a profile for at least one primary evaluation user in your work.
Step 3: Organize & Prioritize Use Cases
Challenge
Primary users often diverge in their questions, uses and preferences.
Identify areas where these converge and diverge in order to surface the implications for evaluation design.
Aids
What are the patterns and differences across user profiles?
One design fits all.
A common platform with a number of …
Several clusters of designs.
Each user requires a unique design.
Example: Community School Wrap-Around Model

Stream | Evaluation Type | Key Questions
Stewardship Stream: Collaborative Leadership Team | Developmental | What is our model? What parts are common? What parts differ? How can we improve the …? How will we test/adapt the model in different contexts and scales? Is this model worth scaling? If so, what is the demand, readiness, and support for expanding it?
Delivery Stream: 12 Site-Based Partnerships | Developmental/Formative, TBD | What is our design and plan for this year? Next year? What can we improve or change?
Accountability Stream: Three Current Funders (United Way, Foundation, City) | Formative/Summative | Are you doing what you said you were going to do? Are you making sufficient progress on issues and outcomes that we care about to continue financial support?
Step 4: Develop Scope of Work
Challenge
Capture the proposed approach in a scope of work to guide the evaluation design work to come.
Reviewing, discussing, upgrading and signing off on the scope is an iterative process.
Aids
Statement of Work
Preparing an Evaluation Scope of Work (USAID)
Evaluation Brief (Government of New South Wales)
Statement of Work (Better Evaluation)
The activity, results package, program or strategy to be evaluated
Background on implementation
Existing performance information sources
Purpose of the evaluation
Methods to answer the questions
Team composition and participation of customers and partners
Schedule and logistics
Reporting and dissemination
See USAID: Preparing an Evaluation Scope of Work.
Step 5: Test Evaluation Design
Challenge
Develop prototypes of the evaluation process and findings before fuller implementation of the design.
Prototyping can surface important issues in primary users’ commitment to findings, or challenges in the evaluation design.
Aids
Walk-throughs/Storyboards
Simulation/Rehearsal
What is a prototype?
A sample or model built to test a concept or process, or to act as a thing to be replicated.
Why prototype?
Outcomes
… work; user commitment to using evaluation findings.
… the original idea and/or design.
Walk-through/Storyboard
Data Simulation
Pre-Mortem
NESTA (UK Innovation Charity) http://www.nesta.org.uk/
Human Centred Design http://designthinking.ideo.com/
Service Design Tools http://www.servicedesigntools.org/
Share an example of when you feel you have used prototyping to develop an evaluation process.
What evaluation method (or part of the evaluation process) would you like to prototype when you return home?
Evaluation Resources
BetterEvaluation.org
Utilization-Focused Evaluation (Michael Quinn Patton)
Evaluation Associations
Conversation #4
Examples
Toronto Region Immigrant Employment Council
Native Counseling Services of Alberta
The Energy Futures Lab & Circular Economy Lab
The Design Box
User Profiles (e.g., purpose, questions, preferences)
Time & Resource Constraints
Evaluation Standards
Evaluation Expertise & Experience
Conversation #5
… challenge.
… challenge and question:
… to share reflections on the consultants’ discussion.
Getting [people] to taste evaluation may, indeed, be a long shot in many situations. Many people have been made sick by past concoctions called evaluation. Each evaluation being a blend of unique ingredients, no standardized recipe can ensure the outcome. We have only principles, premises and utilization focused processes to guide us, and we have much to learn. But the potential benefits merit the efforts and risks involved. At stake is the vision of an Experimenting Society. The only way to find out is to try it – and evaluate the results.
Michael Quinn Patton, Utilization-Focused Evaluation, p. 385.
… for using a design approach to developing evaluations?
… about how you might employ this approach in your work?
… emerge from this discussion for your own evaluation work?
1 = Hate it | 2 = Don’t like it | 3 = Unsure | 4 = Like it | 5 = Love it