

  1. Developing Evaluations That Are Used | Wellington, New Zealand | February 17, 2017

  2. 9540-145 Street, Edmonton, Alberta, CA T5N 2W8 | P: 780-451-8984 | F: 780-447-4246 | E: Mark@here2there.ca

  3. A Little Context

  4. Agenda: Six Conversations
      • Getting Grounded
      • Evaluation 101
      • User Oriented Design
      • The Design Box
      • Troika Consulting
      • Emerging Issues

  5. Getting Grounded Conversation #1

  6. TRIADS: What questions are you bringing to today’s session?

  7. The Challenge of Use: Opening Comments

  8. 1960s
      • 1969: “A dearth of examples of where evaluation informed policy.”
      • 1970: “A general failure” to use evaluation.
      • 1975: The influence of social science research and evaluation on program decisions has been, “with few exceptions, nil.”
      • 1976: A celebrated educational researcher published an article, “Evaluation: Who Needs It? Who Cares?”

  9. The 21st Century
      • 2003: Quebec researchers found few government officials made regular use of social science research and evaluation.
      • 2004: Two senior administrators and policy advisors wrote “Why Measure? Nonprofits use metrics to show that they are efficient, but what if donors don’t care?”
      • 2005: The International Evaluation Gap Working Group found few rigorous evaluation studies of the thirty-four billion dollars spent on foreign assistance that year.
      • 2006: Researchers estimated that ten percent of evaluations helped shape the decisions of their intended users.

  10. When decisions are made using evaluative judgments, evaluation results are combined with other considerations to support decision making. Politics, values, competing priorities, the state of knowledge about a problem, the scope of the problem, the history of the program, the availability of resources, public support, and managerial competence all come into play in program and policy decision processes. Evaluation findings, if used at all, are usually one piece of the decision-making pie, not the whole pie. Rhetoric about “data-based decision making” and “evidence-based practice” can give the impression that one simply looks at evaluation results and a straightforward decision follows. Erase that image from your mind. That is seldom, if ever, the case.

  11. Five Types of Use (ranked from GOOD to BAD)
      • Instrumental Use: informs decisions
      • Process Use: encourages evaluative thinking
      • Conceptual Use: offers new insights
      • Non-Use: findings are ignored
      • Mis-Use: findings are selectively or improperly used
      In pairs, share an example of each.

  12. Evaluation 101 Conversation #2

  13. Key Points
      1. We are all capable of evaluative thinking, though formal evaluation is a living multi-disciplinary art & science.
      2. There are no universal recipes for methods and indicators in evaluation: there are only steps, principles & standards and emerging practices that help create them.
      3. Though social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, as primary users of evaluation they must productively participate in all the key steps of an evaluation.

  14. #1 We are all capable of evaluative thinking, though formal evaluation is a living multi-disciplinary art & science.

  15. The Genesis of Evaluation: In the beginning, God created the heaven and the earth. And God saw everything that He had made. “Behold,” God said, “it is very good.” And the evening and the morning were the sixth day. And on the seventh day, God rested from all His work. His archangel came then unto Him asking, “God, how do you know that what you have created is ‘very good’? What are your criteria? On what data do you base your judgment? Just exactly what results were you expecting to attain? And aren’t you a little close to the situation to make a fair and unbiased evaluation?” God thought about these questions all that day, and His rest was greatly disturbed. On the eighth day God said, “Lucifer, go to hell.” Thus was evaluation born in a blaze of glory. From Halcolm’s The Real Story of Paradise Lost.

  16. Blink Versus Think: “Blink” (aka Rapid Cognition) versus “Think” (aka Evaluative Thinking)

  17. Some Important Parts of Evaluative Thinking
      Key Tasks:
      1. Clarifying intervention & unit of analysis
      2. Developing purpose & questions
      3. Developing a design
      4. Gathering, analyzing data
      5. Interpreting data
      6. Drawing conclusions
      7. Making recommendations
      8. Informing decisions
      The Core Challenge:
      • How will we ‘measure’ change?
      • How will we know if our efforts contributed to the change?
      • How will we ‘judge’ the outcomes?

  18. A Brief History of Evaluation
      • First 12,000 years: trial & error (e.g. brewing beer & making bread)
      • Renaissance: empiricism (e.g. sun around earth or vice versa)
      • Enlightenment: experimentalism (e.g. scurvy)
      • Industrial era: measurement (e.g. factories) & action learning (e.g. gangs)
      • Early 20th century: program evaluation (e.g. agriculture, education, propaganda)
      • 1960s: centralized experimentalism 2.0 (e.g. War on Poverty)
      • 1970s: diversity of approaches & paradigms; low levels of use
      • 1980s: shift to accountability purpose, downloading of responsibility
      • 1990s: bottom-up efforts (e.g. logic models), business ideas (e.g. social return on investment) and new managerialism
      • 2000s: renewed emphasis on complexity, social innovation, learning and evaluation culture
      • 2010s: sophisticated field, culture wars, unclear expectations

  19. Evaluation Today
      The Evaluation Field:
      • An evolving, diverse and ‘contested’ multi-disciplinary practice.
      • Focus on (at least) six purposes: e.g. monitoring, developmental, accountability.
      • Professionalization of the field.
      • Increasingly agreed-upon principles, standards and competencies (aka credentialization).
      Social Innovators:
      • Desire (and pressure) to use evaluation for multiple purposes.
      • Limited in-house expertise and/or resources for evaluation.
      • Funders and social innovators unclear about what is reasonable to expect re: role and quality of evaluation.
      • Low level of evaluation use, with variations.

  20. Evaluation Purposes

  21. There are steps! Utilization-Focused Evaluation (book), Better Evaluation (website), American Evaluation Society

  22. Evaluation Standards
      Canadian:
      1. Utility – does the evaluation generate useful data for the users?
      2. Feasibility – is the evaluation design efficient and doable within the financial, technical and timing constraints?
      3. Propriety – is the evaluation done in a way that is proper, fair, legal, right and just?
      4. Accuracy – to what extent are the evaluation representations, propositions, and findings, especially those that support interpretations and judgments about quality, truthful and dependable?
      5. Accountability – to what extent are evaluators and evaluation users focusing on documenting and improving the evaluation processes?
      Aotearoa New Zealand:
      • Respectful, meaningful relationships
      • Ethic of care
      • Responsive methodologies and trustworthy results
      • Competence and usefulness

  23. Evaluator Competencies

  24. #2 There are no universal recipes for methods and indicators in evaluation: there are only steps, principles & standards and emerging practices that help create them.

  25. The Challenge: Forget Standardized Recipes. Think “Chopped Canada.”

  26. Examples: Toronto Region Immigrant Employment Council; Native Counseling Services of Alberta; The Energy Futures Lab & Circular Economy Lab

  27. The Design Box
      • User Purpose, Questions, Preferences
      • Time & Resource Constraints
      • Evaluation Expertise & Experience
      • Evaluation Standards

  28. From One of the World’s Best: “[Evaluation] isn’t some particular method of recipe-like steps to follow. It doesn’t offer a template of standard questions. It’s a mindset of inquiry into how to bring data to bear on what’s unfolding so as to guide and develop the unfolding. What that means and the timing of the inquiry will depend on the situation, context, people involved, and the fundamental principle of doing what makes sense for program development.” Michael Quinn Patton, Developmental Evaluation, 2010, pp. 75-6.

  29. #3 Social innovators without experience, training and practice in evaluation are very unlikely to master evaluation design, but as primary users of evaluation, they must productively participate in all the phases of an evaluation.

  30. An Evaluative Culture
      An organization with a strong evaluative culture:
      • engages in self-reflection and self-examination:
        - deliberately seeks evidence on what it is achieving, such as through monitoring and evaluation,
        - uses results information to challenge and support what it is doing, and
        - values candor, challenge and genuine dialogue;
      • embraces evidence-based learning:
        - makes time to learn in a structured fashion,
        - learns from mistakes and weak performance, and
        - encourages knowledge sharing;
      • encourages experimentation and change:
        - supports deliberate risk taking, and
        - seeks out new ways of doing business.
