

  1. Regional Impact Network 17th July 2019

  2. Agenda
• Welcome & introductions
• Embedding evaluation into practice – training
• Break
• Embedding evaluation into practice – action planning
• Creative ways to become a learning organisation – London Bubble Theatre
• What’s coming up?
• What do we want to see in the future?
• Free space/networking

  3. Embedding evaluation into practice

  4. Today we’ll cover
• The characteristics of a good evaluation culture
• Evaluation ecosystems and how they remain strong and balanced
• Reflective practice and closing feedback loops to embed a culture of evaluation into your organisation
Evaluation culture: the ideas, customs and behaviours of your organisation and those in it, pertaining to why and how you gather and use evaluation data.
Evaluation ecosystem: the different evaluation roles, often carried out by different people, that interact with each other (or not) to enable evaluation to happen effectively (or not).

  5. Organisational engagement with evaluation

  6. Engagement
http://www.camman-evaluation.com/blog/2015/11/25/better-program-evaluation-through-frontline-staff-engagement

  7. What’s at stake?
Practitioners are often hesitant to take part in an evaluation because they’re worried about high stakes. Before starting an evaluation, be able to answer these questions:
• Why are you doing an evaluation?
• What do you hope the results will tell you?
• What if you’re disappointed by the results?

  8. Common roles in evaluation
Planner: Setting evaluative goals and priorities, identifying and selecting appropriate evaluation methodologies.
Participant/data contributor: Participating in surveys, interviews, and focus groups; generating program data; facilitating access to other stakeholder groups for the purpose of data collection (e.g., program recipients).
User/analyst: Being an intended user of the evaluation findings, reviewing deliverables and recommendations and making decisions about what steps to take based on this information.
Implementer: Carrying out recommendations and any changes in operations or service delivery as a result of evaluation findings.
http://www.camman-evaluation.com/blog/2015/11/25/better-program-evaluation-through-frontline-staff-engagement

  9. A broken evaluation ecosystem
[Diagram: Planner, Participant/data contributor, User/analyst and Implementer working in isolation from one another]
• Siloed working
• Evaluation plans aren’t feasible or relevant
• Data collected is sparse and not of high quality
• Recommendations aren’t informed by good evidence
• Little incentive to make changes to service delivery
• Confusion
• Frustration

  10. A balanced evaluation ecosystem
[Diagram: Planner, Participant/data contributor, User/analyst and Implementer linked in a cycle around a culture of continuous learning and improvement]

  11. Your evaluation ecosystem
Write down… in your organisation:
• Who in your organisation plans your evaluation?
• Who are your evaluation participants? Who are the people you collect data from?
• Who sees and uses the evaluation data? Who analyses and interprets it?
• Whose responsibility is it to implement evaluation findings and act on recommendations?
How do these people talk to each other?
• Does the planner consult the participants and user when choosing an evaluation methodology?
• How does the user share the evaluation findings and recommendations with the implementer?
• Are evaluation findings shared with the participants?

  12. How balanced is your evaluation ecosystem?
[Diagram: the balanced ecosystem of Planner, Participant/data contributor, User/analyst and Implementer around a culture of continuous learning and improvement]
• Are there communication links in your evaluation ecosystem that could be improved? Which links require improvement?
• Are there individuals who feel like they contribute more than they gain from evaluation? Who are they?
• Who feels like they gain the most from evaluation?

  13. Reflective practice

  14. Gibbs Cycle of Reflection (1988)
• Description: What happened?
• Feelings: What were you thinking and feeling?
• Evaluation: What was good and bad about the experience?
• Analysis: What sense can you make out of the situation?
• Conclusion: What else could you have done?
• Action Plan: If it arose again, what would you do?

  15. What forms the basis of your organisation’s reflective practice?
• Do you reflect on what went well, and what enabled that to happen?
• Do you reflect on incidents and how to avoid them in the future?
Are actions that come out of reflective practice:
• Recorded?
• Re-visited?

  16. Using evaluation results to form the basis of reflective practice
[Gibbs cycle as above, with engagement data, quality data and feedback data feeding the Description stage: What happened?]

  17. You can also apply this cycle to your evaluation approach
[Gibbs cycle as above, ending in an action plan: What would you do the same and/or differently next time?]
Reflect on…
• Your experience gathering data from young people
• Young people’s reactions to being asked for data
• Your experience analysing evaluation data
• Your experience reading evaluation results
• Others’ responses to your evaluation data

  18. Feedback Loops

  19. Steps of a closed feedback loop
Closing the loop with young people shows them you value their feedback and will make them more forthcoming with it.
Also close the feedback loop with practitioners: what is changing for the better as a result of the evaluation? How could the evaluation process itself be improved?
https://feedbacklabs.org/feedback-loops-steps/

  20. Getting feedback from your team about evaluation
• Do you find using the [name a specific tool] a good use of your time?
• What barriers are you facing in terms of evaluation?
• Do you feel the evaluation approach is aligned with youth work values?

  21. Ways to close the feedback loop
• ‘You said | We did’ boards
• Internal newsletter
• Feedback and actions as a team meeting agenda item

  22. Top tips

  23. Impact culture: top tips
1. Agree your evaluation plan
2. Tell everyone about your evaluation plan
3. Get buy-in by involving people
4. Demonstrate the value of evaluation
5. Define roles and responsibilities
6. Recruit and train people
7. Think about incentives
8. Make data collection and use as easy as possible
9. Set up systems and processes for learning
10. Tolerate mistakes and failure:
• Mistakes in data collection → stick with it! You will get better.
• Data that shows no change or negative change → have a plan for dealing with unexpected results, and change for the better.

  24. Managing risk
• Use measured language when you’re talking about results: call it a Learning Report rather than an Evaluation or an Impact Report.
• Set SMART targets for your evaluation: you may want to aim for some small but quick wins at the start, then build from that as your team gets more comfortable with collecting and using evidence.
• Have evaluation written into people’s job descriptions and discuss it in line management sessions.

  25. BREAK

  26. Action planning

  27. Making it Stick – Evaluation Support Scotland
A resource for embedding evaluation into your organisation that includes:
• A diagnostic tool
• Suggestions for embedding evaluation across the organisation
• An action planning guide
http://www.evaluationsupportscotland.org.uk/media/uploads/resources/final_making_it_stick.pdf

  28. Action Plan
How would you currently describe your evaluation culture? How would you like to describe your evaluation culture?
How you’ll get there: strengthen your evaluation ecosystem. For each role – Planner, Participant/data contributor, User/analyst, Implementer – note:
• What do they already do well?
• What do they need to improve?
• What can be done about it?
• Timeline

  29. Action Plan
Your ecosystem:
• Are there communication links in your evaluation ecosystem that could be improved?
• Are there individuals who feel like they contribute more than they gain from evaluation?
• Who feels like they gain the most from the evaluation?
What could be improved?
• Confidence?
• Incentives?
• Guidance and support?
• Time?
What can be done about it?
• Training?
• Regular reflective practice sessions?
• Line management meetings covering evaluation?
• Closing feedback loops?
• Investment in tools and systems?

  30. Any questions?

  31. Adam Annand – London Bubble Theatre

  32. Question Dump
• How do we move from evidence for advocacy to evidence to improve?
• What do we say that we are going to do?
• Is the thing that we are saying we want to do important to our participants?
• Are we clear about what we say we are doing?
• What about what we don’t do?
• How do we know other positive things that might be happening, that might be different to the thing we say we are going to do?
• How do we know if there are any negative effects to what we are doing?
• How can this new knowledge be used to improve what we do?
