Regional Impact Network – 17th July 2019
Agenda
- Welcome & introductions
- Embedding evaluation into practice – training
- Break
- Embedding evaluation into practice – action planning
- Creative ways to become a learning organisation – London Bubble Theatre
- What’s coming up? What do we want to see in the future?
- Free space/networking
Embedding evaluation into practice
Today we’ll cover:
- The characteristics of a good evaluation culture
- Evaluation ecosystems and how they remain strong and balanced
- Reflective practice and closing feedback loops to embed a culture of evaluation into your organisation
Evaluation culture: the ideas, customs and behaviours of your organisation and those in it, pertaining to why and how you gather and use evaluation data.

Evaluation ecosystem: the different evaluation roles, often carried out by different people, that interact with each other (or not) to enable evaluation to happen effectively (or not).
Organisational engagement with evaluation
http://www.camman-evaluation.com/blog/2015/11/25/better-program-evaluation-through-frontline-staff-engagement
Before starting an evaluation, be able to answer these questions:
- Why are you doing an evaluation?
- What do you hope your results tell you?
- What if you’re disappointed by the results?
Practitioners are often hesitant to take part in an evaluation because they’re worried about high stakes.
What’s at stake?
- Planner: Setting evaluative goals and priorities, identifying and selecting appropriate evaluation methodologies.
- Participant/data contributor: Participating in surveys, interviews, and focus groups; generating program data; facilitating access to other stakeholder groups for the purpose of data collection (e.g., program recipients).
- User/analyst: Being an intended user of the evaluation findings, reviewing deliverables and recommendations and making decisions about what steps to take based on this information.
- Implementer: Carrying out recommendations and any changes in operations or service delivery as a result of evaluation findings.
Common roles in evaluation
http://www.camman-evaluation.com/blog/2015/11/25/better-program-evaluation-through-frontline-staff-engagement
A broken evaluation ecosystem
[Diagram: Planner · Participant/data contributor · User/analyst · Implementer]
- Siloed working
- Evaluation plans aren’t feasible or relevant
- Data collected is sparse and not of high quality
- Recommendations aren’t informed by good evidence
- Little incentive to make changes to service delivery
- Confusion
- Frustration
A balanced evaluation ecosystem
[Diagram: Planner · Participant/data contributor · User/analyst · Implementer, linked by a culture of continuous learning and improvement]
Write down… in your organisation:
- Who in your organisation plans your evaluation?
- Who are your evaluation participants? Who are the people you collect data from?
- Who sees and uses the evaluation data? Who analyses and interprets it?
- Whose responsibility is it to implement evaluation findings and act on recommendations?
Your evaluation ecosystem
How do these people talk to each other?
- Does the planner consult the participants and user when choosing an evaluation methodology?
- How does the user share the evaluation findings and recommendations with the implementer?
- Are evaluation findings shared with the participants?
How balanced is your evaluation ecosystem?
[Diagram: Planner · Participant/data contributor · User/analyst · Implementer, linked by a culture of continuous learning and improvement]
- Are there communication links in your evaluation ecosystem that could be improved? Which links require improvement?
- Are there individuals who feel like they contribute more than they gain from evaluation? Who are they?
- Who feels like they gain the most from evaluation?
Reflective practice
- Description: What happened?
- Feelings: What were you thinking and feeling?
- Evaluation: What was good and bad about the experience?
- Analysis: What sense can you make out of the situation?
- Conclusion: What else could you have done?
- Action plan: If it arose again, what would you do?
Gibbs Cycle of Reflection (1988)
What forms the basis of your organization’s reflective practice?
- Do you reflect on what went well, and what enabled that to happen?
- Do you reflect on incidents and how to avoid them in the future?
Are actions that come out of reflective practice:
- Recorded?
- Revisited?
- Engagement data
- Quality data
- Feedback data
Using evaluation results to form the basis of reflective practice
You can also apply this cycle to your evaluation approach
- Description: What happened?
- Feelings: What were you thinking and feeling?
- Evaluation: What was good and bad about the experience?
- Analysis: What sense can you make out of the situation?
- Conclusion: What else could you have done?
- Action plan: What would you do the same and/or differently next time?
Reflect on…
- Your experience gathering data from young people
- Young people’s reactions to being asked for data
- Your experience analysing evaluation data
- Your experience reading evaluation results
- Others’ responses to your evaluation data
Feedback Loops
Steps of a closed feedback loop
https://feedbacklabs.org/feedback-loops-steps/

Closing the loop with young people shows them you value their feedback and will make them more forthcoming with it. Also close the feedback loop with practitioners: what is changing for the better as a result of the evaluation? How could the evaluation process itself be changed to lead to better evaluation?
- Do you find using the [name a specific tool] a good use of your time?
- What barriers are you facing in terms of evaluation?
- Do you feel the evaluation approach is aligned with youth work values?
Getting feedback from your team about evaluation
- You said | We did boards
- Internal newsletter
- Feedback and actions as a team meeting agenda item
Ways to close the feedback loop
Impact culture: top tips
1. Agree your evaluation plan
2. Tell everyone about your evaluation plan
3. Get buy-in by involving people
4. Demonstrate the value of evaluation
5. Define roles and responsibilities
6. Recruit and train people
7. Think about incentives
8. Make data collection and use as easy as possible
9. Set up systems and processes for learning
10. Tolerate mistakes and failure
- Mistakes in data collection: stick with it! You will get better.
- Data that shows no change or negative change: have a plan for dealing with unexpected results, and change for the better.
- Use measured language when you’re talking about results – call it a Learning Report rather than an Evaluation or Impact Report
- Set SMART targets for your evaluation – you may want to aim for some small but quick wins at the start, and then build from that as your team gets more comfortable with collecting and using evidence
- Have evaluation written into people’s job descriptions and discuss it in line management sessions
Managing risk
BREAK
Action planning
http://www.evaluationsupportscotland.org.uk/media/uploads/resources/final_making_it_stick.pdf
Making it Stick – Evaluation Support Scotland
Resources for embedding evaluation into your organisation, including:
- A diagnostic tool
- Suggestions for embedding evaluation across the organisation
- An action planning guide
Action Plan
How would you currently describe your evaluation culture? How would you like to describe your evaluation culture? How you’ll get there: strengthen your evaluation ecosystem.

For each role – Planner, Participant/data contributor, User/analyst, Implementer – and for your culture of learning, record:
- What do they already do well?
- What do they need to improve?
- What can be done about it?
- Timeline
Action Plan
Your ecosystem:
- Are there communication links in your evaluation ecosystem that could be improved?
- Are there individuals who feel like they contribute more than they gain from evaluation?
- Who feels like they gain the most from the evaluation?
What could be improved?
- Confidence?
- Incentives?
- Guidance and support?
- Time?
What can be done about it?
- Training?
- Regular reflective practice sessions?
- Line management meetings covering evaluation?
- Closing feedback loops?
- Investment in tools and systems?
Any questions?
Adam Annand
London Bubble Theatre
Question Dump
- How do we move from evidence for advocacy to evidence to improve?
- What do we say that we are going to do?
- Is the thing that we are saying we want to do important to our participants?
- Are we clear about what we say we are doing?
- What about what we don’t do?
- How do we know other positive things that might be happening, that might be different to the thing we say we are going to do?
- How do we know if there are any negative effects to what we are doing?
- How can this new knowledge be used to improve what we do?
This research programme
Learning About Culture
- Aim 1: Build a stronger evidence base for cultural learning
- Aim 2: Improve the use of evidence in cultural learning
Some practice!
What are the benefits?
UEL comparison group study 2015/16. Statistically significant improvements in:
- Understanding spoken language
- Storytelling and narrative
- Social interaction
We don’t make a statistically significant impact on:
- Speech (pronunciation, diction, sound production)
- Sentence building
- Vocabulary
Dr J. Barnes – Arts and Health Study 2015:
- ‘… increased confidence results in greater motivation, attention, concentration and friendliness… the development of these psychological dispositions appears closely related to increased capacity to learn, speak out, communicate and listen.’

School reporting 2016/2017:
- 88% of referred children showed improvement in their learning, speaking and listening.
- 90% of referred children showed improvement in their emotional behaviour and conduct behaviour.
Young Theatre Makers
- SWEMWBS (Short Warwick–Edinburgh Mental Wellbeing Scale)
- Observation Framework
- Life Effectiveness Questionnaire
- Personal journey
- Letter to future participants
What’s coming up? What do we want to see?
Next meetings
- 17th October 2019 – 2pm-4:30pm
- 8th January 2020 – 1pm-5pm
- All at Pitfield Street office
- Content? Ideas? Themes?