

SLIDE 1

Regional Impact Network

17th July 2019

SLIDE 2

Agenda

  • Welcome & introductions
  • Embedding evaluation into practice – training
  • Break
  • Embedding evaluation into practice – action planning
  • Creative ways to become a learning organisation – London Bubble Theatre
  • What’s coming up? What do we want to see in the future?
  • Free space/networking

SLIDE 3

Embedding evaluation into practice

SLIDE 4
  • The characteristics of a good evaluation culture
  • Evaluation ecosystems and how they remain strong and balanced
  • Reflective practice and closing feedback loops to embed a culture of evaluation into your organisation

Today we’ll cover

Evaluation culture: the ideas, customs and behaviours of your organisation and those in it, pertaining to why and how you gather and use evaluation data.

Evaluation ecosystem: the different evaluation roles, often carried out by different people, that interact with each other (or not) to enable evaluation to happen effectively (or not).

SLIDE 5

Organisational engagement with evaluation

SLIDE 6

Engagement

http://www.camman-evaluation.com/blog/2015/11/25/better-program-evaluation-through-frontline-staff-engagement

SLIDE 7

Before starting an evaluation, be able to answer these questions:

  • Why are you doing an evaluation?
  • What do you hope your results will tell you?
  • What if you’re disappointed by the results?

Practitioners are often hesitant to take part in an evaluation because they’re worried about high stakes.

What’s at stake?

SLIDE 8

  • Planner: setting evaluative goals and priorities, identifying and selecting appropriate evaluation methodologies.
  • Participant/data contributor: participating in surveys, interviews, and focus groups; generating program data; facilitating access to other stakeholder groups for the purpose of data collection (e.g., program recipients).
  • User/analyst: being an intended user of the evaluation findings, reviewing deliverables and recommendations and making decisions about what steps to take based on this information.
  • Implementer: carrying out recommendations and any changes in operations or service delivery as a result of evaluation findings.

Common roles in evaluation

http://www.camman-evaluation.com/blog/2015/11/25/better- program-evaluation-through-frontline-staff-engagement

SLIDE 9

A broken evaluation ecosystem

Planner Participant/ data contributor User/ analyst Implementer

  • Siloed working
  • Evaluation plans aren’t feasible or relevant
  • Data collected is sparse and not of high quality
  • Recommendations aren’t informed by good evidence
  • Little incentive to make changes to service delivery
  • Confusion
  • Frustration
SLIDE 10

A balanced evaluation ecosystem

culture of continuous learning and improvement

Planner Participant/ data contributor User/ analyst Implementer

SLIDE 11

Write down… in your organisation:

  • Who in your organisation plans your evaluation?
  • Who are your evaluation participants? Who are the people you collect data from?
  • Who sees and uses the evaluation data? Who analyses and interprets it?
  • Whose responsibility is it to implement evaluation findings and act on recommendations?

Your evaluation ecosystem

How do these people talk to each other?

  • Does the planner consult the participants and user when choosing an evaluation methodology?
  • How does the user share the evaluation findings and recommendations with the implementer?
  • Are evaluation findings shared with the participants?
SLIDE 12

How balanced is your evaluation ecosystem?

culture of continuous learning and improvement

Planner Participant/ data contributor User/ analyst Implementer

  • Are there communication links in your evaluation ecosystem that could be improved? Which links require improvement?
  • Are there individuals who feel like they contribute more than they gain from evaluation? Who are they?
  • Who feels like they gain the most from evaluation?

SLIDE 13

Reflective practice

SLIDE 14

  • Description: What happened?
  • Feelings: What were you thinking and feeling?
  • Evaluation: What was good and bad about the experience?
  • Analysis: What sense can you make out of the situation?
  • Conclusion: What else could you have done?
  • Action Plan: If it arose again, what would you do?

Gibbs Cycle of Reflection (1988)

SLIDE 15

What forms the basis of your organisation’s reflective practice?

  • Do you reflect on what went well, and what enabled that to happen?
  • Do you reflect on incidents and how to avoid them in the future?

Are actions that come out of reflective practice

  • Recorded?
  • Re-visited?
SLIDE 16

  • Engagement data
  • Quality data
  • Feedback data

Using evaluation results to form the basis of reflective practice

  • Description: What happened?
  • Feelings: What were you thinking and feeling?
  • Evaluation: What was good and bad about the experience?
  • Analysis: What sense can you make out of the situation?
  • Conclusion: What else could you have done?
  • Action Plan: If it arose again, what would you do?

SLIDE 17

You can also apply this cycle to your evaluation approach

  • Description: What happened?
  • Feelings: What were you thinking and feeling?
  • Evaluation: What was good and bad about the experience?
  • Analysis: What sense can you make out of the situation?
  • Conclusion: What else could you have done?
  • Action Plan: What would you do the same and/or differently next time?

Reflect on…

  • Your experience gathering data from young people
  • Young people’s reactions to being asked for data
  • Your experience analysing evaluation data
  • Your experience reading evaluation results
  • Others’ responses to your evaluation data

SLIDE 18

Feedback Loops

SLIDE 19

Steps of a closed feedback loop

https://feedbacklabs.org/feedback-loops-steps/

Closing the loop with young people shows them you value their feedback and will make them more forthcoming with it. Also close the feedback loop with practitioners: what is changing for the better as a result of the evaluation? How could the evaluation process itself be changed to lead to better evaluation?

SLIDE 20
  • Do you find using the [name a specific tool] a good use of your time?
  • What barriers are you facing in terms of evaluation?
  • Do you feel the evaluation approach is aligned with youth work values?

Getting feedback from your team about evaluation

SLIDE 21
  • You said | We did boards
  • Internal newsletter
  • Feedback and actions as a team meeting agenda item

Ways to close the feedback loop

SLIDE 22

Top tips

SLIDE 23

Impact culture: top tips

1. Agree your evaluation plan
2. Tell everyone about your evaluation plan
3. Get buy-in by involving people
4. Demonstrate the value of evaluation
5. Define roles and responsibilities
6. Recruit and train people
7. Think about incentives
8. Make data collection and use as easy as possible
9. Set up systems and processes for learning
10. Tolerate mistakes and failure

  • Mistakes in data collection → stick with it! You will get better.
  • Data that shows no change or negative change → have a plan for dealing with unexpected results, and change for the better.

SLIDE 24
  • Use measured language when you’re talking about results – call it a Learning Report rather than an Evaluation or an Impact Report
  • Set SMART targets for your evaluation – you may want to aim for some small but quick wins at the start and then build from that as your team gets more comfortable with collecting and using evidence
  • Have evaluation written into people’s job descriptions and discuss it in line management sessions

Managing risk

SLIDE 25

BREAK

SLIDE 26

Action planning

SLIDE 27

http://www.evaluationsupportscotland.org.uk/media/uploads/resources/final_making_it_stick.pdf

Making it Stick – Evaluation Support Scotland

Resources for embedding evaluation into your organisation, including:

  • A diagnostic tool
  • Suggestions for embedding evaluation across the organisation
  • An action planning guide
SLIDE 28

Action Plan

How would you currently describe your evaluation culture? How would you like to describe your evaluation culture?

How you’ll get there: strengthen your evaluation ecosystem. For each role (Planner, Participant/data contributor, User/analyst, Implementer of learning), consider:

  • What do they already do well?
  • What do they need to improve?
  • What can be done about it?
  • Timeline

SLIDE 29

Action Plan

Your ecosystem:

  • Are there communication links in your evaluation ecosystem that could be improved?
  • Are there individuals who feel like they contribute more than they gain from evaluation?
  • Who feels like they gain the most from the evaluation?

What could be improved?

  • Confidence?
  • Incentives?
  • Guidance and support?
  • Time?

What can be done about it?

  • Training?
  • Regular reflective practice sessions?
  • Line management meetings covering evaluation?
  • Closing feedback loops?
  • Investment in tools and systems?
SLIDE 30

Any questions?

SLIDE 31

Adam Annand

London Bubble Theatre

SLIDE 32

SLIDE 33

Question Dump

  • How do we move from evidence for advocacy to evidence to improve?
  • What do we say that we are going to do?
  • Is the thing that we are saying we want to do important to our participants?
  • Are we clear about what we say we are doing?
  • What about what we don’t do?
  • How do we know about other positive things that might be happening, that might be different to the thing we say we are going to do?

  • How do we know if there are any negative effects to what we are doing?
  • How can this new knowledge be used to improve what we do?
SLIDE 34

This research programme.

Learning About Culture

Aim 1: Build a stronger evidence base for cultural learning
Aim 2: Improve the use of evidence in cultural learning

SLIDE 35

Some practice!

SLIDE 36

SLIDE 37

What are the benefits?

SLIDE 38

SLIDE 39

SLIDE 40
SLIDE 41

What are the benefits?

UEL comparison group study 2015/16. Statistically significant improvements in:

  • Understanding spoken language
  • Storytelling and narrative
  • Social interaction

We don’t make a statistically significant impact on:

  • Speech (pronunciation, diction, sound production)
  • Sentence building
  • Vocabulary

Dr J. Barnes, Arts and Health Study 2015:

  • ‘… increased confidence results in greater motivation, attention, concentration and friendliness… the development of these psychological dispositions appears closely related to increased capacity to learn, speak out, communicate and listen.’

School reporting 2016/2017:

  • 88% of referred children showed improvement in their learning, speaking and listening.
  • 90% of referred children showed improvement in their emotional behaviour and conduct behaviour.

SLIDE 42
SLIDE 43

Young Theatre Makers

  • SWEMWBS
  • Observation Framework
  • Life Effectiveness Questionnaire
  • Personal journey
  • Letter to future participants
SLIDE 44
SLIDE 45

What’s coming up? What do we want to see?

SLIDE 46

Next meetings

  • 17th October 2019 – 2pm-4:30pm
  • 8th January 2020 – 1pm-5pm
  • All at Pitfield Street office
  • Content? Ideas? Themes?