How to evaluate your project effectively – Dr. Kion Ahadi, Head of Evaluation (PowerPoint presentation)



SLIDE 1

How to evaluate your project effectively

  • Dr. Kion Ahadi – Head of Evaluation

6 October 2016

BGEN Training Event

Impact and Engagement: Going Forward with Evaluation

SLIDE 3

About HLF

Our strategic aims are to:

  • conserve the UK’s diverse heritage for present and future generations to experience and enjoy;
  • help more people, and a wider range of people, to take an active part in and make decisions about their heritage; and
  • help people to learn about their own and other people’s heritage.

SLIDE 4

  • Largest dedicated funder of heritage in the UK
  • Leading advocate for the value of heritage
  • £7.1bn awarded to over 40,000 projects since 1994
  • £430m to invest this year
  • Offices across the UK
  • Grants from £3,000
SLIDE 5

Grant Making

Open programmes

Heritage Grants

  • grants over £5m
  • grants £1m – £5m
  • grants £50,000 – £1m

Targeted programmes

  • Kick the Dust
  • Parks for People
  • Landscape Partnerships
  • Skills for the Future
  • Townscape Heritage Initiative
  • Repair Grants for Places of Worship
SLIDE 6

Cardiff Castle

SLIDE 7

SLIDE 8

West Wemyss

SLIDE 9

Lister Park boat house and lake

SLIDE 10

Why Evaluate?

Continuous evaluation helps us learn through:

  • Monitoring – is our strategy heading in the right direction?
  • Justifying – is our programme value for money?
  • Validating – are we making the right funding decisions?
  • Improving – can we improve if we change something?
  • Researching – adding to our body of knowledge.
SLIDE 11

Programme evaluation is a systematic and objective assessment of an ongoing or completed programme. The aim is to determine the relevance and level of achievement of programme objectives, effectiveness (outcomes evaluation), efficiency (process evaluation), impact and sustainability.

SLIDE 12

LOGIC MODEL

Inputs

The Heritage Grants programme offers funding from £100,000 to £5 million for heritage projects anywhere in the UK:

  • Grant funding
  • Grantees’ partnership funding
  • Grantees’ in-kind contributions (management, volunteers etc.)
  • HLF mentors

Activities

In FY 2015/16 there were 111 awards under this programme across:

  • Buildings and monuments
  • Community heritage
  • Cultures and memories
  • Industrial, maritime and transport
  • Land and natural heritage
  • Museums, libraries and archives

Projects do:

  • Restoration & renovation
  • Archaeology
  • Preserving rare species and habitats
  • Developing long-lost skills
  • Building

Outputs

Results achieved immediately after implementing an activity.

For heritage:

  • better managed
  • better condition
  • better interpreted and explained
  • identified/recorded

For people:

  • skills developed
  • learnt about heritage
  • changed their attitudes and/or behaviour
  • enjoyable experience
  • volunteered time

Outcomes

For communities:

  • environmental impacts will be reduced
  • a wider range of people will have engaged with heritage
  • the local area/community will be a better place to live, work or visit
  • the local economy will be boosted

Impacts

  • Direct employment
  • Purchases of goods and services
  • Indirect and induced effects
  • Total operational impacts
  • Visitor expenditures
  • Quality of Life
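A logic model like the one above can be held as plain data, which makes it easy to check that every stage of the chain has been filled in before a project starts. This is an illustrative sketch only: the stage names follow the standard inputs → activities → outputs → outcomes chain, and the entries are abbreviated from this slide.

```python
# Illustrative sketch: a logic model as plain Python data, with a helper
# that flags any empty stage of the chain. Entries are abbreviated from
# the Heritage Grants slide; the structure itself is an assumption.

logic_model = {
    "inputs": ["Grant funding", "Partnership funding",
               "In-kind contributions", "HLF mentors"],
    "activities": ["Restoration & renovation", "Archaeology",
                   "Preserving rare species and habitats"],
    "outputs": ["Heritage better managed", "Skills developed",
                "Time volunteered"],
    "outcomes": ["Wider range of people engaged with heritage",
                 "Local economy boosted"],
}

def missing_stages(model):
    """Return the stages of the chain that have no entries yet."""
    required = ("inputs", "activities", "outputs", "outcomes")
    return [stage for stage in required if not model.get(stage)]

print(missing_stages(logic_model))  # -> []
print(missing_stages({"inputs": ["Grant funding"]}))
# -> ['activities', 'outputs', 'outcomes']
```

A check like this is a cheap way to make "develop a theory of change" (see the summary slide) a concrete step in project planning rather than an afterthought.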
SLIDE 13

Types of Evaluation

Process evaluation: to assess the types and quantities of grants delivered, the beneficiaries of those funds, the resources used to deliver the services, the practical problems encountered, and the ways such problems were resolved.

Outcome evaluation: to assess the effectiveness of the programme in producing a lasting difference for heritage and people.

Impact evaluation: to assess longer-term economic and social benefits.

SLIDE 14

How to Evaluate?

Sources:

  • Self-evaluation reports from projects funded
  • Surveys/interviews with stakeholders (longitudinal for impact assessment)
  • Data analysis/desk research
  • Case studies

Method:

Commission and manage independent evaluation

SLIDE 15

“Good” Project Evaluation practice

  • A review of the grantee self-evaluation process and the outcomes achieved by the first 100 projects (92 actual reports)
  • Comparative appraisal of the quality, scope and methodology of the self-evaluated reports, and the type, range and quality of activities and outcomes achieved by completed projects

SLIDE 16

Criteria used to assess and grade evaluation reports

1. The evaluation provides a logical framework setting out linkages between activities, expected outputs and outcomes for all elements of the project (Telling the project story)
2. Appropriate and methodical ways of asking were used which provide robust evidence (Counting, involving, choosing indicators that matter)
3. Data was subject to robust analysis to provide evidence on outcomes, including coverage of well-being as well as demographic, economic, social capital and quality of conservation issues where appropriate (Beyond counting)
4. The evaluation is objective and free from bias (Avoiding bias)
5. The results are clearly and sufficiently presented (Structuring the process of understanding)
6. The conclusions and recommendations are sufficiently clear to enable stakeholders to identify and apply any lessons learned (Improve not just prove)
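A six-criteria rubric like this lends itself to a simple scoring sketch. The 0–3 scale, the criterion short-names and the grade boundaries below are assumptions for illustration; the source does not state how assessors aggregated scores into the very good/good/fair/poor grades.

```python
# Hypothetical rubric scorer for the six criteria above.
# Scale (0-3 per criterion) and grade cut-offs are assumptions
# made for this example, not HLF's actual method.

CRITERIA = [
    "logical framework",      # 1. telling the project story
    "appropriate methods",    # 2. robust ways of asking
    "robust analysis",        # 3. beyond counting
    "objectivity",            # 4. avoiding bias
    "clear presentation",     # 5. structuring understanding
    "actionable lessons",     # 6. improve not just prove
]

def grade(scores):
    """Map per-criterion scores (0-3 each) to an overall grade."""
    unknown = set(scores) - set(CRITERIA)
    if unknown:
        raise ValueError(f"unknown criteria: {unknown}")
    total = sum(scores.get(c, 0) for c in CRITERIA)  # max 18
    if total >= 15:
        return "very good"
    if total >= 11:
        return "good"
    if total >= 6:
        return "fair"
    return "poor"

# A report scoring 2 on every criterion totals 12.
print(grade(dict.fromkeys(CRITERIA, 2)))  # -> good
```

Whatever scale is used, scoring each criterion separately makes it visible which of the six areas (story, methods, analysis, bias, presentation, lessons) drags a report down.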

SLIDE 17

Percentage of evaluation reports by assessed grade

SLIDE 18

Key Findings

  • Projects with evaluation reports scoring poor had a higher median HLF grant and a higher overall median project cost than those whose evaluation reports were scored very good.
  • 1 in 5 reports were less than 10 pages long.
  • Only 17 reports mentioned having evaluation plans.
  • 58% of the reports contained separate sections with some reflection on what lessons had been learnt, but it was often not transparent how the lessons learned would be disseminated.

SLIDE 19

Percentage of evaluation reports scoring very good/good or fair/poor by type of project

[Chart: very good/good vs fair/poor percentages for Intangible Heritage, Historic Buildings, Land and Biodiversity, Museums and Collections, and Industrial Maritime Transport]

  • The category of Museums, Archives and Collections had 37% of its evaluation reports assessed as good or very good.
  • The Land and Biodiversity and Historic Buildings categories both had less than a third of their evaluation reports assessed as very good or good.
  • The Industrial Maritime and Transport category did not have any reports assessed as very good or good. However, it should be noted that this category of projects only included four reports within the sample.

SLIDE 20

The proportion of evaluation reports scored very good/good or fair/poor by type of organisation

  • All of the church and faith group project evaluations were assessed as fair/poor and none as very good/good.
  • About one third of local authority led projects produced evaluation reports which were assessed as good or very good.
  • Almost half the projects led by other organisations in the public sector produced evaluation reports which were assessed as very good or good.
  • Projects led in the community and voluntary sectors had the highest proportion (nearly 60%) of evaluation reports assessed as very good or good.

SLIDE 21

Budget: Main influence on quality

  • 55 of 92 projects applied for evaluation budgets ranging from 5% down to 0.2% of total funding (HLF can contribute 3% of grant to evaluation).
  • The average budget for very good reports was £7k; for poor reports it was £1k.

Average HLF grant for evaluation by evaluation report score

[Chart: average evaluation grant (£0–£7,000) by report score: Very Good, Good, Fair, Poor]

Source: HLF Heritage Programme Database Analysis (92 self-evaluation reports)
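The 3% contribution rule above makes the budget arithmetic easy to check. A minimal sketch, using an illustrative grant size (the £250,000 figure is a hypothetical example, not a real award):

```python
# Worked example of the budgeting rule on this slide: HLF can
# contribute up to 3% of the grant toward evaluation.
# The grant amount below is illustrative only.

def evaluation_budget(grant: float, rate: float = 0.03) -> float:
    """Maximum HLF evaluation contribution at the stated 3% rate."""
    return grant * rate

# A hypothetical £250,000 grant could fund up to £7,500 of evaluation,
# in line with the ~£7k average behind the 'very good' reports.
print(evaluation_budget(250_000))  # -> 7500.0
```

Put the other way round: projects whose reports scored poor averaged £1k of evaluation budget, which at 3% corresponds to grants of only about £33k, so small projects in particular need to budget deliberately.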

SLIDE 22

The proportion of reports assessed as very good/good or fair/poor by whether compiled by the project itself or an independent contractor/advisor

  • Fifteen of the 92 reports and their associated evaluation were carried out by independent consultants. Over 90% of these reports were assessed as very good or good.
  • This compared to less than 25% of the evaluations that were conducted in-house.

SLIDE 23

Examples of good practice or innovation in approach to evaluation framework design

  • Built evaluation data collection in from the start by drawing up an evaluation plan and implementing it during the course of the project
  • Made evaluation data collection as much a part of the project as the delivery of the activities and events
  • Ascribed responsibility for evaluation methods and organising collection to named individual project members or volunteers
  • Asked local research students to support the project and design evaluation methods and questionnaires
  • Organised a half-day evaluation seminar at an early stage of the project to consider methods and success measures, and to develop Key Evaluation Questions, templates and questionnaires.

SLIDE 24

Example of best practice in setting out a theory of change

Source: Archaeology for Communities in the Highland Social Accounts

A critical factor in a report being able to distinguish clearly between outputs and outcomes was the amount and strength of the qualitative evidence that the project had been able to collect from participants, stakeholders and the community on outcome and impact.

SLIDE 25

Examples of innovative techniques in collecting evaluative evidence

  • Royal Cornwall Gallery; Fusiliers Museum – Asked volunteers to undertake spot observations of participants and record time spent and levels of engagement in particular aspects of the exhibition
  • Sandsfoot Castle; Chedworth; Dickens Museum – Collected ratings and comments from the TripAdvisor website
  • A Town Unearthed Folkestone – Installed touchscreens to encourage and facilitate completion of exit questionnaires
  • Regal Cinema Tenbury – Made random telephone calls to local residents to measure community recognition, support and engagement in the project
  • Regal Cinema Tenbury – Conducted a street survey to measure local recognition, support and engagement in the project
  • Cheltenham Everyman Theatre; Lightshaw Meadows – Used the SurveyMonkey website to solicit evidence from participants post-project
  • Green Estate – Conducted non-user surveys and city centre surveys to measure wider levels of awareness of the project

SLIDE 26

Examples of weak exposition of “lessons learned”

  • “The original project plan has proved to be well constructed with only minor alterations”
  • “Be ambitious but also a little more realistic about what can be achieved, and by when.”
  • “Heritage projects can be difficult, demanding and time consuming. Our project was also rewarding, hugely interesting and fun. We are delighted at what has been achieved.”
  • “We learned that involving apprentices and skills training is difficult, and that working with educational organisations is very time-consuming.”
  • “Even in the most unpromising circumstances, significant positive cultural change is possible if well-planned or led, and with the right components”
  • “Given funding challenges, it would have been easy for the project to have stalled or not started without the determination and commitment of the project team.”

SLIDE 27

Examples of better exposition of “lessons learned”

“The project now needs to develop a new strategic framework with clearly defined and agreed priorities, including those relating to collections care, curatorial programming and audience development, staffing and resources, and research. A strategic approach will be essential in order to ensure the gallery’s long-term viability and success and this updated strategy will be the backbone for all future fundraising and activity. It would have been preferable to have defined this long-term strategy after the first year of the project, however, given the already stretched resources, this was not feasible at the time.”

“Projects attempting to involve local schools need to make them the first point of contact well in advance of the start of the project. These contacts seem to have been left until last, anticipating that these would be the easiest group to target, but despite great efforts only two schools were recruited. For future projects it would be worth considering offering a complete package outlining all events, dates and benefits (taking account of Government requirements in terms of EPQs etc) which should be delivered in a one-to-one meeting with relevant teachers.”

SLIDE 28

Summary

  • Develop a theory of change (Logic Model)
  • Think about evaluation from the start and how to collect data
  • Budget adequately for evaluation
  • Commission independent evaluation where possible
  • Ensure reports include lessons learnt that are meaningful

SLIDE 29

Kion Ahadi
Head of Evaluation
Heritage Lottery Fund
7 Holbein Place
London SW1W 8NR
Email: Kion.Ahadi@hlf.org.uk
Telephone: 020 7591 6000
Textphone: 020 7591 6255
www.hlf.org.uk