  1. How to evaluate your project effectively
  Dr. Kion Ahadi – Head of Evaluation
  6 October 2016
  BGEN Training Event – Impact and Engagement: Going Forward with Evaluation

  2. About HLF
  Our strategic aims are to:
  • conserve the UK’s diverse heritage for present and future generations to experience and enjoy;
  • help more people, and a wider range of people, to take an active part in and make decisions about their heritage; and
  • help people to learn about their own and other people’s heritage.

  3. • Largest dedicated funder of heritage in the UK
  • Leading advocate for the value of heritage
  • £7.1bn awarded to over 40,000 projects since 1994
  • £430m to invest this year
  • Offices across the UK
  • Grants from £3,000

  4. Grant Making
  Open programmes:
  • Heritage Grants – grants over £5m; grants £1m – £5m; grants £50,000 – £1m
  Targeted programmes:
  • Kick the Dust
  • Parks for People
  • Landscape Partnerships
  • Skills for the Future
  • Townscape Heritage Initiative
  • Repair Grants for Places of Worship

  5. Cardiff Castle

  6. West Wemyss

  7. Lister Park boat house and lake

  8. Why Evaluate?
  Continuous evaluation helps us learn through:
  • Monitoring – is our strategy heading in the right direction?
  • Justifying – is our programme value for money?
  • Validating – are we making the right funding decisions?
  • Improving – can we improve if we change something?
  • Researching – adding to our body of knowledge.

  9. Programme evaluation is a systematic and objective assessment of an ongoing or completed programme. The aim is to determine the relevance and level of achievement of programme objectives, effectiveness (outcomes evaluation), efficiency (process evaluation), impact and sustainability.

  10. Logic Model – Heritage Grants
  Programme: in FY 2015/16, 111 grant awards were made under this programme, which offers funding of £100,000 – £5 million for heritage projects anywhere in the UK, across: buildings and monuments; community heritage; cultures and memories; industrial, maritime and transport; land and natural heritage; museums, libraries and archives.
  Inputs: grant funding; partnership funding; grantee’s in-kind contributions; HLF mentors.
  Activities (projects do): restoration & renovation; archaeology; preserving rare species and habitats; developing long-lost skills; building …
  Outputs (results achieved immediately after implementing an activity): direct employment; purchases of goods and services; indirect and induced effects; total operational impacts; visitor expenditures.
  Outcomes:
  • For heritage: better managed; better condition; better interpreted and explained; identified/recorded.
  • For people (grantee management, volunteers etc.): skills developed; learnt about heritage; changed attitudes and/or behaviour; enjoyable experience; volunteered time.
  • For communities: environmental impacts will be reduced; quality of life improved; a wider range of people will have engaged with heritage; the local area/community will be a better place to live, work or visit; the local economy will be boosted.
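  A project that wants to keep its logic model in machine-readable form (for example, to drive a data-collection plan) could capture it as plain data. Below is a minimal sketch in Python; the structure and field names are an assumption for illustration, while the content comes from the slide above.

    from dataclasses import dataclass

    @dataclass
    class LogicModel:
        inputs: list[str]        # resources invested in the programme
        activities: list[str]    # what funded projects do
        outputs: list[str]       # results achieved immediately after an activity
        outcomes: dict[str, list[str]]  # longer-term change, by beneficiary group

    # Content transcribed from the Heritage Grants logic model slide.
    heritage_grants = LogicModel(
        inputs=["Grant funding", "Partnership funding",
                "Grantee's in-kind contributions", "HLF mentors"],
        activities=["Restoration & renovation", "Archaeology",
                    "Preserving rare species and habitats",
                    "Developing long-lost skills"],
        outputs=["Direct employment", "Purchases of goods and services",
                 "Indirect and induced effects", "Visitor expenditures"],
        outcomes={
            "heritage": ["Better managed", "Better condition",
                         "Better interpreted and explained", "Identified/recorded"],
            "people": ["Skills developed", "Learnt about heritage",
                       "Changed attitudes and/or behaviour",
                       "Enjoyable experience", "Volunteered time"],
            "communities": ["Environmental impacts reduced", "Quality of life",
                            "Wider engagement with heritage",
                            "Better place to live, work or visit",
                            "Local economy boosted"],
        },
    )
    print(heritage_grants.outcomes["people"])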

  11. Types of Evaluation
  • Process evaluation: to assess the types and quantities of grants delivered, the beneficiaries of those funds, the resources used to deliver the services, the practical problems encountered, and the ways such problems were resolved.
  • Outcome evaluation: to assess the effectiveness of the programme in producing a lasting difference for heritage and people.
  • Impact evaluation: to assess longer-term economic and social benefits.

  12. How to Evaluate?
  Sources:
  • Self-evaluation reports from funded projects
  • Surveys/interviews with stakeholders (longitudinal for impact assessment)
  • Data analysis/desk research
  • Case studies
  Method: commission and manage independent evaluation

  13. “Good” project evaluation practice
  • A review of the grantee self-evaluation process and the outcomes achieved by the first 100 projects (92 actual reports)
  • Comparative appraisal of the quality, scope and methodology of the self-evaluation reports, and of the type, range and quality of activities and outcomes achieved by completed projects

  14. Criteria used to assess and grade evaluation reports (see the scoring sketch after this list)
  1. The evaluation provides a logical framework setting out linkages between activities, expected outputs and outcomes for all elements of the project (Telling the project story)
  2. Appropriate and methodical ways of asking were used which provide robust evidence (Counting, involving, choosing indicators that matter)
  3. Data was subject to robust analysis to provide evidence on outcomes, including coverage of well-being as well as demographic, economic, social capital and quality-of-conservation issues where appropriate (Beyond counting)
  4. The evaluation is objective and free from bias (Avoiding bias)
  5. The results are clearly and sufficiently presented (Structuring the process of understanding)
  6. The conclusions and recommendations are sufficiently clear to enable stakeholders to identify and apply any lessons learned (Improve, not just prove)
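  As a sketch of how these six criteria could be applied as a scoring rubric: the 0–3 scale, the thresholds and the grade_report function below are assumptions for illustration, not HLF’s actual grading method; only the criterion names and the four grades come from the review.

    # Hypothetical rubric: rate a report 0-3 on each criterion and map the
    # mean score to the four grades used in the review of the 92 reports.
    CRITERIA = [
        "Telling the project story",
        "Counting, involving, choosing indicators that matter",
        "Beyond counting",
        "Avoiding bias",
        "Structuring the process of understanding",
        "Improve, not just prove",
    ]

    def grade_report(scores: dict[str, int]) -> str:
        """Map per-criterion scores (0-3) to an overall grade (assumed thresholds)."""
        mean = sum(scores[c] for c in CRITERIA) / len(CRITERIA)
        if mean >= 2.5:
            return "Very Good"
        if mean >= 2.0:
            return "Good"
        if mean >= 1.0:
            return "Fair"
        return "Poor"

    example = {c: 2 for c in CRITERIA}   # a report scoring 2 on every criterion
    print(grade_report(example))         # -> "Good"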

  15. Percentage of evaluation reports by assessed grade

  16. Key Findings
  • Projects whose evaluation reports scored poor had a higher median HLF grant and a higher overall median project cost than those whose evaluation reports scored very good.
  • 1 in 5 reports was less than 10 pages long.
  • Only 17 reports mentioned having evaluation plans.
  • 58% of the reports contained separate sections with some reflection on what lessons had been learnt, but it was often not transparent how the lessons learned would be disseminated.

  17. Percentage of evaluation reports scoring very good/good or fair/poor by type of project
  • The Museums, Archives and Collections category had 37% of its evaluation reports assessed as good or very good.
  • The Land and Biodiversity and Historic Buildings categories both had less than a third of their evaluation reports assessed as very good or good.
  • The Industrial, Maritime and Transport category did not have any reports assessed as very good or good. However, this category of project only included four reports within the sample.
  [Chart: very good/good vs fair/poor shares for Museums and Collections, Industrial Maritime Transport, Land and Biodiversity, Historic Buildings and Intangible Heritage; scale 0% – 100%]

  18. The proportion of evaluation reports scored very good/good or fair/poor by type of organisation
  • All of the church and faith group project evaluations were assessed as fair/poor and none very good/good.
  • About one third of local authority led projects produced evaluation reports which were assessed as good or very good.
  • Almost half the projects led by other organisations in the public sector produced evaluation reports which were assessed as very good or good.
  • Projects led in the community and voluntary sectors had the highest proportion (nearly 60%) of evaluation reports assessed as very good or good.

  19. Budget: Main influence on quality
  [Chart: average HLF grant for evaluation by evaluation report score – Very Good, Good, Fair, Poor; scale £0 – £7,000]
  • 55 of 92 projects applied for evaluation budgets of between 0.2% and 5% of total funding (HLF can contribute 3% of grant to evaluation)
  • The average budget for very good reports was £7k; for poor reports it was £1k
  Source: HLF Heritage Programme Database Analysis (92 self-evaluation reports)
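  A small worked example of what these figures mean at a typical grant size. The £400,000 grant below is a made-up figure for illustration; the 3% cap and the 0.2% – 5% observed range come from the slide.

    # Worked example: the 3% evaluation contribution in practice.
    grant = 400_000  # hypothetical HLF grant

    # HLF can contribute up to 3% of the grant to evaluation.
    max_contribution = 0.03 * grant
    print(f"Up to £{max_contribution:,.0f} of a £{grant:,} grant")
    # -> Up to £12,000 of a £400,000 grant

    # Observed evaluation budgets ranged from 0.2% to 5% of total funding;
    # at this grant size that would be £800 to £20,000.
    low, high = 0.002 * grant, 0.05 * grant
    print(f"Observed range at this size: £{low:,.0f} - £{high:,.0f}")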

  20. The proportion of reports assessed as very good/good or fair/poor by whether compiled by the project itself or an independent contractor/advisor
  • Fifteen of the 92 reports and their associated evaluations were carried out by independent consultants. Over 90% of these reports were assessed as very good or good.
  • This compared to less than 25% of the evaluations that were conducted in-house.

  21. Examples of good practice or innovation in approach to evaluation framework design
  • Built evaluation data collection in from the start by drawing up an evaluation plan and implementing it during the course of the project
  • Made evaluation data collection as much a part of the project as the delivery of the activities and events
  • Ascribed responsibility for evaluation methods and organising collection to named individual project members or volunteers
  • Asked local research students to support the project and design evaluation methods and questionnaires
  • Organised a half-day evaluation seminar at an early stage of the project to consider methods and success measures, and to develop key evaluation questions, templates and questionnaires.

  22. Example of best practice in setting out a theory of change
  A critical factor in a report being able to distinguish clearly between outputs and outcomes was the amount and strength of the qualitative evidence that the project had been able to collect from participants, stakeholders and the community on outcomes and impact.
  Source: Archaeology for Communities in the Highland Social Accounts
