  1. Doing Program Evaluation Sue Dyson The Australian Research Centre in Sex, Health & Society, La Trobe University

  2. Aims Aims • This workshop will: – Provide information about evaluation approaches, including logic models – Provide opportunities to think about approaches to self-evaluation and practice developing program logic models.

  3. Why evaluate? • Increased external demands for accountability • To understand what works and what does not work in a program, and why. • To learn from mistakes and build on strengths. • To inform future planning.

  4. Defining Evaluation • Evaluation is the process by which we judge the worth or value of something (Suchman, 1967). • … a critical component of every effective program … one step of an ongoing process of planning, implementation and review. It is a way of checking that a program is delivering the results that it set out to achieve (PADV, 2000). • … a continuous process of asking questions, reflecting on the answers to these questions and reviewing your ongoing strategy and action (National Mental Health Promotion and Prevention Working Party, 2001).

  5. Evaluating social programs • Full range of research methodologies available • Takes place within a political and organizational context. • Influenced by external factors which cannot always be accounted for or measured using ‘objective’ measures • Requires skills including sensitivity to multiple stakeholders and awareness of political context. • Formative (process), Summative (outcomes), Impact

  6. Formative Evaluation • A method of judging the worth of a program while the program activities are forming or happening. • Focuses on processes as they develop. • Aims to understand the development and implementation of a project.

  7. Formative evaluation data • Key stakeholders engage in regular reflection • Interviews (participants, key informants) • Focus groups • Forums and discussion groups • Observations from fieldwork • Case studies

  8. Summative Evaluation • Examines whether targets have been achieved. • Information from a summative evaluation may include: – Numbers, Knowledge and attitude changes, Short-term or intermediate behaviour shifts, Policies initiated or other institutional changes, Resources produced.

  9. Summative evaluation data • Surveys (baseline and post-intervention) • Metrics • Interviews, focus groups • Observation • Any of the evaluation tools or methods available to answer your evaluation questions.
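A minimal sketch of how baseline and post-intervention survey data might be summarised for a summative evaluation. The scores and variable names here are invented for illustration; they are not from the workshop.

```python
# Illustrative only: comparing mean scores on the same survey items
# before and after a program. All numbers below are made up.

baseline = [3, 4, 2, 5, 3]   # e.g. knowledge scores before the program
post = [4, 5, 4, 5, 4]       # scores on the same items afterwards

def mean(scores):
    """Average of a list of scores."""
    return sum(scores) / len(scores)

change = mean(post) - mean(baseline)
print(f"Mean change: {change:+.1f}")  # a positive value means scores rose
```

A shift like this shows whether targets moved, but, as the slides note, the numbers alone do not explain why; they are one input alongside interviews and observation.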

  10. Impact Evaluation • Assesses intended and unintended changes as a result of an intervention • Usually some time after it occurred. • Seeks to understand whether changes have been sustained. • Impact evaluations seek to answer cause-and-effect questions: are outcomes directly attributable to the program?

  11. Evaluation and ethics • Ethical evaluation is based on a relationship of trust, mutual responsibility and ethical equality in which respect is central. • Should view the individuals involved as ‘participants’ rather than ‘subjects’. • Should not impose a burden on participants. • Should obtain informed consent. • Should respect and protect participant privacy and ensure anonymity as far as possible. • Should not expose the evaluator to risk.

  12. Principles of ethical research • Research should have merit • Be just • Must do no harm or cause discomfort to the participants or the wider community. • National Health and Medical Research Council guidelines: http://www.nhmrc.gov.au/publications/synopses/_files/e72.pdf

  13. It’s about more than metrics! • Effective program evaluation does more than collect, analyze and provide quantitative data. • It makes it possible for program stakeholders to gather and use information, and to learn continually about and improve their programs. • It tells the story behind the project and what was learned along the way. • It is important to document learning about things that do not work, and why.

  14. Internal Evaluation • You can do it yourself • You should use it with every project • Helps to keep your project on track • Helps you acquire data to report to funding bodies or apply for further funding • Sometimes seen as lacking rigour.

  15. External Evaluation • Conducted by a researcher who is skilled and experienced in research and evaluation methods • Rigour implicit • Ethics approval • Objective outsider as opposed to subjective insider • Fourth-generation evaluation contributes to continuous improvement as the project develops

  16. Internal vs. external evaluation

  Internal                             External
  Uses expertise of project workers    Loses expertise of project workers
  Reflexive and responsive             May be less reflexive
  Less expensive                       More expensive
  Sometimes seen as subjective         Seen as credible & objective
  Ethical practice not monitored       Ethics approval required (for universities)

  17. Logic Models • A conceptual map for a program, project or intervention • A framework for action • A program theory, a theory of change • A way of graphically depicting what you want to achieve and how you will do it. • Should provide logic and clarity by presenting the big picture of planned changes along with the details of how you will get there.

  18. Theory • A set of statements or principles that explain a group of facts, that has been tested or is widely accepted and can be used to make predictions about phenomena. • Abstract reasoning; speculation: a decision based on experience rather than theory. • A principle that guides action or assists comprehension or judgment: staked out the house on the theory that criminals usually return to the scene of the crime. • An assumption based on limited information or knowledge; a conjecture.

  19. Theory of change • Building blocks required to bring about a given long-term goal • Specific and measurable description of a social change initiative that forms the basis for strategic planning • Should be clear on long-term goals, identify measurable indicators of success, and formulate actions to achieve goals (objectives).

  20. Key points • Clarify language and use it consistently • Start with goals • Link intended effects with goals • Be prepared for unexpected effects • There is no ‘correct’ way of depicting a logic model

  21. Program or intervention goal • Inputs/resources: Staff, Volunteers, Time, Money, Evidence base, Resources, Equipment, Technology • Outputs/Activities – What we do: Workshops, meetings, Services, Products, Training, Facilitate, Partner, etc. – Who we reach: Clients, Agencies, Decision makers, Service users, Partners • Outcomes/impact – Short term (Learning): Knowledge, Attitudes, Skills, Awareness, Opinions, Aspirations – Medium term (Behaviour): Practices, Decision-making, Policies, Social actions – Long term: Sustained behaviour change; Social/economic/civic conditions; Cultural change; Sustained knowledge • Underpinned by assumptions and by context or conditions (external factors)
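The template in slide 21 can also be kept as a simple data structure, which makes it easy to revise as the model evolves. This is an illustrative sketch only: the field names and example entries are assumptions for the exercise, not part of the workshop material.

```python
# Illustrative sketch: the slide's logic-model template as a plain dict.
# All example entries below are invented placeholders.

logic_model = {
    "goal": "Build year 10 students' capacity for respectful relationships",
    "inputs": ["Staff", "Volunteers", "Time", "Money", "Evidence base"],
    "activities": {
        "what_we_do": ["Workshops", "Meetings", "Training"],
        "who_we_reach": ["Students", "Teachers", "Decision makers"],
    },
    "outcomes": {
        "short": ["Knowledge", "Attitudes", "Skills"],
        "medium": ["Practices", "Decision-making", "Policies"],
        "long": ["Sustained behaviour change"],
    },
    "assumptions": ["Students attend all workshops"],
    "external_factors": ["School timetabling", "Funding cycles"],
}

def check_model(model):
    """Return the names of empty sections -- a completeness reminder,
    not a guarantee that the model is logical or plausible."""
    return [key for key, value in model.items() if not value]

print(check_model(logic_model))  # [] -> every section has at least one entry
```

As the limitations slide warns, a filled-in structure is no guarantee of logic; it only keeps the review-and-revise cycle visible.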

  22. When to use a logic model • During planning • During program/project implementation • During staff/stakeholder orientation • During evaluation

  23. Limitations of logic modelling • Can be time-consuming and do not account for unintended consequences • Can become onerous – needs a balance between complexity and oversimplification • No guarantee of logic (or of success); must be plausible and feasible • Must be a work in progress: a continuous process of intentional review and revision.

  24. Benefits of logic models • They integrate planning, implementation and evaluation • Prevent mismatches between activities and effects • Leverage the power of partnerships • Enhance accountability

  25. Evaluation Resources • Community Tool Box: http://ctb.ku.edu/en/table-of-contents/overview/model-for-community-change-and-improvement • Better Evaluation: http://betterevaluation.org/

  26. Program goal • Goal: To build the capacity and skills of students in year 10 to – Identify behaviours associated with respectful relationships – Identify attitudes and behaviours that underpin and perpetuate gender inequality

  27. Task • Working in a group, start to develop a logic model to plan for program implementation and evaluation. • Feel free to change the goal, but don’t spend too much time on this. • ‘What you will do’ represents the actions, or objectives, of the project: – Should be related to achieving the goal – Outcomes should relate to the indicators for change.

  28. Proxy indicators • Community
