Doing Program Evaluation - Sue Dyson, The Australian Research Centre in Sex, Health & Society, La Trobe University (PowerPoint presentation)



SLIDE 1

Doing Program Evaluation Sue Dyson

The Australian Research Centre in Sex, Health & Society, La Trobe University

SLIDE 2

Aims

  • This workshop will:

– Provide information about evaluation approaches, including logic models
– Provide opportunities to think about approaches to self-evaluation and practice developing program logic models.

SLIDE 3

Why evaluate?

  • Increased external demands for accountability.
  • To understand what works and what does not work in a program, and why.
  • To learn from mistakes and build on strengths.
  • To inform future planning.
SLIDE 4

Defining Evaluation

  • Evaluation is the process by which we judge the worth or value of something (Suchman, 1967).
  • …a critical component of every effective program…one step of an ongoing process of planning, implementation and review. It is a way of checking that a program is delivering the results that it set out to achieve (PADV, 2000).
  • …a continuous process of asking questions, reflecting on the answers to these questions and reviewing your ongoing strategy and action (National Mental Health Promotion and Prevention Working Party, 2001).

SLIDE 5

Evaluating social programs

  • Full range of research methodologies available.
  • Takes place within a political and organizational context.
  • Influenced by external factors which cannot always be accounted for or measured using ‘objective’ measures.
  • Requires skills including sensitivity to multiple stakeholders and awareness of political context.
  • Formative (process), summative (outcomes), impact.

SLIDE 6

Formative Evaluation

  • A method of judging the worth of a program while the program activities are forming or happening.
  • Focuses on the processes as they develop.
  • Aims to understand the development and implementation of a project.

SLIDE 7

Formative evaluation data

  • Key stakeholders engage in regular reflection
  • Interviews (participants, key informants)
  • Focus groups
  • Forums and discussion groups
  • Observations from fieldwork
  • Case studies
SLIDE 8

Summative Evaluation

  • Examines whether targets have been achieved.
  • Information from a summative evaluation may include:

– Numbers
– Knowledge and attitude changes
– Short-term or intermediate behaviour shifts
– Policies initiated or other institutional changes
– Resources produced.

SLIDE 9

Summative evaluation data

  • Surveys (baseline and post-intervention)
  • Metrics
  • Interviews, focus groups
  • Observation
  • Any of the evaluation tools or methods available to answer your evaluation questions.

SLIDE 10

Impact Evaluation

  • Assesses intended and unintended changes as a result of an intervention, usually some time after it occurred.
  • Seeks to understand whether changes have been sustained.
  • Impact evaluations seek to answer cause-and-effect questions: are outcomes directly attributable to a program?

SLIDE 11

Evaluation and ethics

  • Ethical evaluation is based on a relationship of trust, mutual responsibility and ethical equality in which respect is central.
  • Should view the individuals involved as ‘participants’ rather than ‘subjects’.
  • Should not impose a burden on participants.
  • Should obtain informed consent.
  • Should respect and protect participant privacy and ensure anonymity as far as possible.
  • Should not expose the evaluator to risk.
SLIDE 12

Principles of ethical research

  • Research should have merit
  • Be just
  • Must do no harm or cause discomfort to the participants or the wider community.
  • National Health and Medical Research Council guidelines: http://www.nhmrc.gov.au/publications/synopses/_files/e72.pdf

SLIDE 13

It’s about more than metrics!

  • Effective program evaluation does more than collect, analyze and provide quantitative data.
  • It makes it possible for program stakeholders to gather and use information, to learn continually about programs, and to improve them.
  • To tell the story behind the project and what was learned along the way.
  • Important to document learning about things that do not work and why.

SLIDE 14

Internal Evaluation

  • You can do it yourself
  • You should use it with every project
  • Helps to keep your project on track
  • Helps you acquire data to report to funding bodies or apply for further funding
  • Sometimes seen as lacking rigour.
SLIDE 15

External Evaluation

  • Researcher who is skilled & experienced in research and evaluation methods
  • Rigour implicit
  • Ethics approval
  • Objective outsider as opposed to subjective insider
  • 4th generation evaluation contributes to continuous improvement as the project develops

SLIDE 16

Internal vs. external evaluation

Internal                           | External
Uses expertise of project workers  | Loses expertise of project workers
Reflexive and responsive           | May be less reflexive
Less expensive                     | More expensive
Sometimes seen as subjective       | Seen as credible & objective
Ethical practice not monitored     | Ethics approval required (for universities)

SLIDE 17

Logic Models

  • A conceptual map for a program, project or intervention
  • A framework for action
  • A program theory, a theory of change
  • A way of graphically depicting what you want to achieve and how you will do it.
  • Should provide logic and clarity by presenting the big picture of planned changes along with the details of how you will get there.

SLIDE 18

Theory

  • A set of statements or principles that explains a group of facts, has been tested or widely accepted, and can be used to make predictions about phenomena.
  • Abstract reasoning; speculation: a decision based on experience rather than theory.
  • A principle that guides action or assists comprehension or judgment: staked out the house on the theory that criminals usually return to the scene of the crime.
  • An assumption based on limited information or knowledge; a conjecture.
SLIDE 19

Theory of change

  • Building blocks required to bring about a given long-term goal
  • Specific and measurable description of a social change initiative that forms the basis for strategic planning
  • Should be clear on long-term goals, identify measurable indicators of success, and formulate actions to achieve goals (objectives).

SLIDE 20

Key points

  • Clarify language and use it consistently
  • Start with goals
  • Link intended effects with goals
  • Be prepared for unexpected effects
  • There is no ‘correct’ way of depicting a logic model

SLIDE 21
SLIDE 22

Program or intervention goal

  • Inputs/resources: staff, volunteers, time, money, evidence base, equipment, technology, partners.
  • Outputs/Activities:

– What we do: workshops, meetings, services, products, training, resources, facilitate, partner
– Who we reach: clients, agencies, decision makers, service users, etc.

  • Outcomes/impact:

– Short term: learning – knowledge, attitudes, skills, awareness, opinions, aspirations
– Medium term: behaviour, practices, decision-making, policies, social actions, sustained knowledge
– Long term: sustained behaviour change; social/economic/civic conditions; cultural change

  • Assumptions; context or conditions (external factors).
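The slide above maps a goal to inputs, activities, and short-, medium- and long-term outcomes. As a rough sketch only (the class and field names below are my own illustration, not part of the workshop materials), that structure can be written down as a small data model, which makes the completeness check mechanical:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One logic model: a goal, what goes in, what is done, what changes.
    Hypothetical structure mirroring the slide's columns."""
    goal: str
    inputs: list[str]                # resources: staff, time, money, evidence base...
    activities: list[str]            # outputs: what we do and who we reach
    outcomes: dict[str, list[str]]   # keyed by "short", "medium", "long"
    assumptions: list[str] = field(default_factory=list)
    context: list[str] = field(default_factory=list)  # external factors

    def is_complete(self) -> bool:
        """A plausible model needs a goal, at least one input, one activity,
        and at least one outcome at each time horizon."""
        return bool(self.goal and self.inputs and self.activities
                    and all(self.outcomes.get(t) for t in ("short", "medium", "long")))

# Example populated from the workshop's sample goal (slide 28):
model = LogicModel(
    goal="Build year-10 students' capacity to identify respectful-relationship behaviours",
    inputs=["Staff", "Time", "Evidence base"],
    activities=["Classroom workshops", "Teacher training"],
    outcomes={
        "short": ["Increased awareness of respectful behaviours"],
        "medium": ["Changed attitudes and practices"],
        "long": ["Sustained behaviour change"],
    },
    assumptions=["Schools will allocate class time"],
    context=["School policy environment"],
)
print(model.is_complete())  # True: every column of the logic model is filled in
```

The point of the exercise is not the code itself but the discipline it encodes: every outcome horizon must be linked back to the goal through at least one activity and input.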

SLIDE 23

When to use a logic model

  • During planning
  • During program/project implementation
  • During staff/stakeholder orientation
  • During evaluation
SLIDE 24

Limitations of logic modelling

  • Can be time consuming and do not account for unintended consequences
  • Can become onerous – needs a balance between complexity and over-simplification
  • No guarantee of logic (or of success); must be plausible & feasible
  • Must be a work in progress: a continuous process of intentional review and revision.

SLIDE 25

Benefits of a logic model

  • They integrate planning, implementation and evaluation
  • Prevent mismatches between activities and effects
  • Leverage the power of partnerships
  • Enhance accountability
SLIDE 26

Evaluation Resources

  • Community Toolbox: http://ctb.ku.edu/en/table-of-contents/overview/model-for-community-change-and-improvement
  • Better Evaluation: http://betterevaluation.org/

SLIDE 27
SLIDE 28

Program goal

  • Goal: To build the capacity and skills of students in year 10 to:

– Identify behaviours associated with respectful relationships
– Identify attitudes and behaviours that underpin and perpetuate gender inequality

SLIDE 29

Task

  • Working in a group, start to develop a logic model to plan for program implementation and evaluation.
  • Feel free to change the goal, but don’t spend too much time on this.
  • ‘What you will do’ represents the actions, or objectives, of the project:

– Should be related to achieving the goal
– Outcomes should relate to the indicators for change.

SLIDE 30

Proxy indicators

  • Community