

SLIDE 1

program evaluation: you don’t need a phd to do it!

Sara Corwin, MPH, PhD

University of South Carolina Arnold School of Public Health Department of Health Promotion, Education and Behavior

SLIDE 2

“It’s really not that effective, but it’s easy to store.”

SLIDE 3

session objectives

At the end of the session, participants will be able to:

1. define common program evaluation terms
2. explain the use of logic models in developing program evaluation strategies
3. clarify the differences between process, impact, and outcome evaluation levels
4. describe frequently used evaluation tools for each level of program evaluation
5. provide suggestions for low-cost methods of program evaluation that can be implemented in their setting
6. identify ways to use evaluation information (results) to improve their health promotion, wellness, and disease management programs

SLIDE 4

reasons why you can do it!

  • gazillions of resources out there
  • you have resources at your fingertips
  • others have done it before
  • it isn’t rocket science, just being systematic & organized
  • you are probably already doing it
  • it’s nothing more than gathering information & making decisions

SLIDE 5

why evaluate?

SLIDE 6

before planning an evaluation

a wellness program should have:

  • a clear mission/vision
  • a comprehensive, well-planned approach
  • reasonable, achievable goals & objectives, guided by a program LOGIC MODEL
  • resources (including personnel!) that match
  • a realistic timeline
  • folks with a sense of humor!
SLIDE 7

a logic what?

a LOGIC MODEL:

  • describes the main elements of an intervention & how they all work together to prevent, reduce, or remediate a health issue
  • is displayed as a flow chart, map, or table
  • outlines sequential steps that lead to desired outcomes
  • can contain inputs, throughputs, outputs
  • shows linkages
SLIDE 8

July 20, 2006 31st Annual National Wellness Conference

a typical logic model

SLIDE 9

a ‘real’ logic model

By conducting a defined set of activities in the worksite, we will modify identified risk and protective factors, in order to change employee behavior(s), resulting in improved employee health & organizational outcomes.
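As a rough illustration, the worksite logic model can be written down as a plain data structure. The wording comes from the slide; the list-of-pairs representation itself is just a hypothetical sketch:

```python
# The worksite logic model as a plain list of (connector, element) pairs;
# the wording comes from the slide, the structure is only a sketch.
logic_model = [
    ("by conducting", "a defined set of activities in the worksite"),
    ("we will modify", "identified risk and protective factors"),
    ("in order to", "change employee behavior(s)"),
    ("resulting in", "improved employee health & organizational outcomes"),
]

# Render the chain as a one-line flow diagram.
flow = " -> ".join(element for _, element in logic_model)
print(flow)
```

Writing the model down this way makes the sequential linkages explicit, which is the whole point of a logic model.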

SLIDE 10

http://www.wkkf.org/pubs/tools/evaluation/pub3669.pdf

SLIDE 11

awesome!

FREE online logic model design course: University of Wisconsin–Extension: http://www.uwex.edu/ces/lmcourse/

SLIDE 12

why does it matter?

  • useful in building consensus among stakeholders
  • clarifies the “etiology” of the health condition & related behavioral influences
  • helps identify programmatic activities & evaluation strategies **
  • identifies priorities **
  • sometimes you are required to have one
  • first step in the planning process
SLIDE 13

review: program planning

  • understand community & engage participants
  • assess needs
  • set goals & objectives
  • develop intervention
  • implement
  • evaluate

McKenzie J. and Smeltzer J. (1997). Planning, Implementing, and Evaluating Health Promotion Programs: A Primer, 2nd Edition Allyn and Bacon.

SLIDE 14

getting on with it …

what is evaluation?

  • determining the value of what you’ve done (WELCOA, 2005)
  • assigning value AND making judgments about:
    • Merit (i.e., quality)
    • Worth (i.e., cost-effectiveness)
    • Significance (i.e., importance) (CDC, 2005)
SLIDE 15

program evaluation levels

Process — “Are we doing what we said we would?”: how much, for whom, when, by whom? QAR/CQI/QI (integrity).

Impact — effectiveness measures: change in K, A, B/skills & org’l changes; must occur before outcomes.

Outcome — morbidity, mortality, employee outcomes: improved health, wellness & quality of life for employees.

Both formative & summative approaches, and both qualitative & quantitative methods, apply across these levels.

SLIDE 16

according to whom? what?

“acceptable standards of quality”:

  • utility – usefulness?
  • feasibility – can you really do it?
  • propriety – legal, ethical?
  • accuracy – valid, reliable?
  • your organization’s principles, priorities
  • others?

The Joint Committee on Standards for Educational Evaluation (1994). The Program Evaluation Standards. Thousand Oaks, CA: Sage Publications, Inc. http://www.wmich.edu/evalctr/jc/

SLIDE 17

steps…

University of Wisconsin Extension. (June 2005). Documenting outcomes in tobacco control programs. University of Wisconsin Cooperative Extension, Program Development and Evaluation, Madison, WI. Retrieved online July 9, 2006 at http://www.uwex.edu/ces/pdande/

SLIDE 18

or these steps …

Centers for Disease Control and Prevention. Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11). Retrieved online July 9, 2006 at http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm#fig1

SLIDE 19

3. collect information (“data”)

2 types:

  • primary – data that you collect
  • secondary – data that already exists

you’ll need:

  • baseline data – “before” information
  • follow-up data – “after” information
  • comparison group – non-participants
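A toy example of how baseline, follow-up, and comparison-group data fit together; all the names and numbers here are invented:

```python
# Hypothetical weekly step-count averages; all numbers are invented.
participants = {"baseline": 4200, "follow_up": 5600}   # program participants
comparison   = {"baseline": 4300, "follow_up": 4400}   # non-participants

change_participants = participants["follow_up"] - participants["baseline"]
change_comparison   = comparison["follow_up"] - comparison["baseline"]

# Subtracting the comparison group's change gives a rough sense of the
# change attributable to the program rather than to outside trends.
program_effect = change_participants - change_comparison
print(program_effect)
```

Without the comparison group, you could not tell how much of the participants’ improvement would have happened anyway.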
SLIDE 20

an academic moment …

In standard research-design shorthand, R = random assignment, NR = non-random assignment, O = observation/measurement, X = intervention:

O1 X O2   (one group, pretest–posttest)

NR O1 X O2
- - - - - - - - - - -
NR O1 O2   (non-equivalent comparison group)

R O1 X O2
R O1 O2   (randomized pretest–posttest control group)

R O1 O2 X O3 O4
R O1 O2 O3 O4   (randomized design with multiple observations before & after)
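One way to read this shorthand (R = random assignment, NR = non-random assignment, O = observation/measurement, X = intervention) is sketched below; the `describe_row` helper is purely illustrative, not a standard tool:

```python
# In this shorthand, R = random assignment, NR = non-random assignment,
# O = a numbered observation/measurement, X = the intervention.
# `describe_row` is a purely illustrative helper, not a standard tool.
def describe_row(row: str) -> dict:
    tokens = row.split()
    return {
        "randomized": tokens[0] == "R",
        "observations": sum(t.startswith("O") for t in tokens),
        "gets_intervention": "X" in tokens,
    }

# The treatment group of a randomized pretest-posttest design:
print(describe_row("R O1 X O2"))
```

Reading each row this way makes it easy to see what distinguishes the designs: whether assignment was random, how many measurements occur, and which group receives the intervention.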

SLIDE 21

data sources

  • multiple stakeholder perspectives must be assessed!
  • who are key & potential collaborating program stakeholders that have information we need/want?
  • what do we want to know? (must be clear)
  • develop a data collection plan utilizing formal & informal methods

SLIDE 22

primary data collection

type of information (data) collected:

  • qualitative
  • quantitative

collect information (data) from:

  • individuals
  • groups
SLIDE 23

qualitative vs. quantitative

qualitative:
  • text based
  • narrow, deep information
  • planning time “light”
  • analysis time “heavy”
  • no statistical tests
  • less generalizable
  • growing acceptance

quantitative:
  • numbers based
  • broad, shallow information
  • planning time “heavy”
  • analysis time “light”
  • statistical tests apply
  • more generalizable
  • well accepted
SLIDE 24

common individual level methods

  • “single step,” “cross-sectional” surveys (online, paper-pencil, email)

  • interviews:
  • mail
  • email
  • web
  • face-to-face
  • phone
  • Examples of what you have done?
SLIDE 25

common group level methods

  • focus group
  • nominal group
  • community forum
  • participant observation
SLIDE 26

how to decide?

depends upon:

  • type of information you’ll need
  • respondent characteristics
  • $$$$$ available
  • time frame
  • staff available & skill sets
  • work load issues
SLIDE 27

what to evaluate?

tied to your plan’s goals & objectives !

  • K, A, B
  • “risk factors”
  • employee satisfaction with programs, services, incentives

  • participation rates
  • costs
  • organizational culture
  • social, environmental context
SLIDE 28

tools (“instruments”)

  • don’t re-invent the wheel!
  • adapt existing instruments/questions
  • keep your purpose/focus in mind
  • know when to use open- & closed-ended items
  • watch for biased, leading questions
  • know your respondents (cultural sensitivity, literacy levels, etc.)

SLIDE 29

general data collection tips

  • before you start: obtain human subjects protection (IRB, HIPAA, etc.)
  • get input from others
  • always pilot test (do a trial run)
  • use more than 1 method
  • combine qualitative & quantitative approaches

SLIDE 30

after collecting data, what next?

4. analyze (& interpret)

Quantitative

  • keep focused on the purpose – what questions are you answering?
  • keep your audience in mind – who will get the information & in what format?
  • keep it simple
  • summary information, not sophisticated statistics

SLIDE 31

quantitative analysis

  • compile the data – get it all together in 1 place
  • review surveys for incomplete or missing answers (to use or not to use?)
  • hand tally or enter answers into a computer?
  • Microsoft Excel, EpiInfo (free from CDC: http://www.cdc.gov/epo/epi/epiinfo.htm)
  • double check data entry or hand counts
SLIDE 32

quantitative analysis

  • type of data “out” depends on type of questions you ask
  • levels of measurement determine how you “run” the data entered in the computer
  • typically use descriptive statistics, i.e., counts (frequencies), percents (%), averages (means), SD, range
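A minimal sketch of these descriptive statistics using only the Python standard library; the survey answers are hypothetical:

```python
import statistics

# Hypothetical answers to "days exercised last week"; "" marks a survey
# returned with this item blank (an incomplete or missing answer).
raw = ["3", "5", "", "2", "5", "4", ""]

# Screen out the missing answers before "running" the data.
values = [int(v) for v in raw if v != ""]

print("count:", len(values))                          # frequency of usable answers
print("% complete:", round(100 * len(values) / len(raw), 1))
print("mean:", statistics.mean(values))               # average
print("SD:", round(statistics.stdev(values), 2))      # standard deviation
print("range:", max(values) - min(values))
```

Spreadsheet tools like Excel or EpiInfo give you the same summary numbers; the point is that these measures need nothing more sophisticated.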

SLIDE 33

same for … qualitative data

  • keep focused on the purpose – what questions are you answering?
  • keep your audience in mind – who will get the information & in what format?
  • usually conduct a “debrief” after the session to discuss what occurred & identify initial results

SLIDE 34

qualitative analysis

  • type notes or transcribe audio tapes verbatim in Microsoft Word
  • print out documents to be sure names & identifying information are removed
  • hand “code” or “import” document files into another computer program
  • numerous qual software programs out there
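Hand tallying codes can be as simple as counting labels; the excerpts and code names below are invented for illustration:

```python
from collections import Counter

# Hypothetical focus-group excerpts that have already been hand-labeled;
# both the excerpts and the code names are invented for illustration.
coded_segments = [
    ("no time to exercise at lunch", "barrier:time"),
    ("the walking group kept me motivated", "facilitator:social"),
    ("the gym is too far from the office", "barrier:access"),
    ("my team walks together after meetings", "facilitator:social"),
]

# Tally how often each code appears across the transcript.
code_counts = Counter(code for _, code in coded_segments)
for code, n in code_counts.most_common():
    print(code, n)
```

Dedicated qualitative software automates this kind of coding and retrieval, but the underlying idea is no more than labeling segments and counting patterns.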

SLIDE 35

reporting results

  • decide on format (report, slides, fact sheet, press release, presentation)
  • include sections with headings:
    • executive summary (write last)
    • background/introduction
    • methods/procedures
    • results/findings
    • conclusions/recommendations
  • use your initial questions to frame your results (address the purpose)

SLIDE 36

reporting results

  • use simple tables, figures, diagrams, bar charts
  • focus on tying your results to program planning & implementation … HERE’s the POINT: what does it all mean in terms of your program?

SLIDE 37

5. utilize the results

  • give “need to know” findings to stakeholders (dissemination) as soon as possible
  • keep confidentiality in the forefront (use sensitivity & avoid judgments/labels about people)
  • seek ways to use the information to tailor & adapt intervention activities
  • combine your results with multiple, “mixed” methods

SLIDE 38

common challenges & solutions

  • the primary & secondary data don’t “jibe”
  • administration/board/stakeholders don’t “see it” the way you, your team, and/or your participants “see it”
  • creating a “perceived need” within your intended participant population
  • lack of resources, skill areas to “pull it off”
  • getting “bogged down” in the data collection, analysis, reporting process

SLIDE 39

“real world” potential barriers

  • many interest groups often involved
  • evaluation is political!
  • hidden agendas?
  • organizations are complex
  • resource constraints
  • sustainability issues
  • volunteer staff

SLIDE 40

goes back to…

  • a good, realistic plan
  • with the right stakeholders
  • a champion (in a position of power)
  • dedicated resources
  • understanding that it takes time and,
  • human health behavior is complex!
SLIDE 41

where to get resources

  • reputable websites
  • qualified stakeholders
  • qualified community members
  • universities & colleges
  • national centers & organizations
  • private consultants ($$$)
SLIDE 42

resources

Polacsek, M., O’Brien, L.M., Lagasse, W., & Hammar, N. (2006, Jul). Move & Improve: a worksite wellness program in Maine. Prev Chronic Dis [serial online]. Available from: http://www.cdc.gov/pcd/issues/2006/ul/05_0123.htm

http://www.arkansas.gov/ha/worksite_wellness/index.html

Chapman, L.S. (1996). Planning Wellness: Getting Off to a Good Start. Seattle, WA: Summex Corporation.

Green, L.W., & Kreuter, M.W. (1991). Health Promotion Planning: An Educational and Environmental Approach (2nd ed.). Mountain View, CA: Mayfield Publishing Company.

Hunnicutt, D., Deming, A., & Baun, B. (1998). Health Promotion Sourcebook for Small Business. Wellness Councils of America.

McGinnis, J.M. (1992). Worksite Health Promotion Activities: Summary Report. U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion.

Chapman, L.S. (1996). Program Evaluation: A Key to Wellness Program Survival.

McKenzie, J., & Smeltzer, J. (1997). Planning, Implementing, and Evaluating Health Promotion Programs: A Primer (2nd ed.). Allyn and Bacon.

Schalock, R. (1995). Outcome-Based Evaluation. Plenum Press.

Parsons, B.A. (1999, November). Making Logic Models More Systemic: An Activity. Paper prepared for the 1999 American Evaluation Association annual meeting, Orlando, FL.

SLIDE 43

resources

Health Promotion: Sourcebook for Small Businesses. Wellness Councils of America and Canada. Call (402) 827-3590 to order.

Schmitz, C., & Parsons, B.A. (1999). Everything You Ever Wanted to Know about Logic Models But Were Afraid to Ask. Paper funded by the W.K. Kellogg Foundation (WKKF) under a contract to InSites, a Colorado-based non-profit 501(c)(3) organization.

Bamberger, M., Rugh, J., & Mabry, L. (2005). RealWorld Evaluation: Conducting Evaluations With Budget, Time, Data and Political Constraints. Newbury Park, CA: SAGE Publications, Inc.

The Joint Committee on Standards for Educational Evaluation (1994). The Program Evaluation Standards. Thousand Oaks, CA: Sage Publications, Inc. http://www.wmich.edu/evalctr/jc/

University of Wisconsin Extension. (June 2005). Documenting Outcomes in Tobacco Control Programs. University of Wisconsin Cooperative Extension, Program Development and Evaluation, Madison, WI. Retrieved July 9, 2006 from http://www.uwex.edu/ces/pdande/

Centers for Disease Control and Prevention. (1999). Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11). Retrieved July 9, 2006 from http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm#fig1

SLIDE 44

Contact Me

Sara J. Corwin, MPH, PhD Clinical Associate Professor Department of Health Promotion, Education and Behavior Arnold School of Public Health University of South Carolina 800 Sumter Street, #216E Columbia, SC 29208 803.777.3636 (voice) corwins@sc.edu

SLIDE 45

Hey … I feel better already!