
  1. program evaluation: you don’t need a PhD to do it! Sara Corwin, MPH, PhD, University of South Carolina, Arnold School of Public Health, Department of Health Promotion, Education and Behavior

  2. 2 “It’s really not that effective, but it’s easy to store.”

  3. 3 session objectives At the end of the session, participants will be able to: 1. define common program evaluation terms 2. explain the use of logic models in developing program evaluation strategies 3. clarify the differences between process, impact, and outcome levels of evaluation 4. describe frequently used evaluation tools for each level of program evaluation 5. provide suggestions for low-cost methods of program evaluation that can be implemented in their setting 6. identify ways to use evaluation information (results) to improve their health promotion, wellness, and disease management programs.

  4. 4 reasons why you can do it! • gazillions of resources out there • you have resources at your fingertips • others have done it before • it isn’t rocket science, just being systematic & organized • you are probably already doing it • it’s nothing more than gathering information & making decisions

  5. 5 why evaluate?

  6. 6 before planning an evaluation a wellness program should have: • a clear mission/vision • a comprehensive, well-planned approach • reasonable, achievable goals & objectives guided by a program LOGIC MODEL • resources (including personnel!) that match the plan • a realistic timeline • folks with a sense of humor!

  7. 7 a logic what? a LOGIC MODEL • describes the main elements of an intervention & how they all work together to prevent/reduce or mediate a health issue • displayed as a flow chart, map, or table • outlines sequential steps that lead to desired outcomes • can contain inputs, throughputs, outputs • shows linkages (a toy example follows below)
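To make the "shows linkages" point concrete, here is a minimal sketch of a logic model captured as a plain data structure. Python is used only for illustration, and the program elements (a hypothetical worksite walking program) are invented examples, not content from the presentation.

```python
# A minimal sketch of a logic model as a plain data structure.
# The program elements (a hypothetical worksite walking program)
# are illustrative, not from the presentation.
logic_model = {
    "inputs":     ["wellness staff", "budget", "walking trails"],
    "activities": ["lunchtime walking groups", "pedometer challenge"],
    "outputs":    ["number of sessions held", "participation rate"],
    "outcomes": {
        "short_term": ["increased knowledge of activity guidelines"],
        "mid_term":   ["increased weekly minutes of walking"],
        "long_term":  ["reduced cardiovascular risk factors"],
    },
}

# Print the model as the left-to-right flow the slide describes.
for stage in ("inputs", "activities", "outputs"):
    print(f"{stage}: {', '.join(logic_model[stage])}")
for horizon, items in logic_model["outcomes"].items():
    print(f"outcomes ({horizon}): {', '.join(items)}")
```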

  8. 8 a typical logic model [logic model diagram; slide footer: 31st Annual National Wellness Conference, July 20, 2006]

  9. 9 a ‘real’ logic model [flow diagram, approximately: in order to achieve improved employee health, we will modify identified risk & protective factors by conducting a defined set of activities in the worksite, resulting in change in employee behavior(s) & orgz’l outcomes]

  10. 10 W.K. Kellogg Foundation Logic Model Development Guide: http://www.wkkf.org/pubs/tools/evaluation/pub3669.pdf

  11. 11 awesome! FREE online logic model design course: University of Wisconsin – Extension http://www.uwex.edu/ces/lmcourse/

  12. 12 why does it matter? • useful in building consensus among stakeholders • clarifies “etiology” of the health condition & related behavioral influences • helps identify programmatic activities & evaluation strategies ** • identifies priorities ** • sometimes you are required to have one • first step in the planning process

  13. 13 review: program planning understand community & engage participants → assess needs → set goals & objectives → develop intervention → implement → evaluate. McKenzie J. and Smeltzer J. (1997). Planning, Implementing, and Evaluating Health Promotion Programs: A Primer, 2nd Edition. Allyn and Bacon.

  14. 14 getting on with it … what is evaluation? • determining the value of what you’ve done (WELCOA, 2005) • assigning value AND making judgments about: • Merit (i.e., quality) • Worth (i.e., cost-effectiveness) • Significance (i.e., importance) (CDC, 2005)

  15. 15 program evaluation levels
  • process (formative): “are we doing what we said we would?”; how much, for whom, when, by whom?; QAR/CQI/QI; must occur before outcomes (integrity); qualitative methods
  • impact (summative): effectiveness measures; change in K, A, B/skills, org’l changes; quantitative methods
  • outcome (summative): morbidity, mortality, employee outcomes; improved health, wellness for employees; improved Q of L for employees; quantitative methods

  16. 16 according to whom? what? “ acceptable standards of quality”: • utility – usefulness? • feasibility – can you really do it? • propriety – legal, ethical? • accuracy – valid, reliable? • your organization’s principles, priorities • others? The Joint Committee on Standards for Educational Evaluation (1994). The Program Evaluation Standards. Thousand Oaks, CA: Sage Publications, Inc. http://www.wmich.edu/evalctr/jc/

  17. 17 steps… University of Wisconsin Extension. (June 2005). Documenting outcomes in tobacco control programs. University of Wisconsin Cooperative Extension, Program Development and Evaluation, Madison, WI. Retrieved online July 9, 2006 at http://www.uwex.edu/ces/pdande/

  18. 18 or these steps … Centers for Disease Control and Prevention. Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11). Retrieved online July 9, 2006 at http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm#fig1

  19. 19 3. collect information (“data”) 2 types: • primary – data that you collect • secondary – data that already exists you’ll need: • baseline data – “before” information • follow-up data – “after” information • comparison group – non-participants

  20. 20 an academic moment … standard design notation (O = observation/measurement, X = the program/intervention, R = random assignment, NR = non-random assignment):
  • one-group pretest-posttest: O1 X O2
  • non-equivalent comparison group: NR O1 X O2 / NR O1 O2
  • randomized pretest-posttest control group: R O1 X O2 / R O1 O2
  • randomized time series: R O1 O2 X O3 O4 / R O1 O2 O3 O4
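These designs boil down to comparing "before" and "after" measurements, with and without the program. As a rough illustration, the sketch below computes mean change in a participant group versus a non-participant comparison group; the scores are made-up numbers, and the simple difference of changes is only a first approximation of program effect, not a full statistical analysis.

```python
# A minimal sketch of the O1 X O2 / comparison-group idea: compare mean
# change (follow-up minus baseline) in participants vs. non-participants.
# The scores below are made-up illustration data, not study results.
intervention = [(3, 7), (4, 6), (2, 6), (5, 8)]   # (O1 baseline, O2 follow-up)
comparison   = [(3, 4), (4, 4), (2, 3), (5, 5)]   # no X (no program exposure)

def mean_change(pairs):
    """Average follow-up minus baseline score across respondents."""
    return sum(post - pre for pre, post in pairs) / len(pairs)

program_effect = mean_change(intervention) - mean_change(comparison)
print(f"change with program:    {mean_change(intervention):+.2f}")
print(f"change without program: {mean_change(comparison):+.2f}")
print(f"difference (rough program effect): {program_effect:+.2f}")
```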

  21. 21 data sources • multiple stakeholder perspectives must be assessed! • who are key & potential collaborating program stakeholders that have information we need/want? • what do we want to know? (must be clear) • develop a data collection plan utilizing formal & informal methods

  22. 22 primary data collection type of information (data) collected: • qualitative • quantitative collect information (data) from: • individuals • groups

  23. 23 qualitative vs. quantitative
  • qualitative: text based; narrow, deep information; planning time “light”; analysis time “heavy”; no statistical tests; less generalizable; growing acceptance
  • quantitative: numbers based; broad, shallow information; planning time “heavy”; analysis time “light”; statistical tests apply; more generalizable; well accepted

  24. 24 common individual level methods • “single step,” “cross sectional” surveys (online, paper-pencil, email) • interviews: • mail • email • web • face-to-face • phone • Examples of what you have done?

  25. 25 common group level methods • focus group • nominal group • community forum • participant observation

  26. 26 how to decide? depends upon: • type of information you’ll need • respondent characteristics • $$$$$ available • time frame • staff available & skill sets • work load issues

  27. 27 what to evaluate? tied to your plan’s goals & objectives ! • K, A, B • “risk factors” • employee satisfaction with programs, services, incentives • participation rates • costs • organizational culture • social, environmental context

  28. 28 tools (“instruments”) • don’t re-invent the wheel! • adapt existing instruments/questions • keep your purpose/focus in mind • know when to use open- & close-ended items • watch for biased, leading questions • know your respondents (culturally sensitive, literacy levels, etc.)

  29. 29 general data collection tips • before you start: obtain human subjects protections (IRB, HIPAA, etc.) • get input from others • always pilot test (do a trial run) • use more than 1 method • combine qualitative & quantitative approaches

  30. 30 after collecting data, what next? 4. analyze (& interpret) quantitative data • keep focused on the purpose – what questions are you answering? • keep your audience in mind – who will get the information & in what format? • keep it simple • summary information, not sophisticated statistics

  31. 31 quantitative analysis • compile the data – get it all together in 1 place • review surveys for incomplete or missing answers (to use or not to use?) • hand tally or enter answers into a computer? • Microsoft Excel, EpiInfo (free from CDC http://www.cdc.gov/epo/epi/epiinfo.htm) • double check data entry or hand counts
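The "double check data entry" tip is often implemented as double entry: two people key the same surveys independently and the two files are compared. Below is a minimal Python sketch of that comparison; the file names and CSV layout are hypothetical.

```python
# A minimal sketch of double data entry checking: two people key the same
# surveys into separate CSV files, then we flag any cells that disagree.
# File names and column layout are hypothetical.
import csv

def load(path):
    with open(path, newline="") as f:
        return list(csv.reader(f))

entry_a = load("survey_entry_a.csv")
entry_b = load("survey_entry_b.csv")

for row_num, (row_a, row_b) in enumerate(zip(entry_a, entry_b), start=1):
    for col_num, (a, b) in enumerate(zip(row_a, row_b), start=1):
        if a != b:
            print(f"row {row_num}, column {col_num}: '{a}' vs '{b}' -- recheck the paper form")
```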

  32. 32 quantitative analysis • type of data “out” depends on type of questions you ask • levels of measure determine how you “run” the data entered in the computer • typically use descriptive statistics, i.e., count (frequencies), percents (%), averages (mean), SD, range
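As an illustration of those descriptive statistics, the short Python sketch below computes frequencies, percents, mean, SD, and range for one survey item using only the standard library; the 1-5 satisfaction ratings are made-up data.

```python
# A minimal sketch of the descriptive statistics the slide lists:
# counts (frequencies), percents, mean, SD, and range for one survey item.
# The responses are made-up 1-5 satisfaction ratings.
from collections import Counter
from statistics import mean, stdev

responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 5]

freq = Counter(responses)
for rating in sorted(freq):
    pct = 100 * freq[rating] / len(responses)
    print(f"rating {rating}: n={freq[rating]} ({pct:.0f}%)")

print(f"mean = {mean(responses):.2f}, SD = {stdev(responses):.2f}, "
      f"range = {min(responses)}-{max(responses)}")
```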

  33. 33 same for …qual data • keep focused on the purpose – what questions are you answering? • keep your audience in mind – who will get the information & in what format? • usually conduct a “debrief” after the session to discuss what occurred & identify initial results

  34. 34 qualitative analysis • type notes or transcribe audio tapes verbatim in Microsoft Word • print out documents to be sure names & identifying information removed • hand “code” or “import” document files into another computer program • numerous qual software programs out there
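For the hand-coding step, even a very small script can tally how often codebook phrases appear across transcripts, which mimics the first pass of qualitative coding. In this sketch the codebook terms and transcript file names are hypothetical, and real coding of course involves human judgment, not just keyword counts.

```python
# A minimal sketch of a first-pass coding tally: count how often each
# code from a simple codebook appears across transcript files.
# The codebook phrases and file names are hypothetical.
codebook = {
    "barriers":   ["no time", "too busy", "schedule"],
    "motivators": ["incentive", "coworker", "team"],
}
transcripts = ["focus_group_1.txt", "focus_group_2.txt"]

counts = {code: 0 for code in codebook}
for path in transcripts:
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    for code, phrases in codebook.items():
        counts[code] += sum(text.count(p) for p in phrases)

for code, n in counts.items():
    print(f"{code}: {n} mentions")
```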

  35. 35 reporting results • decide on format (report, slides, fact sheet, press release, presentation) • include sections with headings: • executive summary (write last) • background/introduction • methods/procedures • results/findings • conclusions/recommendations • use your initial questions to frame your results (address the purpose)
