

  1. EVALUATING NONPROFIT ORGANIZATIONS. Harry P. Hatry and Mary K. Winkler, Urban Institute. July 12, 2012. For the Business Civic Leadership Center, U.S. Chamber of Commerce.

  2. Presenters: Harry P. Hatry, Senior Fellow, Urban Institute; Mary K. Winkler, Senior Research Associate, Center on Nonprofits and Philanthropy, Urban Institute.

  3. Agenda
     1. What corporate sponsors should do relating to evaluation before entering into an agreement with a grantee.
     2. What corporate sponsors should do and expect relating to evaluation during the grant and afterward.
     3. Questions and discussion.

  4. Advance Considerations
     - Evaluation goals and expectations
       - Monitoring for accountability versus shared learning
       - How information will be used
     - Structure of the funder-grantee relationship
       - Collaborative or autonomous
     - Types and level of support to offer

  5. Emerging Approaches Offer Guidance
     - "Charting Impact" (Independent Sector)
     - "Charity Navigator 2.0"
     - Hewlett Foundation

  6. Charting Impact (Independent Sector). A standardized framework for describing the work of a nonprofit, based on five questions:
     1. What is your organization aiming to accomplish?
     2. What are your strategies for making this happen?
     3. What are your organizational capabilities for doing this?
     4. How will your organization know if you are making progress?
     5. What have and haven't you accomplished so far?

  7. Charity Navigator 2.0: Seven Questions to Ask Charities Before Donating
     1. What is the charity's commitment to reporting results?
     2. How does the charity demonstrate the demand for its services?
     3. Does the charity report its activities (what it does)?
     4. Does the charity report its outputs (immediate results)?
     5. Does the charity report its outcomes (medium- and longer-term results)?
     6. What is the quality of evidence for reported results?
     7. Does the charity adjust and improve in light of its results?
     These questions were developed in partnership with New Philanthropy Capital and Keystone Accountability. http://www.charitynavigator.org/index.cfm?bay=content.view&cpid=1209

  8. Hewlett Foundation Evaluation Chart

     | Plans and intended impact | Indicator (what will you measure?) | Baseline (what is the status quo?) | Target (where do you hope it will be at the end of the grant, or whatever timeline is appropriate?) | Result, by year (what actually happened? to fill out in a future report; leave blank during the proposal) |
     |---|---|---|---|---|
     | Outcome: increased placement in postsecondary education | # of participants who enroll in (a) community college, (b) apprenticeships, or (c) other postsecondary programs | 40 | 90 | |
     | Outcome: improved employment | # of participants who are employed within 3 months of program completion | 20 | 50 | |
     | Activity: identify and enroll participants in adult education services | # of participants enrolled in adult education services | 80 | 160 | |
     | Activity: hire additional instructors | # of new instructors hired | n/a | 4 | |
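
A chart like this maps naturally onto a small data structure. Below is a minimal sketch in Python; the `GrantIndicator` class, its field names, and the `progress` helper are illustrative assumptions, not part of the Hewlett template:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class GrantIndicator:
    """One row of a Hewlett-style evaluation chart (hypothetical schema)."""
    category: str                 # "outcome" or "activity"
    indicator: str                # what will be measured
    baseline: Optional[int]       # status quo; None if not applicable
    target: int                   # hoped-for value at the end of the grant
    result: Optional[int] = None  # filled in during reporting

    def progress(self) -> Optional[float]:
        """Fraction of the baseline-to-target gap achieved so far."""
        if self.result is None or self.baseline is None:
            return None
        return (self.result - self.baseline) / (self.target - self.baseline)


chart = [
    GrantIndicator("outcome", "participants enrolling in postsecondary education", 40, 90),
    GrantIndicator("outcome", "participants employed within 3 months of completion", 20, 50),
    GrantIndicator("activity", "participants enrolled in adult education services", 80, 160),
    GrantIndicator("activity", "new instructors hired", None, 4),
]
```

Keeping the rows in a structure like this lets a funder fill in `result` at reporting time and compute progress uniformly across grantees.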

  9. Evaluation Specifics: The Nitty-Gritty of Evaluation for Funders

  10. Two Major Evaluation Approaches
     - An in-depth evaluation, or
     - An "evaluation" based on data from the grantee's ongoing performance measurement process.

  11. Comparison of the Two Evaluation Approaches

     |                           | In-depth evaluations                      | Performance measurement                        |
     |---------------------------|-------------------------------------------|------------------------------------------------|
     | 1. Depth of information   | Can identify reasons for poor performance | Only tells "the score," not why                |
     | 2. Timeliness of findings | Can require months, even years            | Relatively quick                               |
     | 3. Skills required        | Special skills needed                     | Considerably less needed                       |
     | 4. Quality of analysis    | Strong                                    | Usually limited and weak                       |
     | 5. Cost                   | High                                      | Low                                            |
     | 6. Utility                | Major program decisions                   | "Minor" program decisions; program improvement |

  12. What Kinds of Information Should Be Provided Initially to Funders? Ask the party responsible for the evaluation to provide, before the start of the evaluation:
     1. A statement of the purpose/mission/objectives of the program
     2. A "logic model" ("outcome sequence chart")
     3. The specific indicators that will be used to evaluate the program's success, with the source of the data for each indicator
     4. A description of the proposed evaluation process

  13. Example of a Logic Model (smoking-cessation program):
     Sessions publicized (# announcements) -> Smokers enroll (# enrolled) -> Complete the program (# completing) -> Stopped smoking (# who stopped smoking) -> Improved long-term health (# with smoking-related illnesses)
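
Since every box in the chain carries a count, the logic model can be read as a funnel. A minimal sketch, where the stage names come from the slide but the counts are hypothetical:

```python
# Treat the smoking-cessation logic model as a funnel: each stage has a
# count, and the ratio of adjacent counts shows where clients drop off.
stages = [
    ("announcements made", 200),   # hypothetical counts
    ("smokers enrolled", 120),
    ("completing the program", 80),
    ("stopped smoking", 35),
]

for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    print(f"{prev_name} -> {name}: {n / prev_n:.0%}")
```

A low ratio at one link (say, enrollment to completion) points the evaluator at the stage of the program worth investigating first.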

  14. Example: Health Risk Reduction Program. Description: to promote health and quality of life through programs that help clients reduce health risk behaviors through health education and preventive programs. Examples include health promotion, STD/HIV prevention, stress management, substance abuse prevention and treatment (alcohol and drug), smoking cessation, weight loss, wellness, and nutrition programs.
     Intermediate outcomes: individuals enroll in the program -> clients attend the program -> clients complete the program -> clients demonstrate increased knowledge about risk behaviors and about how to reduce them -> clients demonstrate an improved attitude toward changing behavior.
     End outcomes: clients experience decreased health risk behaviors -> clients experience improved health.

  15. Example of Outcome Indicators: Emergency Response
     - Number of hours/days before the service was provided (quick response).
     - Percent of persons affected who reported having had major difficulties communicating with service providers.
     - Percent of persons affected reporting they sought the needed help and received that help.
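
Each of these indicators is a simple aggregate over per-person case records. A minimal sketch, where the record fields and the three sample records are hypothetical:

```python
# Compute the three emergency-response indicators from per-person records.
# Field names and sample data are illustrative assumptions.
records = [
    {"hours_to_service": 6,  "communication_difficulty": False,
     "sought_help": True,  "received_help": True},
    {"hours_to_service": 30, "communication_difficulty": True,
     "sought_help": True,  "received_help": False},
    {"hours_to_service": 12, "communication_difficulty": False,
     "sought_help": False, "received_help": False},
]

n = len(records)
avg_hours = sum(r["hours_to_service"] for r in records) / n
pct_difficulty = sum(r["communication_difficulty"] for r in records) / n
pct_sought_and_received = sum(
    r["sought_help"] and r["received_help"] for r in records) / n

print(f"Average hours before service was provided: {avg_hours:.1f}")
print(f"Reported major communication difficulties: {pct_difficulty:.0%}")
print(f"Sought the needed help and received it: {pct_sought_and_received:.0%}")
```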

  16. Data Analysis
     - Often badly neglected in performance measurement systems.
     - Analysis is usually considerably better in in-depth evaluations (of course, you are probably paying for the in-depth evaluation).
     - The next chart provides an example.

  17. Which Hospital Would You Choose?

     |                             | Mercy Hospital | Apollo Hospital |
     |-----------------------------|----------------|-----------------|
     | Surgery patients (all)      | 2,100          | 800             |
     | Deaths (all)                | 63             | 16              |
     | Death rate (all)            | 3%             | 2%              |
     | Patients in good condition  | 600            | 600             |
     | Deaths (good condition)     | 6              | 8               |
     | Death rate (good condition) | 1%             | 1.3%            |
     | Patients in poor condition  | 1,500          | 200             |
     | Deaths (poor condition)     | 57             | 8               |
     | Death rate (poor condition) | 3.8%           | 4%              |

     Apollo has the lower overall death rate, yet Mercy has the lower rate for both good-condition and poor-condition patients; disaggregating the data reverses the answer (Simpson's paradox).
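
The reversal is easy to verify by recomputing the rates from the table. A minimal sketch using the slide's numbers:

```python
# Recompute the hospital death rates, overall and by patient condition,
# to show why aggregated figures can mislead.
deaths = {
    ("Mercy",  "good"): (6,  600),   # (deaths, surgery patients)
    ("Mercy",  "poor"): (57, 1500),
    ("Apollo", "good"): (8,  600),
    ("Apollo", "poor"): (8,  200),
}

for hospital in ("Mercy", "Apollo"):
    total_deaths = sum(d for (h, _), (d, n) in deaths.items() if h == hospital)
    total_patients = sum(n for (h, _), (d, n) in deaths.items() if h == hospital)
    print(f"{hospital}: overall {total_deaths / total_patients:.1%}")
    for condition in ("good", "poor"):
        d, n = deaths[(hospital, condition)]
        print(f"  {condition} condition: {d / n:.1%}")

# Apollo wins on the overall rate (2.0% vs 3.0%), yet Mercy wins in BOTH
# condition groups (1.0% vs 1.3% good, 3.8% vs 4.0% poor) -- Simpson's
# paradox, driven by Mercy taking far more poor-condition patients.
```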

  18. Common Evaluation Difficulties
     - Some outcomes are very hard to measure (e.g., prevention success); don't overdo your expectations.
     - Some key outcome indicators require obtaining information from customers after the service has ended. (Grantees are reluctant to undertake such surveys.)
     - Grantees usually have very limited capacity to do evaluations.
     - Determining what caused the outcomes can be very complicated.

  19. Cost of Evaluations
     - In-depth program evaluations undertaken by contractors can be expensive, likely costing $50,000 and up. Use them primarily for major, large programs.
     - Evaluations driven by performance measurement are likely to be considerably less costly, though costs rise if new outcome data need to be obtained, such as feedback from former customers.
     - When surveys of the program's former customers are needed, the process can be low cost if grantee staff can do the follow-ups. Cost will then depend on the number of persons to be followed up and the difficulty of contacting them.
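
The last bullet reduces to back-of-the-envelope arithmetic. A minimal sketch with entirely hypothetical unit costs, just to show the shape of the estimate:

```python
# Rough cost estimate for a customer follow-up survey done by grantee
# staff. Every number here is a hypothetical assumption.
n_customers = 300          # former customers to follow up
attempts_per_person = 2.5  # harder-to-reach people need more attempts
minutes_per_attempt = 15   # dialing, interviewing, recording answers
staff_hourly_cost = 25.0   # loaded hourly cost of grantee staff ($)

staff_hours = n_customers * attempts_per_person * minutes_per_attempt / 60
estimated_cost = staff_hours * staff_hourly_cost
print(f"{staff_hours:.0f} staff hours, about ${estimated_cost:,.0f}")
# -> 188 staff hours, about $4,688: far below a contracted in-depth
#    evaluation, consistent with the slide's point.
```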

  20. Funder Responsibilities
     1. Encourage grantees to focus on results.
     2. Provide funds for added evaluation costs.
     3. Encourage getting input from the program's customers when deciding which outcomes to evaluate.
     4. Reward grantees whose programs have been effective.
     5. Help build grantee capacity to maintain, and use, their own evaluation process to improve their services.

  21. Resources
     - The Outcome Indicators Project: http://www.urban.org/center/cnp/Projects/outcomeindicators.cfm
     - "Leap of Reason": http://www.vppartners.org/leapofreason/overview
     - "Outcome Management for Nonprofit Organizations" series: www.urban.org
     - PerformWell: www.performwell.org
     - Hatry, "Performance Measurement: Getting Results," second edition, 2006.
     - "Evaluation Matters: Lessons from Youth-Serving Organizations": http://www.urban.org/publications/411961.html
     - "Designing Evaluations: 2012 Revision," U.S. Government Accountability Office, January 2012.
     - Wholey, Hatry, and Newcomer, eds., "Handbook of Practical Program Evaluation," Jossey-Bass, third edition, 2010.

  22. For Further Information: hhatry@urban.org and mwinkler@urban.org
