SLIDE 1

EVALUATING NONPROFIT ORGANIZATIONS

Harry P. Hatry and Mary K. Winkler, Urban Institute
July 12, 2012
For Business Civic Leadership Center, U.S. Chamber of Commerce

SLIDE 2

Presenters

Harry P. Hatry, Senior Fellow, Urban Institute
Mary K. Winkler, Senior Research Associate, Center for Nonprofits and Philanthropy, Urban Institute

SLIDE 3

Agenda

1. What corporate sponsors should do relating to evaluation before they enter into an agreement with a grantee.

2. What corporate sponsors should do and expect relating to evaluation during and after the grant.

3. Questions and discussion.

SLIDE 4

Advance Considerations

• Evaluation Goals and Expectations
  • Monitoring for accountability versus shared learning
  • How information will be used

• Structure of the Funder-Grantee Relationship
  • Collaborative or autonomous
  • Types and level of support to offer

SLIDE 5

Emerging Approaches Offer Guidance

• "Charting Impact" (Independent Sector)
• "Charity Navigator 2.0"
• Hewlett Foundation

SLIDE 6

Charting Impact (Independent Sector)

A standardized framework for describing the work of a nonprofit based on five questions:

1. What is your organization aiming to accomplish?
2. What are your strategies for making this happen?
3. What are your organizational capabilities for doing this?
4. How will your organization know if you are making progress?
5. What have and haven’t you accomplished so far?

SLIDE 7

Charity Navigator 2.0: Seven Questions to Ask Charities Before Donating

1. What is the charity’s commitment to reporting results?
2. How does the charity demonstrate the demand for services?
3. Does the charity report its activities (what it does)?
4. Does the charity report its outputs (immediate results)?
5. Does the charity report its outcomes (medium- and longer-term results)?
6. What is the quality of evidence for reported results?
7. Does the charity adjust and improve in light of its results?

These questions were developed in partnership with New Philanthropy Capital and Keystone Accountability. http://www.charitynavigator.org/index.cfm?bay=content.view&cpid=1209

SLIDE 8

Hewlett Foundation Evaluation Chart

Each row answers: What are your plans and their intended impact?

Columns:
• Indicator: What will you measure?
• Baseline: What is the status quo?
• Target: Where do you hope it will be at the end of the grant? (Or whatever timeline is appropriate.)
• Result (year): What actually happened? (Filled out in a future report; left blank during the proposal.)

OUTCOMES
• Increased placement in postsecondary education: # of participants who enroll in (a) community college, (b) apprenticeships, or (c) other postsecondary programs. Baseline 40; target 90.
• Improved employment: # of participants who are employed within 3 months of program completion. Baseline 20; target 50.

ACTIVITIES
• Identify and enroll participants in adult education services: # of participants enrolled in adult education services. Baseline 80; target 160.
• Hire additional instructors: # of new instructors hired. Baseline n/a; target 4.
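As a rough illustration only (not part of the Hewlett template): once results start coming in, a chart like this can be tracked as simple structured data. The indicator names below mirror the rows above, while the one filled-in result and the progress calculation are hypothetical.

```python
# Illustrative sketch only: record each indicator's baseline, target, and
# eventual result, and report progress toward the targeted gain.
indicators = [
    # (indicator, baseline, target, result) -- result stays None until a later report
    ("# enrolled in postsecondary programs", 40, 90, None),
    ("# employed within 3 months of completion", 20, 50, 35),   # hypothetical result
    ("# enrolled in adult education services", 80, 160, None),
    ("# new instructors hired", 0, 4, None),                     # baseline "n/a" treated as 0
]

for name, baseline, target, result in indicators:
    if result is None:
        print(f"{name}: baseline {baseline}, target {target}, result pending")
    else:
        gain, goal = result - baseline, target - baseline
        print(f"{name}: achieved {gain} of the targeted gain of {goal} ({gain / goal:.0%})")
```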

SLIDE 9

Evaluation Specifics

The Nitty-Gritty of Evaluation for Funders

SLIDE 10

Two Major Evaluation Approaches

  • An in-depth evaluation, or
  • An "evaluation" based on data from the grantee’s ongoing performance measurement process.

SLIDE 11

Comparison of the Two Evaluation Approaches

1. Depth of information: in-depth evaluations can identify reasons for poor performance; performance measurement only tells "the score," not why.

2. Timeliness of findings: in-depth evaluations can require months and even years; performance measurement findings are available relatively quickly.

3. Skills required: in-depth evaluations need special skills; performance measurement needs considerably less.

4. Quality of analysis: strong in in-depth evaluations; usually limited and weak in performance measurement.

5. Cost: high for in-depth evaluations; low for performance measurement.

6. Utility: in-depth evaluations inform major program decisions; performance measurement informs "minor" program decisions and program improvement.

SLIDE 12

What Kinds of Information Should Be Provided Initially to Funders?

Ask the party responsible for the evaluation to provide, before the start of the evaluation:

1. Statement of purpose/mission/objective of the program
2. A "logic model" ("outcome sequence chart")
3. Specific indicators that will be used to evaluate the program’s success, with the source of the data for each indicator
4. A description of its proposed evaluation process

SLIDE 13

Example of a Logic Model

Sessions publicized (# announcements) → Smokers enroll (# enrolled) → Complete the program (# completing) → Stopped smoking (# who stopped smoking) → Improved long-term health (# with smoking-related illnesses)
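As a rough illustration only (not from the slides): the chain above can be captured as an ordered list of stages, each paired with its indicator, so a count can be reported at every step. All numbers below are hypothetical.

```python
# Rough sketch: the smoking-cessation logic model as ordered (stage, indicator,
# count-for-the-period) entries; the counts are made up for illustration.
smoking_cessation_model = [
    ("Sessions publicized", "# announcements", 12),
    ("Smokers enroll", "# enrolled", 150),
    ("Complete the program", "# completing", 90),
    ("Stopped smoking", "# who stopped smoking", 40),
    ("Improved long-term health", "# with smoking-related illnesses", 8),
]

for stage, indicator, count in smoking_cessation_model:
    print(f"{stage}: {indicator} = {count}")
```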

SLIDE 14

Example: Health Risk Reduction Program


Health Risk Reduction Program Description To promote health and quality of life through programs that help clients reduce health risk behaviors through health education and preventive programs. Examples include health promotion, STD-HIV prevention, stress management, substance abuse prevention and treatment (alcohol and drug), smoking cessation, weight loss, wellness, and nutrition programs.

Outcome sequence (intermediate outcomes leading to end outcomes):

Individuals enroll in program → Clients attend program → Clients complete program → Clients demonstrate increased knowledge about risk behaviors → Clients demonstrate increased knowledge about how to reduce risk behaviors → Clients demonstrate improved attitude towards changing behavior → Clients experience decreased health risk behaviors → Clients experience improved health (end outcome)

SLIDE 15

Example of Outcome Indicators: Emergency Response


• Number of hours/days before the service was provided (quick response).

• Percent of persons affected that reported having had major difficulties communicating with service providers.

• Percent of persons affected reporting they sought the needed help and received that help.

SLIDE 16

Data Analysis

• Often badly neglected in performance measurement systems.

• Usually analysis is considerably better in in-depth evaluations. (Of course, you are probably paying for the in-depth evaluation.)

• The next chart provides an example.

SLIDE 17

Which Hospital Would You Choose?

                              MERCY HOSPITAL    APOLLO HOSPITAL
Surgery patients                       2,100                800
Deaths                                    63                 16
Death rate                                3%                 2%

Patients in good condition               600                600
  Deaths                                   6                  8
  Death rate                              1%               1.3%

Patients in poor condition             1,500                200
  Deaths                                  57                  8
  Death rate                             3.8%                 4%
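The chart shows why analysis matters: Apollo has the lower overall death rate (2% vs. 3%), yet Mercy has the lower rate for both good-condition and poor-condition patients; the overall figures flip only because Mercy treats far more poor-condition (higher-risk) patients. This is an instance of Simpson's paradox. Below is a small check of that arithmetic using the numbers from the table; the code itself is just an illustrative sketch, not part of the presentation.

```python
# Illustrative check of the hospital chart: (deaths, patients) by hospital and
# by the patients' condition on admission, taken from the table above.
data = {
    "Mercy":  {"good condition": (6, 600), "poor condition": (57, 1500)},
    "Apollo": {"good condition": (8, 600), "poor condition": (8, 200)},
}

for hospital, groups in data.items():
    deaths = sum(d for d, _ in groups.values())
    patients = sum(n for _, n in groups.values())
    print(f"{hospital}: overall death rate {deaths / patients:.1%}")
    for condition, (d, n) in groups.items():
        print(f"  {condition}: {d / n:.1%}")

# Apollo's overall rate is lower (2.0% vs 3.0%), yet Mercy's rate is lower
# within each condition group (1.0% vs 1.3%, and 3.8% vs 4.0%), because Mercy
# treats many more poor-condition patients.
```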

SLIDE 18

Common Evaluation Difficulties


• Some outcomes are very hard to measure, e.g., prevention success. Don’t overdo your expectations.

• Some key outcome indicators require obtaining information from customers after the service to the customer has ended. (Grantees are reluctant to undertake such surveys.)

• Grantees usually have very limited capacity to do evaluations.

• Determining what caused the outcomes can be very complicated.

SLIDE 19

Cost of Evaluations


• In-depth program evaluations undertaken by contractors can be expensive and will likely cost $50,000 and up. Use them primarily for major, large programs.

• Evaluations driven by performance measurement are likely to be considerably less costly, though cost can rise if new outcome data needs to be obtained, such as feedback from former customers.

• When surveys of the program’s former customers are needed, the process can be low cost if grantee staff can be used to do the follow-ups. Cost will then depend on the number of persons to be followed up and the difficulty of contacting them.

SLIDE 20

Funder Responsibilities


1. Encourage grantees to focus on results.
2. Provide funds for added evaluation costs.
3. Encourage getting input from customers when deciding on the outcomes to be evaluated by the program.
4. Reward grantees whose programs have been effective.
5. Help build grantee capacity to maintain, and use, their own evaluation process to improve their services.
SLIDE 21

Resources


• The Outcome Indicators Project: http://www.urban.org/center/cnp/Projects/outcomeindicators.cfm

• "Leap of Reason": http://www.vppartners.org/leapofreason/overview

• "Outcome Management for NPOs" series: www.urban.org

• PerformWell: www.performwell.org

• "Performance Measurement: Getting Results," Hatry, second edition, 2006.

• "Evaluation Matters: Lessons from Youth Serving Organizations": http://www.urban.org/publications/411961.html

• "Designing Evaluations: 2012 Revision," U.S. Government Accountability Office, January 2012.

• "Handbook of Practical Program Evaluation," Wholey, Hatry, and Newcomer, editors, Jossey-Bass, third edition, 2010.

SLIDE 22

For Further Information

hhatry@urban.org and mwinkler@urban.org
