Pro Bono Design & Management Accelerator - December 12, 2018 - PowerPoint PPT Presentation



slide-1
SLIDE 1

Pro Bono Design & Management Accelerator

December 12, 2018

1

slide-2
SLIDE 2

2

slide-3
SLIDE 3

Session 3

Impact Evaluation & Data Tracking

3

slide-4
SLIDE 4

Coach introductions

4

Renée J. Schomp, Senior Staff Attorney, Pro Bono Consulting
Peter James, Senior Manager of Research & Evaluation

slide-5
SLIDE 5

Logistics - Nuts and bolts

  • Thank you to DREDF & Ed Roberts Campus!
  • Restrooms
  • Water
  • Lunch
  • Snacks

5

slide-6
SLIDE 6

Mindfulness moment

6

slide-7
SLIDE 7

Pro bono accelerator

  • Objectives
  • 1. Shared pro bono language
  • 2. Inspiration from peers
  • 3. Role of pro bono in larger civil justice movement
  • 4. Lens of equity & inclusion
  • 5. Support on concrete action steps towards organizational change & pro bono design

7

slide-8
SLIDE 8

Pro bono accelerator roadmap

  • 1. October 10: Volunteerism Overview
  • 2. November 14: Recruitment, Cultivation, & Training
  • 3. December 12: Impact Evaluation & Data Tracking
  • 4. January 9: Placement, Supervision, & Technical Assistance
  • 5. February 13: Capstone Project Presentations &

Organizational Change Planning

8

slide-9
SLIDE 9

Today’s agenda ...

  • 1. Morning: the theory and the methods
  • a. Evaluation frameworks
  • b. Quantitative methods
  • c. Qualitative methods
  • 2. Afternoon: the realities
  • a. Doing evaluation in practice
  • b. Working with funders
  • 3. Capstone work time

9

slide-10
SLIDE 10

Grounding pro bono programs in a larger civil justice movement

10

Who benefits from evaluation and data tracking?

slide-11
SLIDE 11

Ground rules

  • Beach ball conversations
  • One diva, one mic
  • Make space, take space
  • Be here now
  • Confidentiality

11

slide-12
SLIDE 12

Introduction to Impact Evaluation and Data Tracking

12

slide-13
SLIDE 13

A simple example

13

slide-14
SLIDE 14

Insights from data

14

Understanding -- usually a bit late, but sometimes really late!

slide-15
SLIDE 15

Why evaluate legal aid?

15

slide-16
SLIDE 16

Why evaluate pro bono programs?

16

slide-17
SLIDE 17

Evaluation Frameworks

17

  • Evaluation frameworks help ensure that your evaluation is answering relevant questions

slide-18
SLIDE 18

Defining your problem

18

  • Define and understand the problem that your org/program is trying to solve
slide-19
SLIDE 19

Theory of change

19

  • Explains how your org/program/new initiative solves the problem and achieves goals
  • Think about comparison and counterfactuals: how does it bring about a change that otherwise would not happen?

Example: We believe that more people will file successfully with the help of a pro bono attorney (than without) because the applications are complex and daunting for most people without legal training.

slide-20
SLIDE 20

Logic models

20

  • A step-by-step diagram showing how your program will achieve its results in the real world

Inputs → Outputs → Outcomes → Impact
Staff time → Pro bono services → Completed applications → Improved status

slide-21
SLIDE 21

Focusing evaluation

21

  • For detailed evaluation, you can typically only focus on parts of the theory or logic model
  • Choose and define your focus area
slide-22
SLIDE 22

Focus area: process evaluation

  • Sometimes we are most interested in evaluating a specific process (rather than the full program)
  • In this context, the process measures become the outcomes/impacts of interest

slide-23
SLIDE 23

Focus area: service evaluation

  • We are often interested in one or more phases of service provision

slide-24
SLIDE 24

Focus area: training evaluation

  • The Kirkpatrick Model: 4 Levels
  • Useful for defining outcomes and impacts
slide-25
SLIDE 25

Creating questions

25

  • Ultimately, all evaluation activity needs to be grounded in a question (or set of questions)

Example: What proportion of clinic participants successfully file?

slide-26
SLIDE 26

Today’s exercise: Part 1

26

Introduction

  • Problem + goal
  • Theory & logic model
  • Evaluation question
slide-27
SLIDE 27

The Evaluation Toolbox: Overview of mixed methods

27

slide-28
SLIDE 28

Introduction to mixed methods

28

  • Actively choose your method(s)
  • Consider using multiple methods
  • Tailor the methods to the question (especially whether you are interested in causation)

slide-29
SLIDE 29

Quantitative methods

29

  • For things that you can count or measure
  • Use to either explore or to confirm theories
  • Examples for legal aid/pro bono:
  • What proportion of clients in our pro bono program achieve a successful outcome?
  • Did our new retention strategy result in increased retention of pro bono volunteers?

slide-30
SLIDE 30

Selecting a measure for your question

30

  • Two broad types of quantitative measure that may be relevant to your evaluation question
  • Categorical variables
  • e.g. win/lose
  • Counts/totals and proportions
  • Numerical variables
  • e.g. # cases handled
  • Descriptive statistics (mean, median etc)
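A minimal sketch of how the two measure types differ in analysis. The data here is entirely hypothetical and the variable names are ours; only the Python standard library is used.

```python
from statistics import mean, median

# Hypothetical case records, for illustration only.
outcomes = ["win", "lose", "win", "win", "lose"]   # categorical variable
cases_handled = [3, 7, 2, 5, 4]                    # numerical variable

# Categorical variable -> counts/totals and proportions
win_rate = outcomes.count("win") / len(outcomes)
print(f"Win rate: {win_rate:.0%}")                 # 60%

# Numerical variable -> descriptive statistics
print(f"Mean cases handled: {mean(cases_handled)}")      # 4.2
print(f"Median cases handled: {median(cases_handled)}")  # 4
```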

slide-31
SLIDE 31

Finding a data source

31

  • Internal administrative data - case management system and data tracking tools
  • External administrative data - matching to records held by courts, firms etc
  • Survey data - data provided by clients/pro bonos
  • Think about:
  • Appropriateness
  • Completeness (often a limitation of surveys)
  • Accuracy
slide-32
SLIDE 32

Observational frameworks

32

Caseload → Post-intervention outcomes unknown
Caseload → Post-intervention outcomes known
slide-33
SLIDE 33

Limitation of observational frameworks

33

Caseload (- help) → Counterfactual unknown
Caseload → Post-intervention outcomes known
slide-34
SLIDE 34

Causal frameworks

34

Caseload (- help) → Outcomes without intervention
Caseload → Outcomes with intervention

slide-35
SLIDE 35

Pragmatic approaches

35

  • Causal framework is the gold standard, but experiments often require professional assistance
  • Observational data is still valuable:
  • Understand what is happening
  • Develop new questions and theories
  • You can also try to approximate causal methods by using comparisons:
  • Comparing groups
  • Change over time
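A minimal sketch of both comparison ideas. All numbers here are made up: the group counts and before/after rates are hypothetical, not from any real program.

```python
# Comparing groups: success rates with and without pro bono help.
helped = {"success": 18, "total": 30}      # hypothetical served group
comparison = {"success": 6, "total": 30}   # hypothetical comparison group

def success_rate(group):
    """Proportion of cases in the group with a successful outcome."""
    return group["success"] / group["total"]

group_difference = success_rate(helped) - success_rate(comparison)
print(f"Difference between groups: {group_difference:.0%}")   # 40%

# Change over time: the same measure before and after a program change.
rate_before, rate_after = 0.40, 0.55       # hypothetical rates
print(f"Change over time: {rate_after - rate_before:+.0%}")   # +15%
```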
slide-36
SLIDE 36

Example of observational study

36

Question: How effective are limited scope services in the housing context?

Methods: Comparison of case outcomes between groups:

  • tenants with no representation
  • tenants with limited scope representation
  • tenants with full scope representation

Source: Jessica Steinberg (2011), In Pursuit of Justice
slide-37
SLIDE 37

Learning from In Pursuit of Justice

37

Outcome measure      Unrepresented   Limited scope   Full scope
Possession           14%             18%             55%
Mean days to move    47              54              97

  • Insight: in this jurisdiction, a lawyer may be needed at each step
  • Return to theory: what have we learned about the problem we are trying to solve?
  • Return to logic model: how might we adapt our program in light of new information?
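The figures in the table above can be worked with directly. The numbers below come from the slide (citing Steinberg 2011); the dictionary layout and the computed gaps relative to the unrepresented group are our own illustration.

```python
# Possession rates and mean days to move, by representation level,
# as reported on the slide from Steinberg (2011).
results = {
    "unrepresented": {"possession": 0.14, "mean_days_to_move": 47},
    "limited scope": {"possession": 0.18, "mean_days_to_move": 54},
    "full scope":    {"possession": 0.55, "mean_days_to_move": 97},
}

baseline = results["unrepresented"]["possession"]
for group, stats in results.items():
    gain = stats["possession"] - baseline
    print(f"{group}: possession {stats['possession']:.0%} "
          f"({gain:+.0%} vs. unrepresented), "
          f"{stats['mean_days_to_move']} mean days to move")
```

Note how small the limited-scope gap is next to the full-scope one, which is the slide's point about a lawyer being needed at each step.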

slide-38
SLIDE 38

Demo of data analysis

38

https://docs.google.com/spreadsheets/d/1RUobI78KlCXhq5_qCaV3KgRcydv6zUvXYgJrR_52Aj0/edit?usp=sharing

slide-39
SLIDE 39

Tech tips: new skills, new tools

39

slide-40
SLIDE 40

Quick brainstorm

40

What is one way that you could use a quantitative method to evaluate your pro bono program?

slide-41
SLIDE 41

Qualitative methods

41

  • To understand the nuance - expectations, experiences, reasons, perceptions
  • Good for why and how questions
  • Inductive: generating ideas and theories
  • Examples for legal aid/pro bono:
  • Individual interviews with clients to explore how comfortable they felt working with their pro bono attorney
  • Focus groups with pro bono attorneys to explore experiences of new mentorship model

slide-42
SLIDE 42

Qualitative methods detail

42

Formats

  • Semi-structured interview
  • Topic guides: sequence of prompts/questions
  • Individual interviews: individualized
  • Focus groups: collective
  • Take notes or use a recorder

Samples:

  • Principle of saturation
  • Professional rule of thumb = 25 participants
  • But think about subgroups
slide-43
SLIDE 43

Qualitative methods: example

43

  • Sarah Sternberg Greene (2016), Race, Class and Access to Civil Justice
  • What explains inaction in response to legal problems? Why does this differ by race?
  • 97 interviews with public housing residents in Cambridge, MA
  • Interviews explored a range of related topics that might influence decision-making on using legal aid (e.g. past experience with justice system and other social institutions)

slide-44
SLIDE 44

Key findings

44

  • Negative experiences/perceptions spill over from criminal justice system
  • View that your treatment in justice system is dependent on money
  • Prior experience w/ public institutions - “ashamed, inadequate, degraded, and confused”
  • Desire for self-sufficiency
  • Racial differences in levels of trust and in level of comfort seeking help from formal systems

slide-45
SLIDE 45

Quotation from study

45

“More money, more justice. I mean it. More money, more justice. It is true. The more money you have for an attorney, whether you are a big case or not, the more justice. If you have more money, they have more time to do the paperwork, investigate, that kind of thing. Oh I can get an attorney, let me tell you. No problem at all. But it won't be one of the good ones.”

slide-46
SLIDE 46

Listening exercise

46

https://www.legalaidsmc.org/videos/

slide-47
SLIDE 47

Quick brainstorm

47

What is one way that you could use a qualitative method to evaluate your pro bono program?

slide-48
SLIDE 48

Observational methods

48

  • Specifically motivated observation
  • Often used in design research (recall Hagan)
  • Best for understanding (inter)actions - often phenomena without a formal record
  • Consider the impact of the observer
  • Examples for legal aid/pro bono:
  • Observing client flow at workshop
  • Observing client interviews
  • Observing court operations & hearings
slide-49
SLIDE 49

Observational methods: example

49

  • Barbara Bezdek (1992), Silence in Court
  • Classic study of Baltimore rent court
  • Observing the functioning of an institution and its exclusionary dynamics

slide-50
SLIDE 50

Quick brainstorm

50

What is one way that you could use an observational method to evaluate your pro bono program?

slide-51
SLIDE 51

Exercise: Part 2

51

Methods section

  • What method(s), why?
  • How are method(s) tailored to question?

slide-52
SLIDE 52

Implementing impact evaluation & data tracking in legal aid

52

slide-53
SLIDE 53

Looking ahead

53

slide-54
SLIDE 54

Defining scope

54

Monitoring Systems | Evaluation Projects

slide-55
SLIDE 55

Timelines

55

Preparation → Data collection → Analysis

slide-56
SLIDE 56

Management & resources

56

slide-57
SLIDE 57

Logistics: process changes

57

slide-58
SLIDE 58

Partnerships

58

slide-59
SLIDE 59

Previewing results

59

slide-60
SLIDE 60

Example: ILRC NAC evaluation

60

https://newamericanscampaign.org/about/our-impact/

slide-61
SLIDE 61

Exercise: Part 3

61

Evaluation Plan

  • Scope
  • Timings
  • Logistics
  • Results preview
slide-62
SLIDE 62

Collaborating with funders: feedback & insight session

62

slide-63
SLIDE 63

Summary of exercise

63

  • What is the goal of this program?
  • How is the program designed to achieve this goal?
  • What key evaluation questions have been identified?
  • What methodology will be used to answer these questions?

  • What kind of results would indicate success?
slide-64
SLIDE 64

Sequence of events

64

  • 1. Participant shares ideas with partner for feedback and development
  • 2. Switch partners; repeat exercise using new information
  • 3. Group discussion to explore learning from exercise
slide-65
SLIDE 65

...to our thought partners!

65

slide-66
SLIDE 66

Breaktime!

66

slide-67
SLIDE 67

Individual capstone activity

  • Outline the elements of internal analysis for your capstone issue(s) that you plan to research.
  • Deadline reminder: January 4, 2019: Submit a one-page description of the internal analysis to Renée and Lea via email

67

slide-68
SLIDE 68

Today, we...

  • 1. Introduced evaluation frameworks, such as logic models and theory of change
  • 2. Learned some basics about quantitative and qualitative methods
  • 3. Started creating an evaluation and data tracking plan for your pro bono program
  • 4. Road-tested your evaluation plan with funders
  • 5. Started work on the Internal Analysis for your capstone project

68

slide-69
SLIDE 69

One-word closeout

69

slide-70
SLIDE 70

Thank you!

  • Complete evaluation surveys

70