Measurement and DAGs, February 5, 2020 (PowerPoint presentation)



SLIDE 1

Measurement and DAGs

February 5, 2020

PMAP 8521: Program Evaluation for Public Service
Andrew Young School of Policy Studies, Spring 2020

Fill out your reading report on iCollege!
SLIDE 2

Plan for today

Abstraction, stretching, and validity
Causal models
Equations, paths, doors, and adjustment

SLIDE 3

Abstraction, stretching, and validity

SLIDE 4

Inputs, activities, & outputs: generally directly measurable. Indicators: # of citations mailed, % increase in grades, etc.

Outcomes: harder to directly measure. Indicators: commitment to school, reduced risk factors.

SLIDE 5
SLIDE 6
SLIDE 7

Conceptual stretching

SLIDE 8

[Diagram: ladder of abstraction for witches. Levels include enmagicked, female, human, mammal, young/old, student. Examples: Hermione Granger, Sabrina Spellman; trolls, elves, gods/goddesses; Arwen, Winky, Athena; Elphaba; Halloween decorations; the Salem witch trials.]

SLIDE 9

Connection to theory

SLIDE 10

Juvenile delinquency School performance Poverty

Practice

Choose an outcome
List all the possible attributes of that outcome
Build a ladder of abstraction with all the attributes
Determine which level is sufficient for showing an effect

SLIDE 11

Outcomes and programs

Outcome variable: thing you're measuring
Outcome change: Δ in thing you're measuring over time
Program effect: Δ in thing you're measuring over time because of the program

SLIDE 12

Outcomes and programs

[Diagram: outcome level over time, before/during/after the program. Labeled pieces: pre-program outcome level, post-program outcome level, outcome with program, outcome without program, outcome change, program effect.]
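The distinction between outcome change and program effect can be sketched with a few invented numbers. The counterfactual "outcome without program" is never observed in practice, but assuming a value for it makes the arithmetic concrete:

```python
# Hypothetical numbers illustrating outcome change vs. program effect
# (all values invented for illustration).
pre_program_outcome = 60      # outcome level before the program
outcome_with_program = 80     # observed outcome after the program
outcome_without_program = 70  # counterfactual: what would have happened anyway

# Outcome change: how much the measured thing moved over time
outcome_change = outcome_with_program - pre_program_outcome

# Program effect: the part of that change caused by the program itself
program_effect = outcome_with_program - outcome_without_program

print(outcome_change)  # → 20
print(program_effect)  # → 10
```

The gap between the two numbers (here 10 points) is the change that would have happened even without the program, which is exactly why raw before/after comparisons overstate program effects.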
SLIDE 13

Connecting measurement to programs

Examples: juvenile delinquency, school performance, poverty

Measurable definition of program effect
Ideal measurement
Feasible measurement
Connection to real world

SLIDE 14

Causal models

SLIDE 15

Types of data

Experimental: you have control over which units get treatment
Observational: you don't have control over which units get treatment

Which kind lets you prove causation?

SLIDE 16

Causation with observational data

Can you prove causation with observational data?

Why is it so controversial to use observational data?
SLIDE 17
SLIDE 18

The causal revolution

SLIDE 19

Causal diagrams

Directed acyclic graphs (DAGs)

Graphical model of the process that generates the data
Maps your philosophical model
Fancy math ("do-calculus") tells you what to control for to find causation

SLIDE 20
X → Y

Directed acyclic graphs encode our understanding of the causal model (or philosophy)

DAGs

SLIDE 21

What is the causal effect of an additional year of education on earnings?

Step 1: List variables
Step 2: Simplify
Step 3: Connect arrows
Step 4: Use logic and math to determine which nodes and arrows to measure

SLIDE 22

Step 1: List variables

Education (treatment) and earnings (outcome). List anything that's relevant: things that cause or are caused by treatment, especially if they're related to both treatment and outcome. You don't have to actually observe or measure them all.
SLIDE 23

Step 1: List variables

Education (treatment), earnings (outcome), socioeconomic status, year of birth, ability, demographics, location, compulsory schooling laws, job connections
SLIDE 24

Step 2: Simplify

Socioeconomic status, ability, and demographics collapse into a single "background" node: education (treatment), earnings (outcome), background, year of birth, location, compulsory schooling laws, job connections
SLIDE 25

Step 3: Draw arrows

Edu → Earn: education causes earnings
SLIDE 26
Step 3: Draw arrows

[DAG: Bkgd, Edu, Loc, Req, Year, Earn]

Background, year of birth, location, and school requirements all cause education

SLIDE 27

Step 3: Draw arrows

[DAG: Bkgd, Edu, JobCx, Loc, Req, Year, Earn]

Background, year of birth, and location all affect earnings too
SLIDE 28

Step 3: Draw arrows

[DAG: Bkgd, Edu, JobCx, Loc, Req, Year, Earn]

Job connections are caused by education
SLIDE 29

Step 3: Draw arrows

[DAG: Bkgd, Edu, JobCx, Loc, Req, U1, Year, Earn]

Location and background are probably related, but neither causes the other; something unobservable (U1) does that
SLIDE 30

[DAG: Bkgd, Edu, JobCx, Loc, Req, U1, Year, Earn]

Let the computer do this: dagitty.net
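dagitty.net does this path-finding automatically. As a rough illustration of what it computes, here is a plain-Python sketch (not dagitty's actual algorithm) that enumerates every path between Edu and Earn in this DAG, with an edge list following the arrows described on the preceding slides, and flags backdoor paths, i.e. paths whose first arrow points into the treatment:

```python
# Edge list for the education/earnings DAG built on the preceding slides.
edges = [
    ("Bkgd", "Edu"), ("Year", "Edu"), ("Loc", "Edu"), ("Req", "Edu"),
    ("Edu", "Earn"), ("Bkgd", "Earn"), ("Year", "Earn"), ("Loc", "Earn"),
    ("Edu", "JobCx"), ("JobCx", "Earn"),
    ("U1", "Loc"), ("U1", "Bkgd"),
]

def all_paths(start, end):
    """List every simple path from start to end, ignoring arrow direction."""
    neighbors = {}
    for a, b in edges:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)
    paths = []
    def walk(node, seen):
        if node == end:
            paths.append(list(seen))
            return
        for nxt in sorted(neighbors[node]):
            if nxt not in seen:
                walk(nxt, seen + [nxt])
    walk(start, [start])
    return paths

for path in all_paths("Edu", "Earn"):
    # A path is a backdoor if its first edge points *into* the treatment
    backdoor = (path[1], path[0]) in edges
    print(" - ".join(path), "(backdoor)" if backdoor else "")
```

With this edge list there are seven paths between Edu and Earn: the direct causal path, the front-door path through JobCx, and five backdoor paths through Bkgd, Year, Loc, and U1 that need to be dealt with.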

SLIDE 31

Does a longer night’s sleep extend your lifespan?

Step 1: List variables
Step 2: Simplify
Step 3: Connect arrows
Use dagitty.net

Your turn

SLIDE 32

Equations, paths, doors, and adjustment

SLIDE 33

Causal identification

[DAG: Bkgd, Edu, JobCx, Loc, Req, U1, Year, Earn]

All these nodes are related; there's correlation between them all. We care about Edu → Earn, but what do we do with all the other nodes?

SLIDE 34

Causal identification

A causal effect is "identified" if the association between treatment and outcome is properly stripped and isolated

SLIDE 35

Paths and associations

Arrows in a DAG transmit associations. You can redirect and control those paths by "adjusting" or "conditioning".

SLIDE 36

Three types of associations

Confounding: common cause
Causation: mediation
Collision: selection / endogeneity

SLIDE 37

Confounding

X causes Y, but Z causes both X and Y. Z confounds the X → Y association.

SLIDE 38

Paths

Paths between X and Y?
X → Y
X ← Z → Y (Z is a backdoor)
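A quick simulation with invented coefficients (pure standard-library Python) shows how a backdoor transmits association: here X has no causal effect on Y at all, yet the naive X-Y slope is clearly positive, while the slope within each level of Z is roughly zero:

```python
import random

random.seed(1)

# Simulated data where X has NO causal effect on Y, but Z causes both.
# Any X-Y association flows only through the backdoor X <- Z -> Y.
n = 10_000
data = []
for _ in range(n):
    z = random.choice([0, 1])        # binary confounder
    x = z + random.gauss(0, 1)       # Z causes X
    y = 2 * z + random.gauss(0, 1)   # Z causes Y; X does not appear at all
    data.append((z, x, y))

def slope(pairs):
    """OLS slope of y on x for a list of (x, y) pairs."""
    m = len(pairs)
    mx = sum(x for x, _ in pairs) / m
    my = sum(y for _, y in pairs) / m
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    var = sum((x - mx) ** 2 for x, _ in pairs)
    return cov / var

naive = slope([(x, y) for _, x, y in data])               # biased, ~0.4
within_0 = slope([(x, y) for z, x, y in data if z == 0])  # ~0
within_1 = slope([(x, y) for z, x, y in data if z == 1])  # ~0
print(round(naive, 2), round(within_0, 2), round(within_1, 2))
```

Comparing units only within a level of Z is one way of "adjusting" for Z: once Z is held fixed, the backdoor is closed and the spurious association disappears.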

SLIDE 39

Paths between money and win margin?

Money → Margin
Money ← Quality → Margin (backdoor!)

SLIDE 40
SLIDE 41

Closing doors

Close the backdoor by adjusting for Z

SLIDE 42

1. Find the part of X (campaign money) that is explained by Q (quality) and subtract it out. This creates the residual part of X.
2. Find the part of Y (the win margin) that is explained by Q (quality) and subtract it out. This creates the residual part of Y.
3. Find the relationship between the residual part of X and the residual part of Y. This is the causal effect.
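The three residualizing steps can be sketched in simulated data (all coefficients invented; the true effect of money on margin is set to 1.5). The residual-on-residual slope recovers the causal effect, while a naive regression that ignores quality does not:

```python
import random

random.seed(42)

# Invented data-generating process: quality drives both money and margin,
# and the true causal effect of money on margin is 1.5.
n = 20_000
quality = [random.gauss(0, 1) for _ in range(n)]
money = [2 * q + random.gauss(0, 1) for q in quality]
margin = [1.5 * m + 3 * q + random.gauss(0, 1) for m, q in zip(money, quality)]

def ols_slope(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def residuals(x, z):
    """The part of x left over after subtracting what z predicts."""
    b = ols_slope(z, x)
    mx, mz = sum(x) / len(x), sum(z) / len(z)
    return [a - (mx + b * (c - mz)) for a, c in zip(x, z)]

# Steps 1-2: strip the quality-explained part out of money and margin
money_resid = residuals(money, quality)
margin_resid = residuals(margin, quality)

# Step 3: slope of residual margin on residual money (close to 1.5)
print(round(ols_slope(money_resid, margin_resid), 2))

# Naive slope ignoring quality is badly biased (close to 2.7)
print(round(ols_slope(money, margin), 2))
```

This residualizing recipe is the Frisch-Waugh-Lovell result: the residual-on-residual slope equals the money coefficient from a multiple regression that includes quality as a control.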

SLIDE 43

We’re comparing candidates as if they had the same quality
Holding quality constant
We remove differences that are predicted by quality

SLIDE 44

Include term in regression

Win margin = α + β₁ (Campaign money) + β₂ (Candidate quality) + ε

How to adjust?

Matching
Do-calculus
Inverse probability weighting
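As an illustration of the last option, here is a minimal inverse-probability-weighting sketch with invented numbers: each unit is weighted by one over the probability of the treatment it actually received given Z, which breaks the Z-to-treatment link and creates a pseudo-population where treatment looks randomized:

```python
import random

random.seed(7)

# Invented setup: Z raises both the chance of treatment and the outcome.
# The true treatment effect is 1.0.
n = 50_000
rows = []
for _ in range(n):
    z = random.choice([0, 1])
    p_treat = 0.8 if z else 0.2                  # propensity score P(T=1 | Z)
    t = 1 if random.random() < p_treat else 0
    y = 1.0 * t + 2.0 * z + random.gauss(0, 1)
    rows.append((z, t, y, p_treat))

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Weight each unit by 1 / P(treatment it actually received | Z)
treated = [(y, 1 / p) for z, t, y, p in rows if t == 1]
control = [(y, 1 / (1 - p)) for z, t, y, p in rows if t == 0]

ipw_effect = weighted_mean(*zip(*treated)) - weighted_mean(*zip(*control))
naive_effect = (sum(y for _, t, y, _ in rows if t == 1) / sum(t for _, t, _, _ in rows)
                - sum(y for _, t, y, _ in rows if t == 0) / sum(1 - t for _, t, _, _ in rows))

print(round(ipw_effect, 2))    # close to the true effect of 1.0
print(round(naive_effect, 2))  # biased upward by confounding
```

In real applications the propensity score is estimated (e.g. with logistic regression) rather than known, but the weighting logic is the same.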