Augmenting simple models with machine learning
Jim Savage Data Science Lead Lendable @khakieconomist
Thanks to Sarah Tan, David Miller, Chris Edmond, and Eugene Dubossarsky.

Outline
- Estimating causal relationships
- Proximity
- Time series models
Question for the audience: What is a college degree worth? How would you go about estimating it?
Experimental data = easy causal inference Observational data = hard “causal” inference
- The causal quantity of interest is E[y | do(X)] — how much we expect y to change given an intervention in X
- This is not the same as a correlation
- Fancier models often just make us more certain, not more correct
The fundamental problem of causal inference is that booting up a parallel universe whenever we want to draw causal inference is too much work.
Common workarounds and their limits:
- Regression controls only remove the effect of observed confounders
- Estimated treatment effects are averages!
- Instrumental variables (finding a valid IV is hard!)
- Fixed effects fail when unobserved confounders vary over time (fixed effects assumption violated)
Matching: build a control group as similar as possible to the treatment group, then run the analysis (regression, etc) on this sub-group. Discard those who were never likely to take up treatment.
Matching still assumes some causal model: you have to choose the covariates that matter.
- Measure similarity as a distance between covariates (Euclidean, Mahalanobis, etc.) in covariate space
- Propensity score matching: match each treated observation to the untreated observation whose modelled propensity is closest (or some other matching technique)
- This yields an estimate of the treatment effect. Can be meaningless.
- Propensity scores take into account how the Xs affect treatment probability.
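The matching recipe above can be sketched in a few lines. This is an illustrative toy, not code from the talk: the data, the logistic propensity model, and the one-to-one nearest-propensity matching rule are all assumptions standing in for whatever the analyst actually uses.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy observational data: treatment uptake depends on the first covariate.
n = 500
X = rng.normal(size=(n, 3))
treated = (X[:, 0] + rng.normal(size=n) > 0.5).astype(int)

# Step 1: model the propensity score P(treated | X).
propensity = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated unit to the untreated unit with the
# closest modelled propensity.
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
matches = control_idx[
    np.abs(propensity[control_idx][None, :] -
           propensity[treated_idx][:, None]).argmin(axis=1)
]

# Step 3: run the analysis on the matched sub-group only; controls that
# were never likely to take up treatment are discarded by construction.
matched_sample = np.concatenate([treated_idx, matches])
```

In practice you would also check covariate balance on the matched sample before estimating anything.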
Random forest proximity
- Grow a random forest on your data
- At each split, only a random subset of features is allowed to be considered for a split
- When two observations land in the same terminal node of a tree, they are said to be proximate
- Proximity(i, j) = proportion of terminal nodes shared by individuals i and j
- Proximate observations are similar in terms of their Xs — and specifically the Xs that are relevant to y
- Proximity can stand in for the propensity score
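One way to compute that proximity measure (a sketch with made-up data, using scikit-learn's `apply` to read off terminal-node ids):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=200)

forest = RandomForestRegressor(n_estimators=100, random_state=1).fit(X, y)

# leaves[i, t] = id of the terminal node observation i falls into in tree t
leaves = forest.apply(X)

# proximity[i, j] = proportion of trees in which i and j share a terminal node
proximity = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
```

Because the forest is grown against y, two observations are proximate only when they agree on the covariates the forest found useful for predicting y.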
Proximity weighting
- Weight the observations in your parametric model by their proximity to today
- Observations proximate to today are likely to do a good job of describing today
- Distant observations are down-weighted, so only the relevant history drives the fit
- Regression routines normally take a weights argument, so proximity weights are easily included.
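Plugging the weights in is one line with most libraries. A minimal sketch with synthetic data, using scikit-learn's `sample_weight` (the weights here are random placeholders standing in for proximities to today):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
y = X @ np.array([1.0, -2.0]) + rng.normal(scale=0.1, size=300)

# Placeholder proximity-to-today weights; in practice, take a row of the
# proximity matrix for the current observation.
weights = rng.uniform(size=300)

# Weighted least squares via the standard weights argument.
model = LinearRegression().fit(X, y, sample_weight=weights)
```

Statsmodels (`WLS`), R's `lm(..., weights = )`, and most GLM implementations accept the same kind of argument.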
When should I ignore my model?

In sampling notation:

returns_vector(t) ~ multivariate_distribution(expected_return(t), covariance(t))

Decompose the covariance matrix as

Σ = diag(σ)Ωdiag(σ)

where σ is a vector of scales and Ω is a correlation matrix. Give Ω an LKJ prior; LKJ(1) is a uniform prior over correlations.
- σ captures the magnitudes of the shocks; Ω lets us update with correlated shocks
- With enough relevant data, we learn the correlation structure from the data
- In unprecedented states, the data does not impact the posterior and we revert to the prior
- A prior with few degrees of freedom gives us highly correlated returns in unprecedented states.
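The decomposition above is easy to check numerically. A sketch with made-up numbers, where a fixed correlation matrix stands in for the LKJ-distributed Ω:

```python
import numpy as np

sigma = np.array([0.02, 0.05, 0.03])   # per-asset shock magnitudes
Omega = np.array([[1.0, 0.4, 0.2],     # a valid (positive-definite) correlation matrix
                  [0.4, 1.0, 0.6],
                  [0.2, 0.6, 1.0]])

# Sigma = diag(sigma) * Omega * diag(sigma)
Sigma = np.diag(sigma) @ Omega @ np.diag(sigma)

# Simulate correlated return shocks around an expected-return vector.
rng = np.random.default_rng(3)
expected_returns = np.array([0.001, 0.002, 0.0015])
returns = rng.multivariate_normal(expected_returns, Sigma, size=1000)
```

Separating scales from correlations like this is what lets the prior on Ω alone control how correlated returns look when the data are silent.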
Questions?