Introduction to Bayesian computation (cont.)
- Dr. Jarad Niemi
STAT 544 - Iowa State University
March 23, 2017
Jarad Niemi (STAT544@ISU) Introduction to Bayesian computation (cont.) March 23, 2017 1 / 19
Outline: Bayesian computation — adaptive rejection sampling; importance sampling
Adaptive rejection sampling
[Figure: target density with evaluation points x, y and values f(x), f(y) marked]
[Figure: the target plotted on the density scale and on the log-density scale]
library(ars)
# Sample from a standard normal: log density -x^2/2 (up to a constant),
# with derivative -x
x = ars(n=1000, function(x) -x^2/2, function(x) -x)
hist(x, prob=TRUE, 100)
curve(dnorm, type='l', add=TRUE)
[Figure: histogram of the 1,000 draws with the N(0,1) density overlaid]
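The `ars()` call above hides the accept/reject step that adaptive rejection sampling builds its envelope around. As a language-neutral illustration (a Python sketch, not the R `ars` package; the Cauchy proposal and the bound `M` are choices made here, not from the slides), plain rejection sampling of the same N(0,1) target looks like this:

```python
import numpy as np

rng = np.random.default_rng(42)

def rejection_sample_normal(n, M=1.6):
    """Plain rejection sampling of N(0,1) via a standard Cauchy proposal.
    M must bound f(x)/g(x); for this pair the true bound is sqrt(2*pi/e) ~ 1.52."""
    out = []
    while len(out) < n:
        x = rng.standard_cauchy()
        g = 1.0 / (np.pi * (1.0 + x * x))               # Cauchy proposal density
        f = np.exp(-x * x / 2.0) / np.sqrt(2 * np.pi)   # N(0,1) target density
        if rng.uniform() < f / (M * g):                 # accept with prob f/(M*g)
            out.append(x)
    return np.array(out)

x = rejection_sample_normal(1000)
print(x.mean(), x.std())  # sample mean near 0, sample sd near 1
```

Adaptive rejection sampling differs in that the envelope is built from tangents (or secants) to the log-concave log density and is refined after each rejection, so the acceptance rate rises as sampling proceeds.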
Importance sampling
[Figure: normalized importance weights plotted against θ]
library(weights)
sum(weight*theta/sum(weight)) # Estimate mean

[1] 0.5504221

wtd.hist(theta, 100, prob=TRUE, weight=weight)
curve(q(x,y)/py(y), add=TRUE, col="red", lwd=2)
[Figure: weighted histogram of theta with the target density q(θ,y)/p(y) overlaid in red]
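The weighted-mean estimate above is self-normalized importance sampling: weight each proposal draw by target over proposal, normalize the weights, and average. A minimal Python sketch of the same recipe, using a made-up toy target N(1,1) and proposal N(0,1) in place of the slides' posterior (all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draws from a N(0,1) proposal; the target is N(1,1), a toy stand-in
# for an unnormalized posterior q(theta, y).
theta = rng.standard_normal(100_000)

# log target - log proposal; normalizing constants cancel after self-normalization
log_w = -(theta - 1.0) ** 2 / 2.0 + theta ** 2 / 2.0
w = np.exp(log_w - log_w.max())   # subtract the max for numerical stability
w = w / w.sum()                   # normalized importance weights

est = np.sum(w * theta)           # self-normalized estimate of E[theta]
print(est)                        # close to the target's true mean, 1
```

Because the weights are normalized to sum to 1, the unknown normalizing constant of the target never needs to be computed, which is what makes the method usable on unnormalized posteriors.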
# resampling
new_theta = sample(theta, replace=TRUE, prob=weight) # prob is normalized internally
hist(new_theta, 100, prob=TRUE, main="Unweighted histogram of resampled draws")
curve(q(x,y)/py(y), add=TRUE, col="red", lwd=2)
[Figure: unweighted histogram of the resampled draws with the target density overlaid]
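Resampling with probability proportional to the importance weights (sampling importance resampling) turns the weighted sample into an approximately unweighted sample from the target. A Python sketch under the same toy assumptions as before (N(0,1) proposal, N(1,1) target; both are illustrative choices, not the slides' model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: N(0,1) proposal, N(1,1) target
theta = rng.standard_normal(100_000)
log_w = -(theta - 1.0) ** 2 / 2.0 + theta ** 2 / 2.0
w = np.exp(log_w - log_w.max())
w = w / w.sum()

# Resample indices with probability proportional to weight; the resulting
# draws approximate unweighted draws from the target
new_theta = rng.choice(theta, size=10_000, replace=True, p=w)
print(new_theta.mean(), new_theta.std())  # near the target's mean 1 and sd 1
```

The resampled draws can then be summarized with ordinary unweighted histograms and averages, at the cost of duplicating high-weight draws.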
[Figure: normalized importance weights plotted against θ]
(n <- length(weight)) # Number of samples

[1] 1000

(ess <- 1/sum(weight^2)) # Effective sample size

[1] 371.432

ess/n # Effective sample proportion

[1] 0.371432
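The effective sample size 1/Σw² (with normalized weights w) equals n when all weights are equal and falls toward 1 as a few draws dominate. The same computation in Python, with toy weights from the earlier N(0,1)-proposal / N(1,1)-target example (an illustrative setup, not the slides' data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy normalized importance weights: N(0,1) proposal, N(1,1) target
theta = rng.standard_normal(1000)
log_w = -(theta - 1.0) ** 2 / 2.0 + theta ** 2 / 2.0
w = np.exp(log_w - log_w.max())
w = w / w.sum()

# Effective sample size: equals n for equal weights, 1 if a single
# draw carries all the weight
ess = 1.0 / np.sum(w ** 2)
print(len(w), ess, ess / len(w))
```

A low effective sample proportion signals a poor proposal: the estimate behaves as if it were based on far fewer independent draws than were actually generated.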
library(ggplot2)
set.seed(5)
theta = rnorm(1e4)
# log importance weights: Cauchy target, normal proposal
lweight = dcauchy(theta, log=TRUE) - dnorm(theta, log=TRUE)
cumulative_ess = numeric(length(lweight))
for (i in 1:length(lweight)) {
  lw = lweight[1:i]
  w = exp(lw - max(lw)) # subtract the max for numerical stability
  w = w/sum(w)          # normalize
  cumulative_ess[i] = 1/sum(w^2)
}
qplot(x=1:length(cumulative_ess), y=cumulative_ess, geom="line") +
  labs(x="Number of samples", y="Effective sample size") +
  theme_bw()
[Figure: effective sample size versus number of samples]