Introduction to Bayesian Computation
- Dr. Jarad Niemi
STAT 544 - Iowa State University
March 26, 2019
Jarad Niemi (STAT544@ISU) Introduction to Bayesian Computation March 26, 2019 1 / 30
Bayesian computation

Goals: compute posterior expectations
$$E[h(\theta) \mid y] = \int h(\theta)\, p(\theta \mid y)\, d\theta.$$
Deterministic methods
y = 1 # Data

# Unnormalized posterior q(theta|y) for y ~ N(theta, 1) with a Cauchy(0,1) prior.
# (The slide's definition of `out` was lost in extraction; the body below is
# reconstructed so that the normalizing constant matches the output shown.)
q = function(theta, y, log = FALSE) {
  out = -(theta - y)^2 / 2 - log(1 + theta^2)
  if (log) return(out)
  return(exp(out))
}

# Find normalizing constant for q(theta|y)
w = 0.1
theta = seq(-5, 5, by = w) + y
(cy = sum(q(theta, y) * w)) # gridding-based approach
[1] 1.305608

integrate(function(x) q(x, y), -Inf, Inf) # numerical integration
1.305609 with absolute error < 0.00013
curve(q(x, y)/cy, -5, 5, n = 1001)
points(theta, rep(0, length(theta)), cex = 0.5, pch = 19)
segments(theta, 0, theta, q(theta, y)/cy)

[Figure: normalized posterior q(x, y)/cy plotted on (-5, 5), with the grid points theta marked on the x-axis and vertical segments drawn up to the density.]
# Posterior expectation of h(theta) = theta via the grid approximation
h = function(theta) theta
sum(w * h(theta) * q(theta, y) / cy)
[1] 0.5542021
[Figure: posterior expectation of theta as a function of y on (-5, 5), under three priors: Cauchy, normal, and improper uniform.]
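The prior comparison above can be reproduced numerically. A minimal sketch (not the slide's original code) computing $E[\theta \mid y]$ under each prior for $y \sim N(\theta, 1)$ with $y = 1$, using numerical integration:

```r
# Posterior expectation E[theta|y] under three priors for y ~ N(theta, 1).
y <- 1

post_mean <- function(prior) {
  q  <- function(theta) dnorm(y, theta) * prior(theta)  # unnormalized posterior
  cy <- integrate(q, -Inf, Inf)$value                   # normalizing constant
  integrate(function(theta) theta * q(theta) / cy, -Inf, Inf)$value
}

post_mean(dcauchy)                                # Cauchy(0,1) prior: approx 0.554
post_mean(dnorm)                                  # N(0,1) prior: y/2 = 0.5
post_mean(function(theta) rep(1, length(theta)))  # improper uniform prior: y = 1
```

The Cauchy prior shrinks less aggressively than the normal prior for moderate y, while the improper uniform prior does no shrinkage at all, which is the qualitative pattern the figure shows.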
Monte Carlo methods
Three main notions of convergence of a sequence of random variables $X_1, X_2, \ldots$ to a random variable $X$:

- Convergence in distribution ($X_n \xrightarrow{d} X$): $\lim_{n\to\infty} F_n(x) = F(x)$ at all continuity points $x$ of $F$.
- Convergence in probability (WLLN, $X_n \xrightarrow{p} X$): $\lim_{n\to\infty} P(|X_n - X| \ge \epsilon) = 0$ for all $\epsilon > 0$.
- Almost sure convergence (SLLN, $X_n \xrightarrow{a.s.} X$): $P\left(\lim_{n\to\infty} X_n = X\right) = 1$.

Implications: almost sure convergence implies convergence in probability, and convergence in probability implies convergence in distribution.

Here, $X_n$ will be our approximation to an integral and $X$ the true (constant) value of that integral, or $X_n$ will be a standardized approximation and $X$ will be $N(0, 1)$.
For independent draws $\theta^{(1)}, \ldots, \theta^{(S)}$ from $p(\theta \mid y)$, the Monte Carlo estimate
$$\bar h_S = \frac{1}{S} \sum_{s=1}^{S} h\left(\theta^{(s)}\right)$$
satisfies $\bar h_S \xrightarrow{a.s.} E[h(\theta) \mid y]$ (SLLN), and when $v = Var[h(\theta) \mid y] < \infty$,
$$\sqrt{S}\,\left(\bar h_S - E[h(\theta) \mid y]\right)/\sqrt{v} \xrightarrow{d} N(0, 1)$$
(CLT), so the Monte Carlo standard error shrinks at rate $1/\sqrt{S}$.
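The SLLN and the $1/\sqrt{S}$ rate can be seen numerically. A small sketch (illustrative, not from the slides) estimating $E[\theta]$ for $\theta \sim N(0.5, 1)$, where the truth is 0.5:

```r
# Monte Carlo estimates converge to the true expectation as S grows,
# with standard error shrinking at rate 1/sqrt(S).
set.seed(20190326)
for (S in c(1e2, 1e4, 1e6)) {
  draws <- rnorm(S, mean = 0.5, sd = 1)
  cat(sprintf("S = %1.0e  estimate = %.4f  mc se = %.4f\n",
              S, mean(draws), sd(draws) / sqrt(S)))
}
```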
Monte Carlo methods Definite integral
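For a definite integral $\int_a^b h(x)\,dx$, the standard Monte Carlo approach writes it as $(b-a)\,E[h(U)]$ with $U \sim Unif(a, b)$. A minimal sketch (not the slide's original example):

```r
# Monte Carlo approximation of a definite integral:
# int_a^b h(x) dx = (b - a) E[h(U)], U ~ Unif(a, b).
set.seed(1)
a <- 0; b <- 1
h <- function(x) x^2          # truth: int_0^1 x^2 dx = 1/3
u <- runif(1e6, a, b)
(b - a) * mean(h(u))          # approx 1/3
```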
Monte Carlo methods Infinite bounds
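With infinite bounds, one option is to draw from a density $g$ with support over the whole range and average $h(x)/g(x)$, since $\int h(x)\,dx = E_g[h(X)/g(X)]$. A sketch (illustrative; `q` is redefined here with the reconstructed body from earlier) estimating the normalizing constant $c_y = \int q(\theta \mid y)\,d\theta \approx 1.3056$ using a Cauchy(0,1) proposal:

```r
# Estimate an integral over (-Inf, Inf) by averaging q/g under draws from g.
set.seed(1)
y <- 1
q <- function(theta, y) exp(-(theta - y)^2 / 2) / (1 + theta^2)
theta <- rcauchy(1e6)
(est <- mean(q(theta, y) / dcauchy(theta)))  # approx 1.3056
```

Here the ratio $q/g = \pi \exp(-(\theta - y)^2/2)$ is bounded, so the estimator has finite variance.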
Monte Carlo methods Gridding
y = 1 # Data

# Small number of grid locations
theta = seq(-5, 5, length = 1e2 + 1) + y
p = q(theta, y) / sum(q(theta, y))
sum(p * theta)
[1] 0.5542021
mean(sample(theta, prob = p, replace = TRUE))
[1] 0.6118812

# Large number of grid locations
theta = seq(-5, 5, length = 1e6 + 1) + y
p = q(theta, y) / sum(q(theta, y))
sum(p * theta)
[1] 0.5542021
mean(sample(theta, 1e2, prob = p, replace = TRUE)) # But small MC sample
[1] 0.598394

# Truth
post_expectation(1)
[1] 0.5542021
Monte Carlo methods Inverse CDF method
[Figure: CDF values on (0, 1) mapped back to x values (roughly 0 to 7.5), illustrating the inverse CDF transformation; axes x and density.]

[Figure: density of samples drawn via the inverse CDF method, x roughly 2 to 6; axes x and density.]
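The inverse CDF method: if $U \sim Unif(0, 1)$ and $F$ is a CDF with inverse $F^{-1}$, then $X = F^{-1}(U)$ has CDF $F$. A minimal sketch (not the slide's original example) for the $Exp(1)$ distribution, where $F^{-1}(u) = -\log(1 - u)$:

```r
# Inverse CDF sampling from Exp(rate = 1): F(x) = 1 - exp(-x),
# so F^{-1}(u) = -log(1 - u).
set.seed(1)
u <- runif(1e6)
x <- -log(1 - u)
mean(x)   # approx 1, the mean of Exp(1)
```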
Monte Carlo methods Rejection sampling
[Figure: proposed samples theta plotted against u * M * g(theta), points colored by accept (TRUE/FALSE).]

Observed acceptance rate was 0.52.
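A rejection sampler for $p(\theta \mid y) \propto q(\theta \mid y)$ proposes $\theta \sim g$ and accepts when $u \le q(\theta)/(M g(\theta))$, for any $M$ with $q \le M g$ everywhere. A sketch (illustrative, not the slide's example, so the acceptance rate differs from the figure) using the reconstructed $q$ from earlier with a Cauchy(0,1) proposal: since $q(\theta \mid y) = e^{-(\theta - y)^2/2}/(1 + \theta^2) \le 1/(1 + \theta^2) = \pi\, g(\theta)$, we can take $M = \pi$, and the acceptance ratio simplifies to $e^{-(\theta - y)^2/2}$.

```r
# Rejection sampling from p(theta|y) using a Cauchy(0,1) proposal and M = pi.
set.seed(1)
y <- 1
S <- 1e5
theta  <- rcauchy(S)
u      <- runif(S)
accept <- u <= exp(-(theta - y)^2 / 2)  # q / (M g) for this q and g
mean(accept)           # expected rate cy / M = 1.3056 / pi, about 0.42
mean(theta[accept])    # approx posterior mean, about 0.554
```

The expected acceptance rate equals the normalizing constant divided by $M$, so a tighter envelope $M g$ wastes fewer proposals.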