Multi-parameter models - Gibbs Sampling
Applied Bayesian Statistics
- Dr. Earvin Balderama
Department of Mathematics & Statistics Loyola University Chicago
September 28, 2017
Gibbs Sampling Last edited October 1, 2017 by <ebalderama@luc.edu>
Multi-parameter models
The goal is to find the posterior distribution. The posterior is used for inference about the parameter(s) of interest: computing summaries such as posterior means, variances, and quantiles; credible intervals; hypothesis testing; and model diagnostics. We have seen a couple of ways of finding the posterior:
1. Using conjugate priors that lead to a known family.
2. Evaluating the function on a grid.
But oftentimes we are working with a model with many parameters, so the above methods can get very difficult, or even impossible, to perform.
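The grid-evaluation method mentioned above can be sketched in a few lines. This toy example (not from the slides) uses binomial data with a uniform prior, so the exact conjugate answer is known and the grid approximation can be checked against it:

```python
import numpy as np

# Toy data (illustrative): y successes in n Bernoulli trials,
# with a Uniform(0, 1) prior on the success probability p.
y, n = 7, 20

# Evaluate the unnormalized posterior f(p | y) ~ p^y (1 - p)^(n - y) on a grid.
grid = np.linspace(0.001, 0.999, 999)
unnorm = grid**y * (1 - grid)**(n - y)

# Normalize so the grid values sum to 1 (a discrete approximation).
post = unnorm / unnorm.sum()

# Grid approximation of the posterior mean; the exact conjugate answer is
# the mean of a Beta(y + 1, n - y + 1) = Beta(8, 14) distribution, i.e. 8/22.
post_mean = (grid * post).sum()
```

With one parameter this is trivial, but the cost grows exponentially with the number of parameters, which is exactly why the grid method breaks down for large models.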
Multi-parameter models
Let θ = (θ1, . . . , θp) be the p parameters in the model. In Monte Carlo methods, we draw samples of θ from a (possibly unfamiliar) posterior distribution f(θ |Y), and use these samples θ(1), θ(2), . . . , θ(S) to approximate posterior summaries.
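The idea of replacing posterior summaries with sample counterparts can be illustrated directly. In this sketch the "posterior" is a Gamma distribution chosen only for illustration (it is not from the slides); in practice the draws θ^(1), . . . , θ^(S) would come from an MCMC sampler:

```python
import numpy as np

rng = np.random.default_rng(42)

# Pretend these S draws came from an (unfamiliar) posterior f(theta | Y);
# here a Gamma(shape=3, scale=2) stands in for it, so the true posterior
# mean is 6 and the true posterior variance is 12.
S = 100_000
theta = rng.gamma(shape=3.0, scale=2.0, size=S)

# Posterior summaries are approximated by their sample counterparts.
post_mean = theta.mean()                     # approximates E[theta | Y]
post_var = theta.var()                       # approximates Var[theta | Y]
ci_95 = np.quantile(theta, [0.025, 0.975])   # 95% credible interval
```

The Monte Carlo error of each summary shrinks at rate 1/sqrt(S), so accuracy is controlled simply by drawing more samples.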
Multi-parameter models
Monte Carlo sampling is the predominant method of Bayesian inference because it can be used for high-dimensional models (i.e., with many parameters). Many software options for performing Monte Carlo sampling:
- R (BLR, MCMClogit, or write your own function)
- SAS (proc mcmc)
- OpenBUGS/WinBUGS (or simply BUGS)
- JAGS (rjags)
- Stan (rstan)
- INLA
Multi-parameter models
The main idea is to break up the problem of sampling from the high-dimensional joint distribution into a series (chain) of samples from low-dimensional conditional distributions. Note: rather than drawing one p-dimensional joint sample, we make p one-dimensional samples.
Samples are drawn (updated) one at a time for each parameter. The updates are done in a loop, so samples are not independent. Because each sample depends on the previously drawn samples, the collection of samples forms a Markov chain, leading to the name Markov chain Monte Carlo (MCMC). The most common MCMC sampling algorithms are
1. Gibbs
2. Metropolis
3. Metropolis-Hastings
Multi-parameter models
1. Gibbs sampling was proposed by Geman and Geman (1984) and popularized for Bayesian inference by Gelfand and Smith (1990); it fundamentally changed Bayesian computing.
2. Gibbs sampling is attractive because it can sample from high-dimensional posteriors.
3. The main idea is to break the problem of sampling from the high-dimensional joint distribution into a series of samples from low-dimensional conditional distributions, e.g., rather than one p-dimensional joint sample, we make p one-dimensional samples.
4. Updates can also be done in blocks (groups of parameters).
5. Because the low-dimensional updates are done in a loop, samples are not independent.
6. The dependence means the samples form a Markov chain, leading to the name Markov chain Monte Carlo (MCMC).
Multi-parameter models
1. Set initial values θ^(0) = (θ_1^(0), . . . , θ_p^(0)).
2. For iteration t,
   Draw θ_1^(t) from f(θ_1 | θ_2^(t−1), . . . , θ_p^(t−1), Y).
   Draw θ_2^(t) from f(θ_2 | θ_1^(t), θ_3^(t−1), . . . , θ_p^(t−1), Y).
   . . .
   Draw θ_p^(t) from f(θ_p | θ_1^(t), . . . , θ_{p−1}^(t), Y).
   Set θ^(t) = (θ_1^(t), . . . , θ_p^(t)).
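The generic update loop can be sketched in code. The `gibbs` helper and the bivariate-normal target below are illustrative, not from the slides; the target is a standard toy example whose full conditionals are N(ρ · other, 1 − ρ²):

```python
import numpy as np

def gibbs(full_conditionals, theta0, S, rng):
    """Generic Gibbs sampler: update each parameter one at a time from its
    full conditional, always conditioning on the newest values of the others."""
    theta = np.array(theta0, dtype=float)
    samples = np.empty((S, len(theta)))
    for t in range(S):
        for j, draw in enumerate(full_conditionals):
            theta[j] = draw(theta, rng)   # draw theta_j | theta_{-j}, Y
        samples[t] = theta
    return samples

# Toy target (illustrative): bivariate normal with correlation rho; each
# full conditional is Normal(rho * other component, 1 - rho^2).
rho = 0.8
conds = [
    lambda th, rng: rng.normal(rho * th[1], np.sqrt(1 - rho**2)),
    lambda th, rng: rng.normal(rho * th[0], np.sqrt(1 - rho**2)),
]
draws = gibbs(conds, [0.0, 0.0], 50_000, np.random.default_rng(1))
```

After discarding a short burn-in, the sample means and correlation of `draws` should match the target's marginal means (0) and correlation (0.8), even though no joint draw was ever made.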
Multi-parameter models
The joint posterior of (µ, σ²) is

f(µ, σ² | y) ∝ f(y | µ, σ²) f(µ) f(σ²)
             ∝ σ^(−n) exp( −Σ_i (y_i − µ)² / (2σ²) ) · exp( −(µ − θ)² / (2τ²) ) · (σ²)^(−a−1) exp( −b/σ² ).

Writing m = σ²/τ², the full conditional distributions are

µ | σ², y ~ Normal( (nȳ + mθ)/(n + m), σ²/(n + m) ),
σ² | µ, y ~ InvGamma( n/2 + a, SSE/2 + b ),   where SSE = Σ_i (y_i − µ)².
Multi-parameter models
1. Set initial values θ^(0) = (µ^(0), σ²^(0)) = (ȳ, s²).
2. For iteration t,
   Draw µ^(t) from f(µ | σ²^(t−1), y).
   Draw σ²^(t) from f(σ² | µ^(t), y).
   Set θ^(t) = (µ^(t), σ²^(t)).
After S iterations, we have θ^(1), . . . , θ^(S) = (µ^(1), σ²^(1)), . . . , (µ^(S), σ²^(S)), and the posterior means are approximated by

µ̂ = (1/S) Σ_t µ^(t),    σ̂² = (1/S) Σ_t σ²^(t).
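The two-step sampler above can be written out in full. This is a sketch assuming the conditionally conjugate priors µ ~ N(θ, σ²/m) and σ² ~ InvGamma(a, b); the simulated data and hyperparameter values are chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (illustrative): n draws from N(5, 2^2).
y = rng.normal(loc=5.0, scale=2.0, size=200)
n, ybar = len(y), y.mean()

# Hyperparameters for mu ~ N(theta, sigma^2 / m), sigma^2 ~ InvGamma(a, b);
# weak choices for illustration.
theta, m = 0.0, 0.01
a, b = 0.01, 0.01

S = 5_000
mu, sig2 = ybar, y.var()      # initial values (ybar, s^2)
mu_draws = np.empty(S)
sig2_draws = np.empty(S)

for t in range(S):
    # Draw mu^(t) from its full conditional:
    # mu | sigma^2, y ~ Normal((n*ybar + m*theta)/(n + m), sigma^2/(n + m)).
    mu = rng.normal((n * ybar + m * theta) / (n + m),
                    np.sqrt(sig2 / (n + m)))
    # Draw sigma^2(t) from its full conditional:
    # sigma^2 | mu, y ~ InvGamma(n/2 + a, SSE/2 + b), SSE = sum (y_i - mu)^2.
    # (If X ~ Gamma(shape, rate) then 1/X ~ InvGamma(shape, rate);
    # numpy's gamma takes scale = 1/rate.)
    sse = np.sum((y - mu) ** 2)
    sig2 = 1.0 / rng.gamma(n / 2 + a, 1.0 / (sse / 2 + b))
    mu_draws[t], sig2_draws[t] = mu, sig2

# Sample averages approximate the posterior means E[mu | y] and E[sigma^2 | y].
mu_hat = mu_draws.mean()
sig2_hat = sig2_draws.mean()
```

With this much data and weak priors, the posterior means land close to the true values used in the simulation (µ near 5, σ² near 4), a quick sanity check that the full conditionals were implemented correctly.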