SLIDE 16 MCMC — Gibbs sampling
- The objective is to build a Markov chain (MC) that converges to the desired target distribution p (e.g. the unknown posterior distribution of some parameter of interest)
- Under mild “regularity conditions”, it is usually easy to construct such a MC
- Gibbs sampling (GS) is one of the most popular MCMC schemes
- 1. Select a set of initial values (θ1^(0), θ2^(0), . . . , θJ^(0))
- 2. Sample θ1^(1) from the conditional distribution p(θ1 | θ2^(0), θ3^(0), . . . , θJ^(0), y)
  Sample θ2^(1) from the conditional distribution p(θ2 | θ1^(1), θ3^(0), . . . , θJ^(0), y)
  . . .
  Sample θJ^(1) from the conditional distribution p(θJ | θ1^(1), θ2^(1), . . . , θ(J−1)^(1), y)
- 3. Repeat step 2 for S iterations, until convergence to the target distribution p(θ | y) is reached
- 4. Use the sample from the target distribution to compute all relevant statistics: (posterior) mean, variance, credible intervals, etc.
- If the full conditionals are not available in closed form, the corresponding draws need to be obtained indirectly (e.g. via Metropolis-Hastings or slice sampling) within the GS scheme
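As a toy illustration (not part of the slides), steps 1–4 above can be sketched for a zero-mean bivariate normal target with correlation rho, whose two full conditionals are known in closed form; the function name and parameter values below are illustrative choices, not from the source:

```python
import random

def gibbs_bivariate_normal(rho=0.8, S=20000, burn_in=2000, seed=42):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.

    Full conditionals (known in closed form for this toy target):
        theta1 | theta2 ~ N(rho * theta2, 1 - rho^2)
        theta2 | theta1 ~ N(rho * theta1, 1 - rho^2)
    """
    rng = random.Random(seed)
    sd = (1.0 - rho**2) ** 0.5
    theta1, theta2 = 0.0, 0.0            # step 1: initial values theta^(0)
    draws = []
    for s in range(S):                    # step 3: repeat step 2 for S iterations
        theta1 = rng.gauss(rho * theta2, sd)  # step 2: draw from p(theta1 | theta2, y)
        theta2 = rng.gauss(rho * theta1, sd)  #         draw from p(theta2 | theta1, y)
        if s >= burn_in:                  # discard pre-convergence (burn-in) draws
            draws.append((theta1, theta2))
    return draws

draws = gibbs_bivariate_normal()
# step 4: posterior summaries computed from the retained sample
mean1 = sum(d[0] for d in draws) / len(draws)
var1 = sum((d[0] - mean1) ** 2 for d in draws) / len(draws)
print(f"posterior mean of theta1 ~ {mean1:.3f}, variance ~ {var1:.3f}")
```

With a long enough run, the sample mean and variance of theta1 should approach the true marginal values 0 and 1; in a real problem the conditionals would involve the data y rather than a fixed correlation.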
Gianluca Baio (UCL) Introduction to INLA Bayes 2013, 21 May 2013 6 / 92