The R-INLA package
Bayesian computing with INLA and the R-INLA package
Håvard Rue
Norwegian University of Science and Technology, Trondheim, Norway
July 24, 2013
The R-INLA package Plan of this talk
Plan of this talk
- Background of Bayesian computing
- Maybe it is useful not to be so general?
- Latent Gaussian models (LGMs)
- The aim: approximating marginals
- The key tools
- The R-INLA package
- Some special features
- The road ahead Version 1
- The road ahead Version 2
- Discussion
The R-INLA package The core-team
The core-team
Finn Lindgren, Daniel Simpson, Thiago Martins, Andrea Riebler, and others...
The R-INLA package Background
Background
- The issue of Bayesian computing is not “solved” even though MCMC is available
- Hierarchical models are more difficult for MCMC
- The main obstacle for Bayesian modelling is the issue of “Bayesian computing”
- Generic MCMC tools are available: JAGS and OpenBUGS
- Tools based on MCMC are available for specific model classes, like BayesX and Stan
- All these tools have a nice R interface
The R-INLA package Do we need to be general?
The problem of generality
Although MCMC
- is applicable in general
- converges under mild conditions (even for simple schemes)
- is asymptotically fine
it is often too slow. However, “slow” can be fine.
The R-INLA package Do we need to be general?
Let us be more specific!
- Cannot make all (statisticians) happy
- Can make some (statisticians) happier!
The R-INLA package Latent Gaussian Models
Latent Gaussian models (LGMs)
- Most Bayesian models used are within this class
- This unification helps neither modelling nor understanding, but it is very useful for computations
The R-INLA package Latent Gaussian Models
Latent Gaussian models

    θ ∼ π(θ)
    x | θ ∼ N(µ(θ), Q(θ)⁻¹)        (Q(θ) is the precision matrix)
    y | x, θ ∼ ∏_i π(y_i | η_i, θ)

- This is a latent Gaussian model (LGM)
- dim(x) is (typically) large: 10²–10⁵
- dim(θ) is (typically) small: 1–10
The R-INLA package Latent Gaussian Models
- Dynamic linear models
- Stochastic volatility
- Generalised linear (mixed) models
- Generalised additive (mixed) models
- Measurement error models
- Spline smoothing
- Semiparametric regression
- Space-varying (semiparametric) regression models
- Models for disease mapping
- Log-Gaussian Cox-processes
- Model-based geostatistics
- Spatio-temporal models
- Survival analysis
- +++
The R-INLA package Latent Gaussian Models
The aim: Approximate the posterior marginals

From

    π(x, θ | y) ∝ π(θ) π(x | θ) ∏_{i∈I} π(y_i | x_i, θ)

compute the posterior marginals

    π(x_i | y), for some or all i,   and/or   π(θ_j | y), for some or all j
The R-INLA package Latent Gaussian Models
End result
- Can we compute (approximate) marginals directly?
- YES!
- Gain
- Huge speedup & accuracy
- The ability to treat LGMs properly
The R-INLA package The main idea
Smoothing noisy observations (I)
Observations y_i = m(i) + ε_i, i = 1, . . . , n, with Gaussian iid noise ε_i of known precision. We assume m(i) is a smooth function of i.
The R-INLA package The main idea
Smoothing noisy observations (II)
n = 50
idx = 1:n
fun = 100 * ((idx - n/2) / n)^3
y = fun + rnorm(n)
plot(idx, y)

[Figure: the simulated data y plotted against idx]
The R-INLA package The main idea
Smoothing noisy observations (III)

Likelihood: Gaussian observations with known precision τ₀:

    y_i | x_i, θ ∼ N(x_i, τ₀⁻¹)

Latent: a Gaussian model for the smooth function:

    π(x | θ) ∝ θ^((n−2)/2) exp( −(θ/2) ∑_{i=3}^{n} (x_i − 2x_{i−1} + x_{i−2})² )

Hyperparameter: the smoothing parameter θ, which we assign a Γ(a, b) prior:

    π(θ) ∝ θ^(a−1) exp(−bθ),   θ > 0
The R-INLA package The main idea
Smoothing noisy observations (IV)

Since x, y | θ is jointly Gaussian, we can compute (numerically) all marginals, using

    π(θ | y) ∝ π(x, y | θ) π(θ) / π(x | y, θ)

where both π(x, y | θ) and π(x | y, θ) are Gaussian, and x | y, θ ∼ N(·, ·), so that

    π(x_i | y) = ∫ π(x_i | θ, y) π(θ | y) dθ

with π(x_i | θ, y) Gaussian.
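The two identities above can be checked numerically. The sketch below is Python (not R-INLA), and for simplicity it replaces the RW2 prior with a hypothetical iid Gaussian prior x_i ∼ N(0, θ⁻¹), so that the exact marginal likelihood is available in closed form as a cross-check; it evaluates π(θ | y) on a grid from the Gaussian identity and then mixes the Gaussian conditionals π(x_i | θ, y) with the grid weights:

```python
import numpy as np

rng = np.random.default_rng(1)
n, tau0, a, b = 30, 1.0, 1.0, 0.01   # data size, noise precision, Gamma(a, b) prior
y = rng.normal(scale=0.7, size=n) + rng.normal(scale=tau0**-0.5, size=n)

def log_gauss(z, mean, prec):
    # summed log density of independent N(mean, 1/prec) terms
    return np.sum(0.5 * np.log(prec / (2 * np.pi)) - 0.5 * prec * (z - mean) ** 2)

thetas = np.linspace(0.05, 10.0, 300)
log_post = np.empty_like(thetas)
log_post_direct = np.empty_like(thetas)
for k, th in enumerate(thetas):
    qc = th + tau0            # precision of x_i | y, theta (everything is diagonal here)
    m = tau0 * y / qc         # mean of x_i | y, theta
    # pi(theta|y) ∝ pi(theta) pi(x|theta) pi(y|x,theta) / pi(x|y,theta),
    # evaluated at ANY x; the conditional mean m is the convenient choice
    log_post[k] = ((a - 1) * np.log(th) - b * th
                   + log_gauss(m, 0.0, th) + log_gauss(y, m, tau0)
                   - log_gauss(m, m, qc))
    # cross-check: in this conjugate toy model, y_i ~ N(0, 1/th + 1/tau0) exactly
    log_post_direct[k] = ((a - 1) * np.log(th) - b * th
                          + log_gauss(y, 0.0, 1.0 / (1.0 / th + 1.0 / tau0)))

assert np.allclose(log_post, log_post_direct)

# normalise to grid weights and mix the Gaussian conditionals, e.g. for x_1
w = np.exp(log_post - log_post.max())
w /= w.sum()
post_mean_x1 = np.sum(w * tau0 * y[0] / (thetas + tau0))
print("posterior mean of x_1:", post_mean_x1)
```

The assertion confirms that the ratio identity for π(θ | y) is exact for Gaussian models, which is precisely why this toy scheme needs no sampling.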
The R-INLA package The main idea
[Figure: posterior marginal for θ (log.prec vs exp(log.dens))]
The R-INLA package The main idea
[Figure: posterior marginal for θ, interpolated]
The R-INLA package The main idea
[Figure: posterior marginals for x[1] for each θ (unweighted)]
The R-INLA package The main idea
[Figure: posterior marginals for x[1] for each θ (weighted)]
The R-INLA package The main idea
[Figure: posterior marginal for x[1]]
The R-INLA package The main idea
Extensions
This is the basic idea behind INLA. It is really really simple. However, we need to extend this basic idea so we can deal with
- More than one hyperparameter
- Non-Gaussian observations
...the devil is in the details!
The R-INLA package The tools
The tools
- Precision matrices
- Sparse matrices/GMRFs/Markov
- Laplace approximations
The R-INLA package The tools Precision matrices
Hierarchical models

First layer:   x ∼ N(0, Qx⁻¹)
Second layer:  y | x ∼ N(x, Qy⁻¹)

Then

    Prec(x, y) = [ Qx + Qy   −Qy ]
                 [   −Qy      Qy ]

- Very efficient, in both computation and storage
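This block formula is easy to verify numerically: build the joint covariance of (x, y) from the two layers, invert it, and compare. A small sketch (Python/numpy; arbitrary SPD precision matrices, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def random_spd(n):
    # random symmetric positive-definite precision matrix
    A = rng.normal(size=(n, n))
    return A @ A.T + n * np.eye(n)

Qx, Qy = random_spd(n), random_spd(n)
Sx, Sy = np.linalg.inv(Qx), np.linalg.inv(Qy)

# Joint covariance of (x, y): Cov(x) = Qx^-1, Cov(x, y) = Qx^-1,
# Cov(y) = Qx^-1 + Qy^-1 (since y = x + noise with noise precision Qy)
Sigma = np.block([[Sx, Sx], [Sx, Sx + Sy]])

# The claimed joint precision from the slide
Q = np.block([[Qx + Qy, -Qy], [-Qy, Qy]])

assert np.allclose(np.linalg.inv(Sigma), Q)
print("joint precision matches the block formula")
```

The joint precision stays sparse whenever Qx and Qy do, while the joint covariance is dense; this is the efficiency claim on the slide.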
The R-INLA package The tools Sparse matrices
Sparse matrices/GMRFs/Markov
Conditional independence gives sparsity:

    x_i ⊥ x_j | x_{−ij}  ⟺  Q_ij = 0

In most cases, only O(n) of the n(n + 1)/2 elements of Q are non-zero.
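The equivalence can be illustrated with a tridiagonal Q: marginally all components are correlated, yet the zero pattern of Q encodes conditional independence, because the conditional covariance of a subset given the rest is the inverse of the corresponding sub-block of Q. A sketch (Python/numpy, illustrative values):

```python
import numpy as np

n = 5
# Tridiagonal precision: Q_ij = 0 whenever |i - j| > 1
Q = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
Sigma = np.linalg.inv(Q)

# Marginally, x_0 and x_2 are dependent: Sigma[0, 2] != 0
assert abs(Sigma[0, 2]) > 1e-6

# Conditionally on the rest they are independent: the conditional covariance
# of (x_0, x_2) given the others equals the inverse of the 2x2 sub-block of Q
A, B = [0, 2], [1, 3, 4]
cond_cov = (Sigma[np.ix_(A, A)]
            - Sigma[np.ix_(A, B)] @ np.linalg.inv(Sigma[np.ix_(B, B)]) @ Sigma[np.ix_(B, A)])
assert np.allclose(cond_cov, np.linalg.inv(Q[np.ix_(A, A)]))
# ... and since Q[0, 2] = 0, that sub-block is diagonal: zero conditional covariance
assert np.isclose(cond_cov[0, 1], 0.0)
print("x_0, x_2: marginally dependent, conditionally independent")
```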
The R-INLA package The tools Sparse matrices
Example (I)
An auto-regressive model of order p,

    x_t = φ₁ x_{t−1} + · · · + φ_p x_{t−p} + ε_t,

has a precision matrix Q that is a band matrix with bandwidth p.
The R-INLA package The tools Sparse matrices
[Figure: the (banded) precision matrix and the (dense) covariance matrix of the AR(p) model]
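For AR(1), the simplest case p = 1, the bandedness can be checked directly, since the stationary covariance is known in closed form. A sketch (Python/numpy; unit innovation variance assumed):

```python
import numpy as np

n, phi = 8, 0.6
# Stationary AR(1) covariance: Cov(x_i, x_j) = phi^|i-j| / (1 - phi^2)
i = np.arange(n)
Sigma = phi ** np.abs(i[:, None] - i[None, :]) / (1 - phi**2)

# The covariance is completely dense, but its inverse (the precision) is
# banded with bandwidth p = 1: everything beyond the first off-diagonal
# is (numerically) zero.
Q = np.linalg.inv(Sigma)
off_band = np.triu(np.abs(Q), k=2)
assert off_band.max() < 1e-8
print("max |Q_ij| for |i - j| > 1:", off_band.max())
```

This is exactly the figure's point: the Markov structure lives in Q, not in the (dense) covariance.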
The R-INLA package The tools Sparse matrices
Example (II)
Gaussian models for areal data
[Figure: map of a spatially varying effect under a Gaussian model for areal data]
The R-INLA package The tools Sparse matrices
Example (III)
Gaussian models on the sphere. (Have to “make” it Markov!)
The R-INLA package The tools Sparse matrices
Numerical methods for sparse matrices
- Only O(n) of the O(n²) terms are non-zero
- Computational costs (factorisation):
  - O(n) for temporal models
  - O(n^(3/2)) for spatial models
  - O(n²) for spatio-temporal models
- Tasks:
  - factorise Q = LLᵀ
  - solve Qx = b
  - compute log |Q(θ)|
  - compute diag(Q⁻¹)
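The first three tasks can be sketched with a dense Cholesky factorisation on a small banded Q (Python/numpy; real GMRF software uses sparse factorisations with fill-in-reducing reorderings, which numpy does not provide, but the identities are the same):

```python
import numpy as np

n = 200
# Tridiagonal precision matrix (RW1-like structure plus a diagonal jitter)
Q = 2.1 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Task: factorise Q = L L^T; for a band matrix, L keeps the (lower) band
L = np.linalg.cholesky(Q)
assert np.allclose(L @ L.T, Q)
assert np.abs(np.tril(L, k=-2)).max() < 1e-12   # no fill-in outside the band

# Task: solve Q x = b via one forward and one backward triangular solve
b = np.ones(n)
x = np.linalg.solve(L.T, np.linalg.solve(L, b))
assert np.allclose(Q @ x, b)

# Task: log-determinant log|Q| read off the Cholesky diagonal
logdet = 2.0 * np.log(np.diag(L)).sum()
assert np.isclose(logdet, np.linalg.slogdet(Q)[1])
print("log|Q| =", logdet)
```

The remaining task, diag(Q⁻¹), is also computed from L in GMRF libraries, via partial-inversion recursions rather than a full inverse.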
The R-INLA package The tools The Laplace approximation
The Laplace approximation: The classic case
Compute an approximation to the integral

    ∫ exp(n g(x)) dx

where n is a parameter going to ∞. Let x₀ be the mode of g(x), so g′(x₀) = 0, and assume g(x₀) = 0; then

    g(x) = ½ g″(x₀)(x − x₀)² + · · ·
The R-INLA package The tools The Laplace approximation
The Laplace approximation: The classic case...
Then

    ∫ exp(n g(x)) dx = √( 2π / (n (−g″(x₀))) ) + · · ·

Error analysis gives

    Estimate(n) / True = 1 + O(1/n),

so the relative error is O(1/n).
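The O(1/n) relative error is easy to see numerically. A sketch (Python/numpy; g is a hypothetical non-Gaussian example chosen so that x₀ = 0, g(0) = 0 and g″(0) = −1, and the "true" integral is a fine Riemann sum):

```python
import numpy as np

def g(x):
    # mode at x0 = 0 with g(0) = 0 and g''(0) = -1; the quartic term makes
    # the integrand non-Gaussian, so the Laplace estimate is not exact
    return -x**2 / 2 - x**4 / 12

def laplace(n):
    # sqrt(2*pi / (n * (-g''(x0)))) with -g''(0) = 1
    return np.sqrt(2 * np.pi / n)

def true_integral(n):
    x = np.linspace(-4.0, 4.0, 80001)
    return np.exp(n * g(x)).sum() * (x[1] - x[0])

errs = {n: abs(laplace(n) / true_integral(n) - 1.0) for n in (10, 40, 160)}
# relative error shrinks like 1/n: quadrupling n divides it by roughly 4
assert errs[40] < errs[10] and errs[160] < errs[40]
assert 3.0 < errs[10] / errs[40] < 5.0
print(errs)
```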
The R-INLA package The tools The Laplace approximation
Errors in the approximations
Result¹: With n repeated measurements y of the same x,

    π̃(θ | y) / π(θ | y) = 1 + O(n^(−3/2))

after re-normalisation, where π̃ is the Laplace approximation.

- A relative error is a very nice property!
- The error rate is impressive!
- Unfortunately, the assumptions made are usually not valid for LGMs, but...

¹ Tierney & Kadane, JASA, 1986
The R-INLA package The R-INLA package
The R-INLA package
- A front-end in R to define LGMs and to do approximate Bayesian analysis using INLA
- The project is located at www.r-inla.org
The R-INLA package The R-INLA package
The interface
result = inla(formula, data = data, family = family, ...)
summary(result)
plot(result)
etc...
The R-INLA package The R-INLA package
Formula
The formula is mostly as “usual”:

    y ~ 1 + x1 + x2 + x3:x4 + f(z1, model=...) + f(z2, model=...)

- LHS: the response
- RHS: the linear predictor
- “fixed effects”: x1, x2, ...
- “random effects”: indexed by z1, z2
- f() defines a Gaussian sub-model!
The R-INLA package The R-INLA package
Special features
Although the “formula” framework is great, we needed to add new features to be able to fit more models. Some of these are:
- More than one family
- copy
- remote computing
The R-INLA package The R-INLA package
More than one family
Every observation could have its own likelihood. Need to make it easy.
- Response is a matrix or list
- Each “column” defines a separate “family”
- Each “family” has its own hyperparameters
The R-INLA package The R-INLA package
Simple example
> Y
     [,1] [,2]
[1,]    1   NA
[2,]    2   NA
[3,]   NA    3
[4,]   NA    4

result = inla(Y ~ 1 + x,
              family = c("gaussian", "gaussian"),
              control.family = list(list(...), list(...)),
              data = list(Y=Y, x=x))
The R-INLA package The R-INLA package Feature: copy
Feature: copy
This fixes a limitation in the formula formulation of the model. In

    formula = y ~ f(i, ...) + ...

only ONE element from each sub-model is allowed to contribute to the linear predictor of each observation.
The R-INLA package The R-INLA package Feature: copy
Feature: copy
Suppose η_i = u_i + u_{i+1} + .... Then we can code this as

    formula = y ~ f(i, model="iid") + f(i.plus, copy="i") + ...

- Internally, an additional sub-model is created which is ε-close to the target
- Many copies are allowed
- Weighted copies: u_i + β u_{i+1}
The R-INLA package The R-INLA package Feature: copy
(Classical) Measurement error-model using copy
    y_i ∼ π(y_i | η_i),   η_i = · · · + β x_i + · · ·

where x is unknown but observed as x′, e.g. x′ = x + ε
The R-INLA package The R-INLA package Feature: copy
Example 1

    y ∼ N(µ + βx, τ⁻¹)
    x_obs ∼ N(x, κ⁻¹)

Write this as a joint model,

    y     = µ + βx + (1/√τ) ε
    x_obs =        x + (1/√κ) ε′

using two families + copy.
The R-INLA package The R-INLA package Feature: copy
Example 2
    y ∼ N(βx, τ⁻¹)
    x_obs ∼ Poisson(exp(x))
The R-INLA package The R-INLA package Feature: copy
Example 3: Preferential sampling

[Figure: two point patterns over the same latent field — preferential sampling vs random sampling]
The R-INLA package Feature: remote
Feature: remote computing
For large/huge models, it is convenient to run the computations (only!) on a remote (Linux/Mac) computational server:
inla(..., inla.call="remote")
- The computations are done in a separate program outside R:
  system("inlaprogram INPUT OUTPUT")
- It is straightforward to implement (using Cygwin on Windows):
  scp INPUT $host:
  ssh $host inlaprogram INPUT OUTPUT
  scp $host:OUTPUT .
- Really makes use of a computational server
The R-INLA package Road ahead Version 1
The road ahead-1.0
Current code
- The current code is > 100 000 lines of R/C/C++
- Written in parallel with the development...
- Some of the internal design could have been better
- Parts of the code are not easily accessible for others
- A rewrite would be nice...
The R-INLA package Road ahead Version 1
The road ahead Version 1
However, there are exciting developments within the project as well
- Non-stationary Gaussian fields using SPDEs
- Non-separable space-time Gaussian fields using SPDEs
- Flexible models (!!!)
- Add-on and overlay packages
  - Package for constructing credible regions for contours and excursions from INLA output (Bolin & Lindgren)
  - Extending the R-INLA Package for Spatial Statistics (Bivand & Gómez-Rubio)
  - Package for additive genetic models/pedigree-based models (Holand & Steinsland)
  - Package for multiple shrinkage priors with applications to RNA sequencing (Van der Wiel)
The R-INLA package Road ahead Version 1
The road ahead Version 1
- The current version supports a lot of LGMs
- There are still a number of models which we should be able to do but cannot
- The critical assumption in our definition of LGMs is
  π(y | x, θ) = ∏_i π(y_i | η_i, θ)
- In words: each observation depends on only one linear predictor.
The R-INLA package Road ahead Version 1
The road ahead Version 1
- We can "circumvent" this assumption by introducing a second layer of linear predictors in the latent field, η* = Aη for a fixed matrix A, where
  π(y | x, θ) = ∏_i π(y_i | η*_i, θ)
- This is implemented as
  r = inla(formula, control.predictor = list(A=A), ...)
- Although this is a very useful feature, it does not solve the underlying problem
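The second predictor layer is just a fixed linear map. A plain-Python sketch of η* = Aη (the particular matrix A below is an arbitrary illustration):

```python
def apply_A(A, eta):
    """Second layer of linear predictors: eta_star = A @ eta.

    Observation i then depends on eta_star[i], which may mix
    several entries of the underlying linear predictor eta.
    """
    return [sum(a_ij * e_j for a_ij, e_j in zip(row, eta)) for row in A]

# Observation 0 averages predictors 0 and 1; observation 1 uses predictor 2.
A = [[0.5, 0.5, 0.0],
     [0.0, 0.0, 1.0]]
eta_star = apply_A(A, [2.0, 4.0, 1.0])
# eta_star == [3.0, 1.0]
```

This is why the feature only partially helps: each y_i still sees a single scalar η*_i, merely one that is a fixed linear combination of the η_j.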
The R-INLA package Road ahead Version 2
The road ahead Version 2
- What is needed is a wider definition of LGMs, where the observations enter slightly differently:
  π(y | x, θ) = ∏_i π(y_i | {η_j : j ∈ S_i}, θ)
- In most cases |S_i| ∈ {1, 2, 3}, but not always
- This is a DOABLE extension of the current INLA algorithm!
- Requires a rewrite of the code
The R-INLA package Road ahead Version 2
The road ahead Version 2
With this extension we can, for example, do LGMs with
- Zero-inflated likelihoods, where the excess probability of zero depends on its own linear predictor
- Likelihoods with overdispersion, where the overdispersion also depends on a linear predictor
- Capture-recapture and distance-sampling models
- More general survival models (censoring)
- Improved models for aggregated count data
- +++
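The first bullet can be made concrete with a sketch (plain Python, standard zero-inflated Poisson parameterisation; this is not R-INLA code). Each observation depends on two linear predictors: eta1 drives the Poisson mean through a log link, eta2 the excess-zero probability through a logit link, so here S_i has size 2.

```python
import math

def zip_loglik(y, eta1, eta2):
    """Zero-inflated Poisson log-likelihood for one observation y >= 0.

    lam = exp(eta1)               # Poisson mean, log link
    p   = 1 / (1 + exp(-eta2))    # excess-zero probability, logit link
    P(y=0) = p + (1-p) * exp(-lam)
    P(y=k) = (1-p) * Poisson(k; lam)   for k > 0
    """
    lam = math.exp(eta1)
    p = 1.0 / (1.0 + math.exp(-eta2))
    if y == 0:
        return math.log(p + (1.0 - p) * math.exp(-lam))
    # log[(1-p)] + log Poisson pmf, with lgamma(y+1) = log(y!)
    return math.log(1.0 - p) + y * eta1 - lam - math.lgamma(y + 1)
```

As eta2 → -∞ the inflation probability vanishes and the expression reduces to the ordinary Poisson log-likelihood, which is the |S_i| = 1 case handled today.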
The R-INLA package Road ahead Version 2
The road ahead Version 2
- Requires an extended interface from within R; many formulas
- Each likelihood model may, for example, accept a linear predictor at several places
- My HOPE is that "SOMEONE" will do this well...
- If so, it would be the most used Bayesian software! (My view)
The R-INLA package Discussion
Discussion
- Most statistical models in use today are of LGM1/LGM2 type
- GMRF models are important for speed/memory
- The INLA approach seems to be sufficiently accurate in nearly all cases. It can be improved.
- Some exciting developments are going on
- An INLA2 for the extended class LGM2 is doable and will be hugely important for Bayesian computing
- If anyone is interested in doing INLA2, let me know...