
SLIDE 1

Sequential Point Process Model and Bayesian Inference for Spatial Point Patterns with Linear Structures

Jakob G. Rasmussen

Joint work with Jesper Møller Department of Mathematical Sciences Aalborg University Denmark

1 / 19

SLIDE 2

Outline

Data
  Dataset 1: Barrows
  Dataset 2: Mountain tops
Model
  Model construction
  Simulation algorithm
Inference
  Bayesian model: likelihood and priors
  MCMC-based parameter estimation

2 / 19

SLIDE 3

Dataset 1: Barrows

◮ A barrow is a Bronze Age burial site resembling a small hill.
◮ These are important sources of information for archaeologists.
◮ They are often placed roughly in linear structures.

3 / 19

SLIDE 4

Dataset 2: Mountain tops

◮ Mountain ridges mean that “local” tops often form linear structures.

4 / 19

SLIDE 5

Linear structures

◮ In this talk we consider a model capable of generating linear formations.
◮ Roughly speaking, this model generates linear structures by moving points closer to other points.
◮ Interpretation of the model:
  ◮ Barrows: here the model is interpreted as the dead being buried close to previously buried people.
  ◮ Mountains: no reasonable interpretation; the model should not be thought of as representing actual mechanics.

5 / 19

SLIDE 6

Outline

Data
  Dataset 1: Barrows
  Dataset 2: Mountain tops
Model
  Model construction
  Simulation algorithm
Inference
  Bayesian model: likelihood and priors
  MCMC-based parameter estimation

6 / 19

SLIDE 7

Model construction

◮ Point process x defined on a window W.
◮ x = xc ∪ xb with n points.
◮ The number of points in xc, k, is binom(n, q).
◮ Background process:
  ◮ xb consists of i.i.d. uniformly distributed points on W.
◮ Cluster process:
  ◮ Sequential construction.
  ◮ A point is initially uniformly distributed, independently of everything else.
  ◮ With probability p this point is moved closer to the closest previous point; otherwise it keeps its original position.

7 / 19

SLIDE 8

Voronoi tessellations


◮ Voronoi tessellation: an area is associated with the closest point in the point process.
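The nearest-point rule above is easy to sketch by brute force. The following minimal NumPy example is not from the talk (the point coordinates and grid size are arbitrary); it assigns each location on a grid to its closest point:

```python
import numpy as np

rng = np.random.default_rng(2)

# a small point pattern on the unit square
points = rng.uniform(0.0, 1.0, size=(5, 2))

# a grid of locations to classify
g = np.linspace(0.0, 1.0, 50)
locs = np.array([(u, v) for u in g for v in g])

# Voronoi assignment: each location is associated with its closest point
dists = np.linalg.norm(locs[:, None, :] - points[None, :, :], axis=2)
cell = dists.argmin(axis=1)  # index of the nearest point for each location
```

Coloring the locations in `locs` by `cell` reproduces a tessellation picture like the one on this slide.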

8 / 19

SLIDE 9

Moving points

[Figure: cluster points x1, x2, x3, x4 and the moved position y4]

◮ Density for the new position:

  h(xi | {x1, . . . , xi−1}; σ²) ∝ exp(−ri²/(2σ²)),  0 < ri < li

◮ Other distributions have also been tried.
9 / 19

SLIDE 10

Simulation algorithm

Fix the number of points, and for each point do the following:

1. Determine the type of point i (cluster with prob. q, background otherwise).
2. If background point:
   2.1 Find coordinates, uniformly distributed on W.
3. If cluster point:
   3.1 Find initial coordinates, uniformly distributed on W.
   3.2 Move with probability p; otherwise keep the position.
   3.3 If moving, find the closest cluster point and move the new point closer to this using the Exp-distribution.
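The algorithm can be sketched as follows. This is a hedged reconstruction, not the authors' code: it uses the truncated Gaussian density from the previous slide for the move step (rather than the Exp-distribution mentioned in 3.3), fixes the window to the unit square, and samples the truncated distance by rejection with a uniform fallback.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, q, p, sigma):
    """Sequential simulation sketch on the unit-square window W = [0,1]^2."""
    def uniform_on_W():
        return rng.uniform(0.0, 1.0, size=2)

    cluster, background = [], []
    for _ in range(n):
        if rng.random() < q:                        # 1. cluster point with prob. q
            y = uniform_on_W()                      # 3.1 initial position, uniform on W
            if cluster and rng.random() < p:        # 3.2 move with probability p
                c = min(cluster, key=lambda z: np.linalg.norm(y - z))
                l = np.linalg.norm(y - c)           # distance to the closest cluster point
                # 3.3 new distance r: half-normal truncated to (0, l), by rejection
                for _ in range(100):
                    r = abs(rng.normal(0.0, sigma))
                    if r < l:
                        break
                else:
                    r = rng.uniform(0.0, l)         # fallback if rejection stalls
                y = c + (y - c) * (r / l)           # slide y toward c along the same direction
            cluster.append(y)
        else:                                       # 2. background point
            background.append(uniform_on_W())       # 2.1 uniform on W
    return cluster, background
```

With σ a few percent of the window width (the slides quote σ = 70 on a much larger window), the cluster points line up into visible linear structures.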

10 / 19

SLIDE 11

Two simulations

[Left: q = 0.95, p = 0.95, σ = 70. Right: q = 0.90, p = 0.90, σ = 130]

Remember:

◮ q is the probability that a point is a cluster point (i.e. belongs to a linear structure).
◮ p is the probability that a cluster point is moved (i.e. continues an existing linear structure).
◮ σ governs how close points in lines are located to each other.

11 / 19

SLIDE 12

Outline

Data
  Dataset 1: Barrows
  Dataset 2: Mountain tops
Model
  Model construction
  Simulation algorithm
Inference
  Bayesian model: likelihood and priors
  MCMC-based parameter estimation

12 / 19

SLIDE 13

Likelihood and priors

◮ Let z be the observed point pattern x including the type (cluster/background) and the order of the points.
◮ Likelihood:

  L(q, p, σ²|z) = (n choose k) q^k ((1 − q)/|W|)^{n−k} ∏_{i=1}^{k} f(xi | x1, . . . , xi−1; p, σ²),

  where f(· | x1, . . . , xi−1; p, σ²) = p · h(· | {x1, . . . , xi−1}; σ²) + (1 − p) · 1/|W|.

◮ Priors:
  ◮ Independent priors for p, q, σ.
  ◮ p, q: uniform on [0, 1].
  ◮ σ: flat inverse gamma or (improper) uniform on [0, ∞).

13 / 19

SLIDE 14

Estimation of parameters

◮ Ideally we would find the mean/maximum of the posterior

  p(q, p, σ²|x) ∝ g1(p) g2(q) g3(σ²) L(q, p, σ²|x),

  but we only have a closed-form expression for L(q, p, σ²|z), not L(q, p, σ²|x).
◮ So we have a missing data problem:
  ◮ The order of xc = {x1, . . . , xk} is unknown.
  ◮ It is also unknown whether a point belongs to xc or xb.
◮ So we approximate the estimates of p, q, σ and the missing data by Markov chain Monte Carlo.

14 / 19

SLIDE 15

MCMC

◮ We use Metropolis-within-Gibbs to obtain the posterior.
◮ Updates:
  ◮ A background point becomes a cluster point.
  ◮ A cluster point becomes a background point.
  ◮ Shifting the ordering of two succeeding cluster points.
  ◮ Parameters p, q and σ²: Metropolis update with a normal proposal.
◮ Hastings ratios are easily obtained.
◮ Calculation times are not too bad.
◮ Mixing is fair.
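The parameter half of such a sampler can be sketched as plain Metropolis-within-Gibbs. The block below is a minimal illustration only: `log_post` is a placeholder target (the real one would plug in the likelihood and priors above, alongside the missing-data updates), and the starting values and step sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(p, q, sigma):
    # placeholder log-posterior; in the real sampler this would be
    # log g1(p) + log g2(q) + log g3(sigma^2) + log L(q, p, sigma^2 | z)
    if not (0.0 < p < 1.0 and 0.0 < q < 1.0 and sigma > 0.0):
        return -np.inf                      # support: p, q in (0,1), sigma > 0
    return -((p - 0.9) ** 2 + (q - 0.9) ** 2) / 0.02 - np.log(sigma) ** 2

def metropolis_within_gibbs(n_iter=2000, steps=(0.05, 0.05, 0.1)):
    p, q, sigma = 0.5, 0.5, 1.0             # arbitrary starting values
    samples = []
    for _ in range(n_iter):
        for j, s in enumerate(steps):       # update one parameter at a time
            prop = [p, q, sigma]
            prop[j] += rng.normal(0.0, s)   # normal random-walk proposal
            # symmetric proposal, so the Hastings ratio is just the posterior ratio
            if np.log(rng.random()) < log_post(*prop) - log_post(p, q, sigma):
                p, q, sigma = prop
        samples.append((p, q, sigma))
    return np.array(samples)
```

Because the proposal is a symmetric normal random walk, out-of-support proposals are rejected automatically via the −∞ log-posterior.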

15 / 19

SLIDE 16

Posterior distributions - parameters

Barrows:

[Posterior histograms: q over roughly 0.65–0.85, p over 0.65–0.75, σ over 55–75]

Mountains:

[Posterior histograms: q over roughly 0.5–1.0, p over 0.6–1.0, σ over 200–400]

16 / 19

SLIDE 17

Posterior distributions - missing data

Circle radius indicates marginal posterior probability of a point being a cluster point.

17 / 19

SLIDE 18

Posterior distributions - missing data

Circle radius indicates the order in which the cluster points occur.

18 / 19

SLIDE 19

Concluding remarks

◮ Summing up: a new model with linear structures and MCMC-based Bayesian inference.
◮ Model checking is skipped in this talk.
◮ Many extensions/modifications are possible, e.g. inclusion of covariates, initial placements or moving mechanisms.

Thank you for your attention :-)

19 / 19