

  1. Sequential Point Process Model and Bayesian Inference for Spatial Point Patterns with Linear Structures. Jakob G. Rasmussen, joint work with Jesper Møller. Department of Mathematical Sciences, Aalborg University, Denmark.

  2. Outline: Data (Dataset 1: Barrows; Dataset 2: Mountain tops), Model (Model construction; Simulation algorithm), Inference (Bayesian model: likelihood and priors; MCMC-based parameter estimation).

  3. Dataset 1: Barrows ◮ A barrow is a Bronze Age burial site resembling a small hill. ◮ Barrows are an important source of information for archaeologists. ◮ They are often placed roughly in linear structures.

  4. Dataset 2: Mountain tops ◮ Mountain ridges mean that “local” tops often form linear structures.

  5. Linear structures ◮ In this talk we consider a model capable of generating linear formations. ◮ Roughly speaking, the model generates linear structures by moving points closer to other points. ◮ Interpretation of the model: ◮ Barrows: the model is interpreted as new burials being placed close to earlier burials. ◮ Mountain tops: no reasonable interpretation; the model should not be thought of as representing the actual mechanism.

  6. Outline: Data (Dataset 1: Barrows; Dataset 2: Mountain tops), Model (Model construction; Simulation algorithm), Inference (Bayesian model: likelihood and priors; MCMC-based parameter estimation).

  7. Model construction ◮ Point process x defined on a window W. ◮ x = x_c ∪ x_b with n points in total. ◮ The number of points in x_c, k, is binom(n, q). ◮ Background process: x_b consists of i.i.d. uniformly distributed points on W. ◮ Cluster process: sequential construction. A point is initially uniformly distributed, independently of everything else. With probability p this point is moved closer to the closest previous point; otherwise it keeps its original position.

  8. Voronoi tessellations [Figure: Voronoi tessellation of a point pattern on the unit square.] ◮ Voronoi tessellation: each cell is the area associated with the closest point of the point process.

  9. Moving points [Figure: points x_1, …, x_4 and the moved position y_4.] ◮ Density for the new position: h(x_i | {x_1, …, x_{i−1}}; σ²) ∝ exp(−r_i² / (2σ²)), 0 < r_i < l_i. ◮ Other distributions have also been tried.
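A worked detail, not stated on the slide: reading the move as changing only the distance r_i to the closest previous point (as suggested by the figure), and writing l_i for the truncation bound above, the normalized one-dimensional density of the new distance would be h(r_i | σ²) = exp(−r_i²/(2σ²)) / ∫₀^{l_i} exp(−t²/(2σ²)) dt = exp(−r_i²/(2σ²)) / (σ √(π/2) erf(l_i/(σ√2))), for 0 < r_i < l_i.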

  10. Simulation algorithm. Fix the number of points, and for each point do the following (a minimal code sketch is given after this slide):
     1. Determine the type of point i (cluster with probability q, background otherwise).
     2. If background point: 2.1 draw its coordinates uniformly on W.
     3. If cluster point: 3.1 draw initial coordinates uniformly on W; 3.2 move with probability p, otherwise keep the position; 3.3 if moving, find the closest cluster point and move the new point closer to it using the Exp-distribution.
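A minimal Python sketch of this simulation algorithm (not from the talk): it assumes W is the unit square and uses the truncated half-normal movement density from slide 9; the function name and all defaults are illustrative.

```python
import numpy as np

def simulate(n, q, p, sigma, rng=None):
    """Sequential simulation of the cluster/background model on W = [0, 1]^2 (illustrative)."""
    rng = np.random.default_rng() if rng is None else rng
    cluster, background = [], []
    for _ in range(n):
        pos = rng.uniform(0.0, 1.0, size=2)          # steps 2.1 / 3.1: uniform on W
        if rng.random() >= q:                        # step 1: background with probability 1 - q
            background.append(pos)
            continue
        # cluster point: move with probability p (needs at least one previous cluster point)
        if cluster and rng.random() < p:
            prev = np.asarray(cluster)
            dists = np.linalg.norm(prev - pos, axis=1)
            j = int(np.argmin(dists))                # step 3.3: closest previous cluster point
            l = dists[j]
            # new distance r in (0, l) with density proportional to exp(-r^2 / (2 sigma^2));
            # simple rejection sampling from the half-normal is good enough for a sketch
            r = abs(rng.normal(0.0, sigma))
            while r >= l:
                r = abs(rng.normal(0.0, sigma))
            pos = prev[j] + (pos - prev[j]) * r / l  # keep the direction, shrink the distance
        cluster.append(pos)
    return np.array(cluster), np.array(background)
```

For instance, a pattern resembling the simulations on the next slide could be drawn with simulate(n, 0.95, 0.95, sigma) for a suitable number of points n and a σ rescaled to the unit square (the slide's σ = 70 and σ = 130 presumably refer to the data's coordinate units).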

  11. Two simulations: q = 0.95, p = 0.95, σ = 70 and q = 0.90, p = 0.90, σ = 130. Remember: ◮ q is the probability that a point is a cluster point (i.e. belongs to a linear structure). ◮ p is the probability that a cluster point is moved (i.e. continues an existing linear structure). ◮ σ governs how close points in lines are located to each other.

  12. Outline: Data (Dataset 1: Barrows; Dataset 2: Mountain tops), Model (Model construction; Simulation algorithm), Inference (Bayesian model: likelihood and priors; MCMC-based parameter estimation).

  13. Likelihood and priors ◮ Let z be the observed point pattern x together with the type (cluster/background) and the order of the points. ◮ Likelihood: L(q, p, σ² | z) = binom(n, k) q^k ((1 − q)/|W|)^(n−k) ∏_{i=1}^{k} f(x_i | x_1, …, x_{i−1}; p, σ²), where f(· | x_1, …, x_{i−1}; p, σ²) = p · h(· | {x_1, …, x_{i−1}}; σ²) + (1 − p) · 1/|W|. ◮ Priors: independent priors for p, q, σ; p, q: uniform on [0, 1]; σ: flat inverse gamma or (improper) uniform on [0, ∞).
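A sketch of the log of this likelihood in Python, under the assumptions that z is supplied as an ordered array of cluster points plus an array of background points, and that the movement density h from slide 9 is provided by the caller; the first cluster point is treated as uniform on W, which the slide does not state explicitly.

```python
import numpy as np
from math import comb, log

def log_likelihood(cluster, background, q, p, sigma2, area_W, log_h):
    """log L(q, p, sigma^2 | z); `cluster` is ordered as x_1, ..., x_k.

    log_h(x_i, previous, sigma2) must return the log of the movement density
    h(x_i | {x_1, ..., x_{i-1}}; sigma^2) from slide 9 (user-supplied stub).
    """
    k, n = len(cluster), len(cluster) + len(background)
    # binom(n, k) q^k ((1 - q) / |W|)^(n - k)
    ll = log(comb(n, k)) + k * log(q) + (n - k) * (log(1.0 - q) - log(area_W))
    for i, x_i in enumerate(cluster):
        if i == 0:
            # first cluster point: nothing to move towards, assumed uniform on W
            ll += -log(area_W)
        else:
            # f(x_i | x_1, ..., x_{i-1}) = p * h(x_i | ...) + (1 - p) / |W|
            h_val = np.exp(log_h(x_i, cluster[:i], sigma2))
            ll += log(p * h_val + (1.0 - p) / area_W)
    return ll
```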

  14. Estimation of parameters ◮ Ideally we would find the mean/maximum of the posterior p(q, p, σ² | x) ∝ g_1(p) g_2(q) g_3(σ²) L(q, p, σ² | x), but we only have a closed-form expression for L(q, p, σ² | z), not for L(q, p, σ² | x). ◮ So we have a missing data problem: the order of x_c = {x_1, …, x_k} is unknown, and it is also unknown whether a point belongs to x_c or x_b. ◮ We therefore approximate the estimates of p, q, σ and the missing data by Markov chain Monte Carlo.

  15. MCMC ◮ We use Metropolis within Gibbs to obtain the posterior. ◮ Updates: ◮ A background point becomes a cluster point. ◮ A cluster point becomes a background point. ◮ Swapping the order of two successive cluster points. ◮ Parameters p, q and σ²: Metropolis update with a normal proposal. ◮ Hastings ratios are easily obtained. ◮ Calculation times are not too bad. ◮ Mixing is fair.
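As an illustration of one of the parameter updates (a sketch, not the authors' code): a random-walk Metropolis step for σ² with a normal proposal, with q, p, the type labels and the ordering held fixed as in a Metropolis-within-Gibbs sweep; the function names are assumptions.

```python
import numpy as np

def update_sigma2(sigma2, log_target, step, rng):
    """One Metropolis step for sigma^2 with a normal random-walk proposal.

    log_target(sigma2) should return log prior + log likelihood as a function
    of sigma^2 alone, with the other parameters and the missing data fixed.
    """
    proposal = sigma2 + rng.normal(0.0, step)
    if proposal <= 0.0:
        return sigma2                            # outside the support: reject
    # symmetric proposal, so the Hastings ratio reduces to the target ratio
    log_ratio = log_target(proposal) - log_target(sigma2)
    if np.log(rng.random()) < log_ratio:         # accept with prob. min(1, ratio)
        return proposal
    return sigma2
```

Because the random-walk proposal is symmetric, the proposal densities cancel and the Hastings ratio is just the ratio of posterior densities, which is one reason the ratios are "easily obtained" here.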

  16. Posterior distributions - parameters [Figure: marginal posterior histograms of q, p and σ. Barrows: approximate axis ranges q 0.65–0.85, p 0.65–0.8, σ 55–75. Mountains: approximate axis ranges q 0.5–1.0, p 0.6–1.0, σ 200–400.]

  17. Posterior distributions - missing data Circle radius indicates marginal posterior probability of a point being a cluster point.

  18. Posterior distributions - missing data Circle radius indicates the order in which the cluster points occur.

  19. Concluding remarks ◮ Summing up: a new model with linear structures and MCMC-based Bayesian inference. ◮ Model checking skipped in this talk. ◮ Many extensions/modifications are possible, e.g. inclusion of covariates, other initial placements, or other moving mechanisms. Thank you for your attention :-)
