
National Research Center for Environment & Health, Institute of Biomathematics & Biometry, Mathematical Modelling in Ecology and the Biosciences. Stochastic Simulation and Bayesian Inference for Gibbs Fields: The Software Package AntsInFields


  1. National Research Center for Environment & Health, Institute of Biomathematics & Biometry, Mathematical Modelling in Ecology and the Biosciences. Stochastic Simulation and Bayesian Inference for Gibbs Fields: The Software Package AntsInFields. Felix Friedrich, Leipzig, 6.12.2002

  2. The Software Package AntsInFields
AntsInFields is a software package for simulation of and statistical inference on Gibbs fields. It is intended for two main purposes: to support teaching by demonstrating well-known sampling and estimation techniques, and to assist research. AntsInFields is available for download from http://www.antsinfields.de; contact: friedrich@antsinfields.de

  3. History
1995–1998: development of Voyager, a portable and extensible system for simulation and data analysis.
1998: diploma thesis 'Parameter estimation on Gibbs fields in the context of statistical image analysis'. Need for implementation of
• samplers for various Gibbs fields (Ising model and extensions)
• parameter estimators on simulated or external data.
Today: AntsInFields, software for simulation of and statistical inference on Gibbs fields.

  4. Aims / Requirements
Aims:
• support teaching
• assist research
Requirements:
• (really) easy to handle
• interactive visualization, which turned out to be an efficient tool for teaching
• flexibility for research, testing of new techniques, etc.
• extensibility for implementing new samplers, etc.

  5. Realization
• A strongly object-oriented concept allows an implementation close to the mathematical structure: intuitive and self-explanatory, with easy implementation of interaction and consistent visualization.
• Modular design for extensibility and reusability.
• A command language for flexibility on an intermediate level.
AntsInFields is written in Oberon System 3 (ETH Zürich; N. Wirth, Gutknecht, H. Marais, E. Zeller et al.). Oberon is also an operating system: it runs on bare PC hardware or as an emulation on Windows, MacOS, Linux, ... (portability).
AntsInFields uses the Voyager extension (University of Heidelberg; G. Sawitzki, M. Diller, F. Friedrich et al.).

  6. Scope
AntsInFields contains
• handling and visualization of 1D, 2D and 3D data
• Gibbs and Metropolis-Hastings algorithms, simulated annealing, exact sampling (CFTP)
• Bayesian image reconstruction methods
• parameter estimators on Gibbs fields
• ...
AntsInFields accompanies the 2nd edition of G. Winkler's book 'Image Analysis, Random Fields and Dynamic Monte Carlo Methods', Springer Verlag.
In progress: a meta compiler (alpha) for easy implementation of new models.
Planned: an interface to R (both command language and procedure calls).
Demonstration: look and feel, commands, panels, random numbers.

  7. Bayesian Image Restoration

  8. Random Fields: Notation
E: finite space of states, e.g. {•, •}
S ⊂ Z^d: finite index set
x = (x_s)_{s∈S} ∈ E^S: a configuration
X := E^S: the space of configurations

  9. Gibbs Fields
We consider neighbour Gibbs fields
Π(x) = exp(−H(x)) / Σ_{y∈X} exp(−H(y)),
where H is of the form
H(x) = Σ_{s∈S} f(x_s, x_{∂(s)})
with ∂(t) some neighbourhood of t, t ∈ S.
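For state spaces small enough to enumerate, Π can be computed by brute force, which is a useful check on any sampler. A minimal sketch in Python (the package itself is written in Oberon; the three-site chain and its energy below are a made-up toy example):

```python
import itertools
import numpy as np

def gibbs_distribution(H, n_sites, states):
    """Enumerate all configurations x in E^S and normalize exp(-H(x))."""
    configs = list(itertools.product(states, repeat=n_sites))
    weights = np.array([np.exp(-H(np.array(x))) for x in configs])
    return configs, weights / weights.sum()

# Toy energy: 1D chain of 3 sites, E = {-1, 1}, nearest-neighbour coupling,
# beta = 1, h = 0.
def H(x):
    return -1.0 * np.sum(x[:-1] * x[1:])

configs, probs = gibbs_distribution(H, n_sites=3, states=(-1, 1))
assert abs(probs.sum() - 1.0) < 1e-12
# The fully aligned configurations carry the largest probability mass.
```

With 2^3 = 8 configurations this is exact; the exponential growth of |X| in the number of sites is precisely why the later slides turn to MCMC.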

  10. Ising Model
Easiest nontrivial case: E = {−1, 1}, nearest neighbours and isotropy. An Ising model with parameters β, h ∈ R is a Gibbs field with energy
H(x) = −( β Σ_{s∼t} x_s x_t + h Σ_s x_s )
h: global tendency to take value 1
β: tendency of neighbours to be alike.
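The Ising energy above is easy to evaluate directly on a finite grid. An illustrative Python sketch (free boundary conditions and the grid size are choices made here):

```python
import numpy as np

def ising_energy(x, beta, h):
    """H(x) = -( beta * sum_{s~t} x_s x_t + h * sum_s x_s ),
    nearest neighbours on a 2D grid with free boundary."""
    pairs = np.sum(x[:-1, :] * x[1:, :]) + np.sum(x[:, :-1] * x[:, 1:])
    return -(beta * pairs + h * np.sum(x))

x = np.ones((4, 4), dtype=int)                   # fully aligned configuration
assert ising_energy(x, beta=1.0, h=0.0) == -24   # 2 * 4 * 3 = 24 n.n. pairs
```

Flipping a single spin raises the energy, which is what makes the aligned states the modes of the field for β > 0.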

  11. Problems with Sampling
P(X = x) = exp(−H(x)) / Σ_{y∈X} exp(−H(y)) is intractable, but
P(X_t = x_t | X_s = x_s, s ≠ t) = exp( x_t (h + β Σ_{s∈∂(t)} x_s) ) / ( 2 cosh( h + β Σ_{s∈∂(t)} x_s ) )
is easy to calculate.
Solution: MCMC techniques like
• Gibbs sampler
• Metropolis-Hastings algorithm
• exact sampling
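The local characteristic above involves only a site's neighbours, so it costs O(1) to evaluate. A Python sketch (the function name and the free-boundary convention are choices made here, not part of the package):

```python
import numpy as np

def local_prob_plus(x, s, beta, h):
    """P(X_s = +1 | neighbours) = exp(a) / (2*cosh(a)), where
    a = h + beta * (sum of the nearest neighbours), free boundary."""
    i, j = s
    nb = 0
    if i > 0:              nb += x[i - 1, j]
    if i < x.shape[0] - 1: nb += x[i + 1, j]
    if j > 0:              nb += x[i, j - 1]
    if j < x.shape[1] - 1: nb += x[i, j + 1]
    a = h + beta * nb
    return np.exp(a) / (2.0 * np.cosh(a))

x = np.ones((3, 3), dtype=int)
p = local_prob_plus(x, (1, 1), beta=0.5, h=0.0)  # four +1 neighbours, a = 2
assert abs(p - np.exp(2.0) / (2.0 * np.cosh(2.0))) < 1e-12
```

Note the normalizer 2 cosh(a) = e^a + e^{-a} sums the weights of the two states ±1, so no global partition function is needed.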

  12. Markov Chains
An irreducible Markov chain with a stationary distribution µ is ergodic and fulfills
(a) P^t(i, j) → µ(j) as t → ∞
(b) f̄_n → E_µ(f(X)) in L² as n → ∞, provided E_µ(f(X)²) < ∞, where f̄_n = (1/n) Σ_{i=1,...,n} f(x_i).
Demonstration: Reflected Markov Chain
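Properties (a) and (b) can be checked numerically on a made-up two-state chain whose stationary distribution is known in closed form; a Python sketch (transition matrix, seed and run length are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # irreducible two-state chain
mu = np.array([2.0, 1.0]) / 3.0     # stationary: mu P = mu

state, visits = 0, np.zeros(2)
for _ in range(200_000):
    visits[state] += 1
    state = 0 if rng.random() < P[state, 0] else 1

# Empirical occupation frequencies approach mu, illustrating (b)
# for the indicator functions f = 1_{j}.
assert np.allclose(visits / visits.sum(), mu, atol=0.01)
```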

  13. Algorithm for the Gibbs Sampler
1. 0 ↦ n
2. Sample x(0) from an initial distribution, say the uniform distribution on X.
3. Apply K_t to x(n) for all t ∈ S, i.e. sample from the local characteristics of Π in each point.
4. Copy x(n) to x(n+1).
5. n + 1 ↦ n
6. Return to step 3 until close enough to Π.
The algorithm is a realization of a Markov chain with stationary distribution Π.
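Specialized to the Ising model, steps 1–6 can be sketched as follows in Python (lattice size, β, and the stopping rule "fixed number of sweeps" are illustrative choices; the package's own stopping criteria may differ):

```python
import numpy as np

def gibbs_sweep(x, beta, h, rng):
    """One sweep: visit every site and resample it from its local
    characteristic exp(a)/(2*cosh(a)), a = h + beta * neighbour sum."""
    n, m = x.shape
    for i in range(n):
        for j in range(m):
            nb = (x[i - 1, j] if i > 0 else 0) + (x[i + 1, j] if i < n - 1 else 0) \
               + (x[i, j - 1] if j > 0 else 0) + (x[i, j + 1] if j < m - 1 else 0)
            a = h + beta * nb
            p_plus = np.exp(a) / (2.0 * np.cosh(a))
            x[i, j] = 1 if rng.random() < p_plus else -1
    return x

rng = np.random.default_rng(1)
x = rng.choice([-1, 1], size=(32, 32))        # step 2: uniform initial state
for _ in range(100):                          # steps 3-6: iterate sweeps
    x = gibbs_sweep(x, beta=0.7, h=0.0, rng=rng)

# At beta = 0.7 (above the critical value) neighbours mostly agree.
agree = (np.mean(x[:-1, :] == x[1:, :]) + np.mean(x[:, :-1] == x[:, 1:])) / 2
assert agree > 0.8
```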

  14. A sweep
[Figure: a lattice of sites, all of which are visited during one sweep.]

  15.–21. Visiting schedule
[Figure sequence: one site of the lattice after another is marked as the visiting schedule selects it for updating.]

  22. Visiting schedule: whole sweep finished
[Figure: after a whole sweep, every site of the lattice has been visited.]

  23. Demonstrations
• Ising model, Gibbs sampler
• Ising model + channel noise, MMSE:
H(x, y) = −( β Σ_{s∼t} x_s x_t + h Σ_s x_s + (1/2) ln((1−p)/p) Σ_s x_s y_s )
• cooling schemes, simulated annealing, ICM
• grey-valued 'Ising model' (Potts and others):
H(x) = β Σ_{s∼t} ϕ(x_s, x_t) + h Σ_s x_s
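The "Ising model + channel noise" demonstration can be mimicked in a few lines: flip each pixel independently with probability p and evaluate the posterior energy H(x, y). A Python sketch (image, parameters and seed are made up for illustration):

```python
import numpy as np

def posterior_energy(x, y, beta, h, p):
    """H(x, y) = -( beta*sum_{s~t} x_s x_t + h*sum_s x_s
                    + 0.5*log((1-p)/p) * sum_s x_s y_s )."""
    pairs = np.sum(x[:-1, :] * x[1:, :]) + np.sum(x[:, :-1] * x[:, 1:])
    data = 0.5 * np.log((1 - p) / p) * np.sum(x * y)
    return -(beta * pairs + h * np.sum(x) + data)

rng = np.random.default_rng(2)
x = np.ones((8, 8), dtype=int)                 # "true" binary image
flips = rng.random(x.shape) < 0.2              # channel noise, p = 0.2
y = np.where(flips, -x, x)                     # observed noisy image

# For p < 1/2 the data term favours x agreeing with y: the true image
# has lower posterior energy than its global inverse.
assert posterior_energy(x, y, 0.5, 0.0, 0.2) < posterior_energy(-x, y, 0.5, 0.0, 0.2)
```

Sampling from exp(−H(x, y)) with the Gibbs sampler and averaging the samples then yields the MMSE estimate.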

  24.
• Sampling from arbitrary posterior distributions:
H(y, x) = β Σ_{s∼t} ϕ(x_s, x_t) + h Σ_s x_s + Σ_s ϑ(x_s, y_s)
• Φ-model, texture synthesis:
H(x) = Σ_i β_i Σ_{s∼_i t} ϕ(x_s, x_t) + h Σ_s x_s

  25. [Figure]

  26. Estimating (Hyper-)Parameters
Assume observations on Λ̄ = Λ ∪ ∂Λ. The (conditional) maximum likelihood estimator (MLE) is
θ̂_n := arg min_θ − log P_θ( X_t = x_t, t ∈ Λ | X_s = x_s, s ∈ ∂Λ )
      = arg min_θ ( log Z_Λ(x_∂Λ) + H_Λ(x_Λ | x_∂Λ) )
Problem: Z_Λ is not computable.
Solution: subsampling method (Younes (1988), Winkler (2001)).
Alternative approach: estimators regarding only the conditional distributions P_θ( X_t = x_t | X_s, s ∈ ∂(t) ), like: coding, maximum pseudolikelihood, minimal least squares, minimum chi-square estimator, etc.

  27. Coding Estimator (CE)
With Λ⁺ = { t ∈ Λ : t_1 + ... + t_d even }, the MLE on Λ⁺ becomes
θ̂_n = arg min_θ − log P( X_t = x_t, t ∈ Λ⁺ | X_s = x_s, s ∈ ∂Λ⁺ )
     = arg min_θ − Σ_{t∈Λ⁺} log P_θ( X_t = x_t | X_s, s ∈ ∂(t) ).
Maximum Pseudolikelihood Estimator (MPLE)
Coding estimator with the replacement Λ⁺ → Λ:
θ̂_n = arg min_θ − Σ_{t∈Λ} log P_θ( X_t = x_t | X_s, s ∈ ∂(t) ).
Demonstration: Estimating Parameters
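For the Ising model with h = 0 the MPLE amounts to minimizing a sum of terms −x_t a_t + log(2 cosh(a_t)) with a_t = β times the neighbour sum. A Python sketch using a plain grid search (the data pattern and the search grid are made up for illustration; a proper fit would use a sampled configuration and a numerical optimizer):

```python
import numpy as np

def neg_log_pseudolikelihood(x, beta):
    """-sum_t log P_beta(X_t = x_t | neighbours), Ising model with h = 0."""
    padded = np.pad(x, 1)                     # free boundary: zeros outside
    nb = (padded[:-2, 1:-1] + padded[2:, 1:-1]
        + padded[1:-1, :-2] + padded[1:-1, 2:])
    a = beta * nb
    # log P(X_t = x_t | .) = x_t * a_t - log(2*cosh(a_t))
    return -np.sum(x * a - np.log(2.0 * np.cosh(a)))

# Made-up data: a mostly aligned field with a few disagreeing sites,
# so the MPLE is a finite positive beta.
x = np.ones((16, 16), dtype=int)
x[::4, ::4] = -1

betas = np.linspace(0.0, 2.0, 201)
mple = betas[np.argmin([neg_log_pseudolikelihood(x, b) for b in betas])]
assert 0.0 < mple < 2.0
```

Unlike the MLE, each term uses only the tractable local characteristic, so no partition function Z_Λ appears.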
