Statistical image segmentation with Bayesian approach



  1. Statistical image segmentation with Bayesian approach. Tapio Helin, Institute of Mathematics, Helsinki University of Technology. Workshop on Inverse and Partial Information Problems: Methodology and Applications, Linz, October 28, 2008. In collaboration with Professor Matti Lassas, Helsinki University of Technology.

  2. Outline: 1. Introduction; 2. Hierarchical stochastic model; 3. Conclusions.

  3. Outline: 1. Introduction; 2. Hierarchical stochastic model; 3. Conclusions.

  4. Hierarchical priors: Suppose some parameter of the prior distribution is unknown, or it is preferable not to approximate it by a fixed value. The prior is called hierarchical if these parameters are themselves modelled as random variables. One way to think about it: suppose the distribution of $U$ depends on $V$; then
$$M = AU + E \quad\Rightarrow\quad M = \begin{pmatrix} A & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} U \\ V \end{pmatrix} + E.$$
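As an illustration, here is a minimal sketch of sampling from a two-level prior: draw the hyperparameter $V$ first, then $U$ conditionally on it, and finally data $M = AU + E$. The dimensions, the log-normal hyperprior, and all parameter values are illustrative assumptions, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                    # hypothetical discretization size

# Level 1: draw the hyperparameter V (a scalar variance; log-normal so V > 0).
v = np.exp(rng.normal(0.0, 1.0))

# Level 2: draw U | V = v from N(0, v I).
u = rng.normal(0.0, np.sqrt(v), size=n)

# Data: M = AU + E with A = I and white noise E.
sigma = 0.1
m = u + rng.normal(0.0, sigma, size=n)
```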

  5. Data segmentation: In 1984 S. Geman and D. Geman proposed a statistical approach to image denoising. Let $U$, $M$ and $E$ be $\mathbb{R}^n$-valued random variables such that $M = AU + E$, where $E \sim N(0, \sigma^2 I)$. They introduced a $\{0,1\}^n$-valued random variable $L$ that describes the edge set. The a priori distribution for the pair $(U, L)$ was
$$\pi_{\mathrm{pr}}(u, \ell) \propto \exp\Big( -\sum_i \big[ \alpha (1 - \ell_{i+\frac12})(u_{i+1} - u_i)^2 + \beta\, \ell_{i+\frac12} \big] \Big).$$

  6. Data segmentation: In this setting the posterior distribution is $\pi(u, \ell \mid m) \propto \exp(-E(u, \ell, m))$, where the free energy $E(u, \ell, m)$ is given by
$$E(u, \ell, m) = \sum_i \Big[ \alpha (1 - \ell_{i+\frac12})(u_{i+1} - u_i)^2 + \frac{1}{2\sigma^2} \big((Au)_i - m_i\big)^2 + \beta\, \ell_{i+\frac12} \Big].$$
Geman and Geman proposed maximizing $\pi_{\mathrm{post}}$, i.e., computing the MAP estimate.
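Since $E(u, \ell, m)$ is separable in $\ell$ for fixed $u$, the minimization over the edge variables can be done termwise: an edge at $i + \frac12$ is switched on exactly when paying the penalty $\beta$ is cheaper than paying the smoothness cost $\alpha(u_{i+1} - u_i)^2$. A minimal sketch in 1-D, assuming $A = I$ for brevity and storing the edges as a length $n-1$ array:

```python
import numpy as np

def free_energy(u, ell, m, alpha, beta, sigma):
    """Discrete Geman-Geman free energy E(u, ell, m) in 1-D with A = I."""
    du2 = np.diff(u) ** 2                       # (u_{i+1} - u_i)^2
    prior = np.sum(alpha * (1 - ell) * du2 + beta * ell)
    data = np.sum((u - m) ** 2) / (2 * sigma ** 2)
    return prior + data

def optimal_edges(u, alpha, beta):
    """For fixed u, minimize E over ell in {0,1}^(n-1) termwise:
    an edge removes the cost alpha*(du)^2 at the price beta."""
    return (alpha * np.diff(u) ** 2 > beta).astype(float)
```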

  7. Mumford-Shah functional: In 1989 Mumford and Shah presented the following continuous version of Geman and Geman's method:
$$\arg\min_{u, K} \int_{T \setminus K} |Du|^2 \, dx + \sharp(K) + \int_T |u - m|^2 \, dx, \tag{1}$$
where $K$ is the "discontinuity set" or "jump set" of a piecewise regular function $u(x)$.

  8. Ambrosio-Tortorelli: Ambrosio and Tortorelli proposed approximating the Mumford-Shah functional by the elliptic functionals
$$F_\epsilon(u, v) = \int_T (v^2 + \epsilon^2)|Du|^2 + \epsilon |Dv|^2 + \frac{(1 - v)^2}{4\epsilon} + |Au - m|^2 \, dx.$$
Now $F_\epsilon$ $\Gamma$-converges in the $L^1(T) \times L^1(T)$ topology to the weak formulation of the Mumford-Shah functional:
$$F(u, v) = \int_{T \setminus S_u} |Du|^2 \, dx + \sharp(S_u) + \int_T |Au - m|^2 \, dx$$
for $u \in SBV(T)$ and $v(x) = 1$ almost everywhere; otherwise $F(u, v) = \infty$.

  9. Ambrosio-Tortorelli: Minimizing
$$\int_T (v^2 + \epsilon^2)|Du|^2 + \epsilon |Dv|^2 + \frac{(1 - v)^2}{4\epsilon} + |u - m|^2 \, dx$$
yields: [figure: data $m$, reconstruction $u$, edge indicator $v$]
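$F_\epsilon$ is quadratic in $u$ for fixed $v$ and quadratic in $v$ for fixed $u$, so one natural way to compute such a minimizer (a generic scheme, not necessarily the one used in the talk) is to alternate between two symmetric positive definite linear solves. A minimal 1-D sketch for the denoising case above, with $u$ on the $n$ grid points and $v$ on the $n-1$ cell edges; the discretization and parameter values are illustrative:

```python
import numpy as np

def at_minimize(m, eps=0.05, iters=50):
    """Alternating minimization of a discrete 1-D Ambrosio-Tortorelli functional
    sum (v^2+eps^2)(Du)^2 + eps (Dv)^2 + (1-v)^2/(4 eps) + (u-m)^2."""
    n = len(m)
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]        # (n-1) x n forward differences
    Dv = np.eye(n-1, k=1)[:-1] - np.eye(n-1)[:-1]   # differences of v
    u, v = np.asarray(m, float).copy(), np.ones(n - 1)
    for _ in range(iters):
        # u-step: minimize over u  =>  (D^T diag(v^2+eps^2) D + I) u = m
        A = D.T @ np.diag(v**2 + eps**2) @ D + np.eye(n)
        u = np.linalg.solve(A, m)
        # v-step: minimize over v  =>
        # (2 diag((Du)^2) + 2 eps Dv^T Dv + I/(2 eps)) v = 1/(2 eps)
        du2 = (D @ u) ** 2
        B = 2 * np.diag(du2) + 2 * eps * (Dv.T @ Dv) + np.eye(n - 1) / (2 * eps)
        v = np.linalg.solve(B, np.full(n - 1, 1.0 / (2 * eps)))
    return u, v
```

Each subproblem is strictly convex, so this scheme decreases $F_\epsilon$ monotonically, although the joint functional is nonconvex and a global minimizer is not guaranteed.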

  10. Ambrosio-Tortorelli: If
$$\arg\min_{u, v \in H^1(T)} \int_T (v^2 + \epsilon^2)|Du|^2 + \epsilon |Dv|^2 + \frac{(1 - v)^2}{4\epsilon} + |Au - m|^2 \, dx$$
is what you want to happen qualitatively, how do you do it?

  11. Outline: 1. Introduction; 2. Hierarchical stochastic model; 3. Conclusions.

  12. Motivation: Suppose you are given a linear inverse problem $M = AU + E$ in $\mathbb{R}^n$ with a hierarchical prior such that
$$U \mid V = v \sim N(u_0, C_U(v)) \quad\text{and}\quad V \sim N(v_0, C_V),$$
and $E$ is white noise.

  13. Motivation: Then $\pi_{\mathrm{post}}(u, v \mid m) \propto \exp(-\frac12 E(u, v))$ with a free energy of the form
$$E(u, v) = \log\det C_U(v) + \langle u - u_0, C_U(v)^{-1}(u - u_0) \rangle + \langle v - v_0, C_V^{-1}(v - v_0) \rangle + \|Au - m\|^2.$$
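Numerically, $E(u, v)$ is convenient to evaluate through the precision matrix $P(v) = C_U(v)^{-1}$, since both the log-determinant and the first quadratic form are then direct. A minimal sketch; the callable `prec_U` and all argument names are illustrative placeholders:

```python
import numpy as np

def free_energy(u, v, m, u0, v0, A, prec_U, prec_V):
    """E(u,v) = log det C_U(v) + <u-u0, C_U(v)^{-1}(u-u0)>
              + <v-v0, C_V^{-1}(v-v0)> + ||Au - m||^2,
    with prec_U a callable returning the precision matrix P(v) = C_U(v)^{-1}
    and prec_V the (fixed) precision matrix C_V^{-1}."""
    P = prec_U(v)
    _, logdet_P = np.linalg.slogdet(P)
    du, dv = u - u0, v - v0
    return (-logdet_P                      # log det C_U(v) = -log det P(v)
            + du @ (P @ du)
            + dv @ (prec_V @ dv)
            + np.sum((A @ u - m) ** 2))
```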

  14. More motivation: Put $u_0 = 0$, $v_0 = 1$,
$$C_U(v) = \big(-D_n (\epsilon^2 + v^2) D_n\big)^{-1} \quad\text{and}\quad C_V = \Big(\tfrac{1}{4\epsilon}\,\mathrm{Id} - \epsilon \Delta_n\Big)^{-1}.$$
Then the free energy is close to the Ambrosio-Tortorelli functional; in fact,
$$E(u, v) = \log\det C_U(v) + \langle u, -D_n(\epsilon^2 + v^2) D_n u \rangle + \Big\langle v - 1, \Big(\tfrac{1}{4\epsilon}\,\mathrm{Id} - \epsilon \Delta_n\Big)(v - 1) \Big\rangle + \|Au - m\|^2 = -\int_T n \log(\epsilon^2 + v^2) \, dx + F_\epsilon(u, v).$$

  15. Things to consider: One could ask, e.g., (i) is the corresponding $L^2$-valued model well-defined? (ii) how to discretize invariantly? (iii) how does the posterior distribution behave asymptotically? (iv) are we still doing what we were supposed to?

  16. About the MAP estimate: Consider the minimization problem
$$\min_{u, v \in H^1(T) \cap X_n} \int_T -n \log(\epsilon^2 + v^2) + (\epsilon^2 + v^2)|D_n u|^2 + \frac{1}{4\epsilon}(1 - v)^2 + \epsilon |D_n v|^2 + |Au - m|^2 \, dx.$$
By letting $u = 0$ and $v = 1$, the integrand reduces to $-n \log(\epsilon^2 + 1) + |m|^2$, so the minimum value decreases without bound as $n$ gets larger. We also see that the minimizers diverge.

  17. Definition of V: To simplify notation, assume that the probability space has the product structure $\Omega = \Omega_1 \times \Omega_2$, $\Sigma = \Sigma_1 \times \Sigma_2$ and $P = P_1 \times P_2$. Define $V$ as a Gaussian random variable on $L^2(T)$ with mean $v_0 = 1$ and covariance operator $C_V = \big(\tfrac{1}{4\epsilon}\,\mathrm{Id} - \epsilon \Delta\big)^{-1}$, and assume that $V = V(\omega_2)$. Lemma: For $0 < \alpha < \frac12$ we have $V \in C^{0,\alpha}(T)$ almost surely.
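If $T$ is the unit circle (an assumption here), $C_V$ is diagonalized by the Fourier basis, where $-\Delta$ has eigenvalues $(2\pi k)^2$, so $V$ can be sampled directly from its Karhunen-Loeve expansion. A minimal sketch; grid size and truncation level are illustrative:

```python
import numpy as np

def sample_V(n_grid=256, eps=0.05, K=128, seed=0):
    """Draw V ~ N(1, C_V) on T = [0,1) with C_V = (Id/(4 eps) - eps Laplacian)^{-1},
    using the Fourier eigenpairs of -Laplacian: eigenvalues (2 pi k)^2."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_grid, endpoint=False)
    lam = lambda k: 1.0 / (1.0 / (4 * eps) + eps * (2 * np.pi * k) ** 2)
    V = np.full(n_grid, 1.0) + np.sqrt(lam(0)) * rng.normal()
    for k in range(1, K):
        a, b = rng.normal(size=2)
        V += np.sqrt(2 * lam(k)) * (a * np.cos(2 * np.pi * k * x)
                                    + b * np.sin(2 * np.pi * k * x))
    return x, V
```

The eigenvalues decay like $k^{-2}$, which is consistent with the Hölder regularity $V \in C^{0,\alpha}(T)$, $\alpha < \frac12$, stated in the lemma.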

  18. Definition of U: Define a mapping $\widetilde{U}$ by setting, for every test function $\phi \in C^\infty(T)$,
$$\phi \mapsto \big\{ \omega \mapsto \langle W(\omega_1), A_{V(\omega_2)} \phi \rangle \;\big|\; \omega = (\omega_1, \omega_2) \in \Omega \big\},$$
where $A_v = \big(\widetilde{D}^* (\epsilon^2 + v^2) \widetilde{D}\big)^{-1/2}$. Lemma: The mapping $\widetilde{U}$ is a generalized random variable. Corollary: We have $U \in L^2(T)$ almost surely.

  19. Exponential moment: When dealing with additive Gaussian noise the following property is needed. Theorem: For every $b > 0$ there exists a constant $C_b > 0$ such that the exponential moments satisfy
$$\mathbb{E}\, e^{b \|(U, V)\|_{L^2 \times L^2}} < C_b \quad\text{and}\quad \mathbb{E}\, e^{b \|(U_n, V_n)\|_{L^2 \times L^2}} < C_b \quad\text{for all } n \in \mathbb{N}.$$

  20. CM estimates converge: Corollary: Given the introduced prior, Gaussian noise and a weakly converging discretization scheme, the CM estimates converge. A numerical example: deconvolution. [figure: data $m$, reconstruction $u$, edge indicator $v$]
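The CM estimate is the posterior mean, so in practice it is approximated by averaging samples from $\pi_{\mathrm{post}}(u, v \mid m) \propto \exp(-\frac12 E(u, v))$. A minimal random-walk Metropolis sketch follows; it is a generic stand-in, not the sampler or discretization scheme used for the numerical example, `E` is a discrete free energy as on slide 13, and the chain length and step size are illustrative:

```python
import numpy as np

def cm_estimate(E, u_init, v_init, steps=50_000, step=0.01, seed=1):
    """Approximate E[(U, V) | m] by ergodic averages of a random-walk
    Metropolis chain targeting exp(-E(u, v) / 2)."""
    rng = np.random.default_rng(seed)
    u, v = u_init.copy(), v_init.copy()
    e = E(u, v)
    u_sum, v_sum = np.zeros_like(u), np.zeros_like(v)
    for _ in range(steps):
        u_prop = u + step * rng.normal(size=u.shape)
        v_prop = v + step * rng.normal(size=v.shape)
        e_prop = E(u_prop, v_prop)
        # accept with probability min(1, exp((e - e_prop) / 2))
        if rng.random() < np.exp(min(0.0, (e - e_prop) / 2)):
            u, v, e = u_prop, v_prop, e_prop
        u_sum += u
        v_sum += v
    return u_sum / steps, v_sum / steps
```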

  21. Outline: 1. Introduction; 2. Hierarchical stochastic model; 3. Conclusions.

  22. Conclusions: Summary: (1) We introduced a new prior model for segmenting signals. (2) CM estimates converge for a linear problem. (3) There is a connection to the Mumford-Shah functional. Future work includes (1) understanding the limiting estimate better, (2) analysing the error and (3) understanding MAP estimates better.

  23. If you want to know more: M. Lassas, E. Saksman, S. Siltanen: Discretization invariant Bayesian inversion and Besov space priors, submitted. S. Geman, D. Geman: Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. PAMI 6(6), 1984. D. Mumford, J. Shah: Optimal approximation by piecewise smooth functions and associated variational problems. Comm. Pure Appl. Math., 1989. L. Ambrosio, V. M. Tortorelli: Approximation of functionals depending on jumps by elliptic functionals via $\Gamma$-convergence. Comm. Pure Appl. Math., 1990.
