On the Bayesian formulation of fractional inverse problems and data-driven discretization of forward maps. Nicolás García Trillos, Brown University. Fractional PDEs: Theory, Algorithms and Applications, ICERM, June 18th, 2018.


  1. On the Bayesian formulation of fractional inverse problems and data-driven discretization of forward maps. Nicolás García Trillos, Brown University. Fractional PDEs: Theory, Algorithms and Applications, ICERM, June 18th, 2018.

  2. Outline: (1) Bayesian formulation of fractional inverse problems. (2) Data-driven discretization of forward maps.

  3. This presentation is mostly based on: (1) The Bayesian Formulation and Well-Posedness of Fractional Elliptic Inverse Problems (Inverse Problems, 2017), with D. Sanz-Alonso. (2) Data-driven discretizations of forward maps in Bayesian inverse problems (in preparation), with D. Bigoni, Y. Marzouk, and D. Sanz-Alonso.

  4. Part 1: Bayesian formulation of fractional inverse problems.

  5. Inverse problem: learn a permeability field from partial and noisy observations of a pressure field. PDE version: learn the diffusion coefficient and the order of a fractional PDE (FPDE) from partial and noisy observations of its solution.

  6. Inverse problem: learn a permeability field from partial and noisy observations of a pressure field. PDE version: learn the diffusion coefficient and the order of a fractional PDE (FPDE) from partial and noisy observations of its solution. Unknown: u = (s, A). Forward map: u ↦ F(u). Observations: O ∘ F(u), with G := O ∘ F. Data model: y = G(u) + noise.

  7. Inverse problem: learn a permeability field from partial and noisy observations of a pressure field. PDE version: learn the diffusion coefficient and the order of a fractional PDE (FPDE) from partial and noisy observations of its solution. Unknown: u = (s, A). Forward map: u ↦ F(u). Observations: O ∘ F(u), with G := O ∘ F. Likelihood: φ(y; G(u)).

  8. Forward map: p = F(u), where p solves L_A^s p = f in D, ∂_A p = 0 on ∂D (1), with ∂_A p := A(x) ∇p · ν and ν the exterior unit normal to ∂D. Observation map: O(p) := (p(x_1), ..., p(x_n)) for some x_i ∈ D. Noise model: φ(y; G(u)) = exp(−‖y − G(u)‖² / (2γ²)).

  9. Forward map: p = F(u), where p solves L_A^s p = f in D, ∂_A p = 0 on ∂D (2), with ∂_A p := A(x) ∇p · ν and ν the exterior unit normal to ∂D. Here L_A^s p = Σ_{k=1}^∞ λ_{A,k}^s p_k ψ_{A,k}, where (λ_{A,k}, ψ_{A,k}) are the Neumann eigenpairs of the operator L_A = −∇·(A∇·) and p_k are the coefficients of p in this eigenbasis. Observation map: O(p) := (p(z_1), ..., p(z_m)) for some z_i ∈ D. Noise model: φ(y; G(u)) = exp(−‖y − G(u)‖² / (2γ²)).
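
The spectral definition above translates directly into a discrete solver: replace L_A by a matrix, compute its eigenpairs, and apply the fractional power eigenvalue by eigenvalue. The sketch below is not from the talk; it uses a one-dimensional finite-difference discretization, and the grid size, the stand-in coefficient A, the source f, the observation points, and the helper names neumann_operator / frac_solve are all illustrative assumptions.

```python
import numpy as np

def neumann_operator(A_vals, h):
    """Finite-difference matrix for L_A = -d/dx( A(x) d/dx ) on (0, 1) with
    homogeneous Neumann boundary conditions, assembled at cell centers."""
    n = len(A_vals)
    A_face = 0.5 * (A_vals[:-1] + A_vals[1:])     # coefficient at interior faces
    L = np.zeros((n, n))
    for i in range(n):
        left = A_face[i - 1] if i > 0 else 0.0    # no flux through x = 0
        right = A_face[i] if i < n - 1 else 0.0   # no flux through x = 1
        L[i, i] = (left + right) / h**2
        if i > 0:
            L[i, i - 1] = -left / h**2
        if i < n - 1:
            L[i, i + 1] = -right / h**2
    return L

def frac_solve(A_vals, s, f_vals, h):
    """Solve L_A^s p = f via the spectral definition
    L_A^s p = sum_k lambda_{A,k}^s p_k psi_{A,k} (discrete analogue)."""
    lam, psi = np.linalg.eigh(neumann_operator(A_vals, h))
    f_hat = psi.T @ f_vals                 # coefficients of f in the eigenbasis
    p_hat = np.zeros_like(f_hat)
    pos = lam > 1e-10                      # skip the zero (constant) Neumann mode
    p_hat[pos] = f_hat[pos] / lam[pos] ** s
    return psi @ p_hat

# G(u) = O(F(u)) with pointwise observations, and noisy data y = G(u) + noise
n = 200
h = 1.0 / n
x = (np.arange(n) + 0.5) * h
A_vals = np.exp(np.sin(2 * np.pi * x))     # stand-in diffusion coefficient A = e^v
f_vals = np.cos(2 * np.pi * x)             # mean-zero right-hand side f
p = frac_solve(A_vals, s=0.6, f_vals=f_vals, h=h)
obs_idx = np.linspace(10, n - 11, 8).astype(int)   # observation points z_i
gamma = 0.01
y = p[obs_idx] + gamma * np.random.randn(len(obs_idx))
```

The zero Neumann eigenvalue (constant mode) is excluded, mirroring the usual compatibility requirement that f have zero mean for the Neumann problem to be solvable.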

  10. Bayesian approach to inverse problems. A. M. Stuart, Inverse problems: a Bayesian perspective (2010). J. Kaipio and E. Somersalo, Statistical and computational inverse problems (2006).

  11. Bayesian formulation. Prior: u ∼ π_u. Likelihood model: π_{y|u}. Bayes' rule (informally): ν^y(u) := π_{u|y} ∝ π_{y|u} · π_u, the posterior distribution.
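
In practice Bayes' rule is most conveniently used through the unnormalized log-posterior, i.e. log-likelihood plus log-prior. A minimal sketch, assuming the Gaussian noise model of slide 8; the function names and the user-supplied log-prior are illustrative, not from the talk.

```python
import numpy as np

def log_likelihood(y, Gu, gamma):
    """log phi(y; G(u)) = -||y - G(u)||^2 / (2 gamma^2), up to an additive constant."""
    r = np.asarray(y) - np.asarray(Gu)
    return -0.5 * np.dot(r, r) / gamma**2

def log_posterior(y, Gu, gamma, log_prior):
    """Unnormalized log of nu^y: log pi_{y|u} + log pi_u (Bayes' rule)."""
    return log_likelihood(y, Gu, gamma) + log_prior
```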

  12. ν^y is the fundamental object in Bayesian inference. Estimates: E_{u∼ν^y}[R(u)]. Uncertainty quantification: Var_{u∼ν^y}[R(u)].
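
Once (approximate) posterior samples are available, for instance from the MCMC algorithm discussed later in Part 1, both quantities reduce to sample averages. A small sketch under that assumption; `R` stands for any quantity of interest and is not a name from the talk.

```python
import numpy as np

def posterior_summaries(samples, R):
    """Monte Carlo approximations of E_{u ~ nu^y}[R(u)] and Var_{u ~ nu^y}[R(u)]
    from posterior samples u_1, ..., u_k (e.g. an MCMC path)."""
    vals = np.array([R(u) for u in samples])
    return vals.mean(axis=0), vals.var(axis=0)
```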

  13. Advantages of the Bayesian formulation. Well-defined mathematical framework: stability (well-posedness); posterior consistency (contraction rates, scalings for parameters, etc.); consistency of numerical methods.

  14. Prior. π_u is a distribution on (0, 1) × H. For example, π_u = π_s ⊗ π_A with A = e^v Id, where v ∼ N(0, K). Karhunen-Loève expansion: v = Σ_{i=1}^∞ √λ_{K,i} ζ_i Ψ_i, with ζ_i ∼ N(0, 1).
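
A hedged sketch of drawing a prior sample of the log-diffusion v by truncating the Karhunen-Loève expansion. The power-law eigenvalue decay and cosine eigenfunctions below are stand-ins for the eigenpairs of the covariance K, not the covariance used in the talk, and the function name is illustrative.

```python
import numpy as np

def sample_log_diffusion(x, L=64, decay=2.0, rng=None):
    """Truncated Karhunen-Loeve sample  v = sum_{i=1}^L sqrt(lambda_i) zeta_i Psi_i,
    zeta_i ~ N(0, 1), with assumed eigenvalues lambda_i ~ i^{-decay} and cosine
    eigenfunctions Psi_i."""
    rng = np.random.default_rng() if rng is None else rng
    i = np.arange(1, L + 1)
    lam = i ** (-decay)                                    # eigenvalues of K (assumed)
    Psi = np.sqrt(2.0) * np.cos(np.outer(x, i) * np.pi)   # eigenfunctions (assumed)
    zeta = rng.standard_normal(L)
    return Psi @ (np.sqrt(lam) * zeta)

# A = e^v Id: one prior draw of the diffusion coefficient on a grid
x = np.linspace(0.0, 1.0, 200)
v = sample_log_diffusion(x)
A_vals = np.exp(v)
```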

  15. Well-posedness of the Bayesian formulation. Theorem (NGT and D. Sanz-Alonso '17). Suppose that G is continuous on supp(π_u). Then the posterior distribution ν^y is absolutely continuous with respect to the prior: dν^y(u) ∝ φ(y; G(u)) dπ_u(u). Recall: G : (s, A) → R^m.

  16. Well-posedness of the Bayesian formulation. Theorem (NGT and D. Sanz-Alonso '17). Suppose that G ∈ L²_{π_u}. Then the map y ↦ ν^y is locally Lipschitz in the Hellinger distance. That is, for |y_1|, |y_2| ≤ r we have d_hell(ν^{y_1}, ν^{y_2}) ≤ C_r ‖y_1 − y_2‖.
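
For concreteness, the Hellinger distance between distributions with densities p and q (in the 1/2-normalized convention) is d_hell(p, q)² = ½ ∫ (√p − √q)². A small numerical sketch on a one-dimensional grid; the Gaussian densities are stand-ins, not the posteriors of the theorem.

```python
import numpy as np

def hellinger(p, q, dx):
    """Hellinger distance between densities p, q sampled on a common grid:
    d_hell^2 = (1/2) * integral (sqrt(p) - sqrt(q))^2."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx)

x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]
gauss = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
# A small shift in the data-like parameter gives a proportionally small distance
print(hellinger(gauss(0.0, 1.0), gauss(0.1, 1.0), dx))
```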

  17. Well-posedness of the Bayesian formulation. The analysis reduces to studying stability via regularity theory for FPDEs. L. A. Caffarelli and P. R. Stinga, Fractional elliptic equations, Caccioppoli estimates and regularity (2016).

  18. Well-posedness and posterior consistency. M. Dashti and A. M. Stuart, Uncertainty quantification and weak approximation of an elliptic inverse problem (2011). S. Agapiou, S. Larsson, and A. M. Stuart, Posterior contraction rates for the Bayesian approach to linear ill-posed inverse problems (2013). S. Vollmer, Posterior consistency for Bayesian inverse problems through stability and regression results (2013).

  19. Numerical methods: MCMC. We need a way to approximate expectations with respect to ν^y. Standard procedure: MCMC. Generate a path u_1, ..., u_k, ... of a Markov chain with invariant distribution ν^y and then use the ergodic average (1/k) Σ_{i=1}^k R(u_i). However, be careful with: (1) discretization of u; (2) discretization of the forward map.

  20. Idealized MCMC algorithm. For simplicity assume A = e^v · Id and s ∈ (0, 1) known. Metropolis–Hastings with pCN proposal: having defined v_k, v_{k+1} is generated as follows. (1) Proposal: ṽ = √(1 − β²) v_k + β ξ, where ξ ∼ π_v. (2) Acceptance probability: α(ũ, u_k) := min{1, φ(y; G(ũ)) / φ(y; G(u_k))}. (3) Set v_{k+1} := ṽ with probability α(ũ, u_k) and v_{k+1} := v_k with probability 1 − α(ũ, u_k).

  21. Idealized MCMC algorithm. For simplicity assume A = e^v · Id and s ∈ (0, 1) known. Metropolis–Hastings with pCN proposal: having defined v_k, v_{k+1} is generated as follows. (1) Proposal: ṽ = √(1 − β²) v_k + β ξ, where ξ ∼ π_v. Compare to: ṽ = v_k + β ξ. See S. L. Cotter, G. O. Roberts, A. M. Stuart, and D. White, MCMC methods for functions: modifying old algorithms to make them faster, Statistical Science. (2) Acceptance probability: α(ũ, u_k) := min{1, φ(y; G(ũ)) / φ(y; G(u_k))}.

  22. Idealized MCMC algorithm. For simplicity assume A = e^v · Id and s ∈ (0, 1) known. Metropolis–Hastings with pCN proposal: having defined v_k, v_{k+1} is generated as follows. (1) Proposal: ṽ = √(1 − β²) v_k + β ξ, where ξ ∼ π_v. Compare to: ṽ = v_k + β ξ. See S. L. Cotter, G. O. Roberts, A. M. Stuart, and D. White, MCMC methods for functions: modifying old algorithms to make them faster, Statistical Science. Robustness to truncation: ξ = Σ_{i=1}^L √λ_{K,i} ζ_i Ψ_i, with ζ_i ∼ N(0, 1). (2) Acceptance probability: α(ũ, u_k) := min{1, φ(y; G(ũ)) / φ(y; G(u_k))}.
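
A minimal sketch of the pCN Metropolis–Hastings step described on slides 20–22. The prior sampler `sample_prior_v` (e.g. a truncated Karhunen-Loève draw as on slide 22), the observed forward map `G_of_v`, and the noise level `gamma` are assumed inputs supplied by the user, not definitions from the talk.

```python
import numpy as np

def pcn_mcmc(y, v0, sample_prior_v, G_of_v, gamma, beta=0.2, n_iter=10_000, rng=None):
    """Preconditioned Crank-Nicolson Metropolis-Hastings targeting the posterior
    over v (A = e^v Id, with s fixed).  `sample_prior_v()` draws xi ~ pi_v and
    `G_of_v(v)` evaluates G(u) = O(F(u))."""
    rng = np.random.default_rng() if rng is None else rng
    v = np.array(v0, dtype=float)
    loglik = lambda w: -0.5 * np.sum((y - G_of_v(w)) ** 2) / gamma**2
    ll = loglik(v)
    chain, n_accept = [], 0
    for _ in range(n_iter):
        xi = sample_prior_v()                              # xi ~ pi_v
        v_prop = np.sqrt(1.0 - beta**2) * v + beta * xi    # pCN proposal
        ll_prop = loglik(v_prop)
        # accept with probability min{1, phi(y; G(u~)) / phi(y; G(u_k))}
        if np.log(rng.uniform()) < ll_prop - ll:
            v, ll = v_prop, ll_prop
            n_accept += 1
        chain.append(v.copy())
    return np.array(chain), n_accept / n_iter
```

Because the pCN proposal is reversible with respect to the Gaussian prior, the acceptance ratio involves only the likelihoods φ(y; G(ũ)) and φ(y; G(u_k)), exactly as on the slide; the resulting chain can then feed the ergodic average of slide 19.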

  23. Part 2: Data-driven discretization of forward maps.

  24. [Diagram: u ↦ F(u) ↦ O ∘ F(u), with G := O ∘ F and likelihood φ(y; G(u)).]
