Inverse Modeling with the Aid of Surrogate Models


  1. Inverse Modeling with the Aid of Surrogate Models
  Dongxiao Zhang, Qinzhuo Liao, Haibin Chang
  College of Engineering, Peking (Beijing) University, dxz@pku.edu.cn
  The 2017 EnKF Workshop

  2. Inverse modeling
  • Estimate parameters from physical models and observations: model f, input parameter θ, output observation s, with s = f(θ) + e
  • Input: conductivity, porosity, boundary condition, source & sink
  • Model: groundwater supply, contaminant control, oil and gas production, CO2 sequestration
  • Output: hydraulic head, velocity/flux, phase saturation, solute concentration
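As a minimal illustration of the s = f(θ) + e setup, the Python sketch below uses a hypothetical one-parameter forward model, a stand-in rather than any of the simulators listed above:

import numpy as np

# Minimal sketch of s = f(theta) + e with a hypothetical 1D forward model
# (a stand-in for a groundwater/transport simulator).
def forward_model(theta):
    return np.exp(0.3 * theta) / (1.0 + np.exp(0.3 * theta))

rng = np.random.default_rng(0)
theta_true = 0.2                       # unknown input parameter
s_clean = forward_model(theta_true)    # model output f(theta)
e = rng.normal(0.0, 0.1)               # observation error
s_obs = s_clean + e                    # observation s = f(theta) + e
print(theta_true, s_obs)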

  3. Stochastic approach
  • Bayesian inference for s = f(θ) + e
  − p(θ): prior density; p(s|θ): likelihood function; p(s): normalization factor
  − p(θ|s): posterior density, p(θ|s) ∝ p(θ) p(s|θ)
  • Markov chain Monte Carlo
  − Use Monte Carlo simulations to construct a Markov chain
  − Computationally expensive: repeated evaluations of the forward model
  • Surrogate model
  − Can generate a large number of samples at low cost
  − Posterior error depends on forward solution error
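A sketch of a random-walk Metropolis sampler in which the likelihood is evaluated with a cheap surrogate in place of the expensive forward model; the polynomial surrogate, the observation value, and the noise level are illustrative assumptions, not taken from the slides:

import numpy as np

def metropolis_hastings(log_post, theta0, n_steps=5000, step=0.5, seed=1):
    """Random-walk Metropolis sampler for a 1D posterior density."""
    rng = np.random.default_rng(seed)
    theta, lp = theta0, log_post(theta0)
    chain = np.empty(n_steps)
    for k in range(n_steps):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        chain[k] = theta
    return chain

# Cheap polynomial surrogate standing in for the forward model f(theta)
surrogate = np.polynomial.Polynomial([0.5, 0.1, -0.02])

s_obs, sigma_e = 0.55, 0.1                         # hypothetical observation and error std
def log_post(theta):                               # log p(theta|s) up to a constant
    return -0.5 * theta**2 - 0.5 * ((s_obs - surrogate(theta)) / sigma_e) ** 2

chain = metropolis_hastings(log_post, theta0=0.0)
print(chain[1000:].mean(), chain[1000:].std())     # posterior summary after burn-in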

  4. Forward Stochastic Formulation
  • SPDE: L(u(x, ξ); ξ) = g(x, ξ), x ∈ D, where ξ = (ξ1, ..., ξN)ᵀ has a finite (random) dimensionality
  • Weak-form solution: ∫ L(û(x, ξ); ξ) w(ξ) p(ξ) dξ = ∫ g(x, ξ) w(ξ) p(ξ) dξ, where
  − û(x, ξ) ∈ V, the trial function space
  − w(ξ) ∈ W, the test (weighting) function space
  − p(ξ): probability density function of ξ

  5. Forward Stochastic Methods
  • Galerkin polynomial chaos expansion (PCE) [e.g., Ghanem and Spanos, 1991]:
  V = span{Ψi(ξ)}, W = span{Ψi(ξ)}, i = 1, ..., M, where {Ψi(ξ)} are orthogonal polynomials
  • Probabilistic collocation method (PCM) [Tatang et al., 1997; Sarma et al., 2005; Li and Zhang, 2007, 2009]:
  V = span{Ψi(ξ)}, W = span{δ(ξ − ξi)}, i = 1, ..., M
  • Stochastic collocation method (SCM) [Mathelin et al., 2005; Xiu and Hesthaven, 2005; Chang and Zhang, 2009]:
  V = span{Li(ξ)}, W = span{δ(ξ − ξi)}, i = 1, ..., M, where {Li(ξ)} are Lagrange interpolation basis functions
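A one-dimensional sketch of the collocation idea behind PCM, assuming ξ ~ N(0, 1), probabilists' Hermite polynomials as the chaos basis, and a hypothetical stand-in for the forward model:

import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# PCM sketch in one random dimension; f is a stand-in for an expensive simulator.
def f(xi):
    return np.exp(0.3 * xi)

order = 4                                   # PCE order
nodes, _ = hermegauss(order + 1)            # probabilists' Gauss-Hermite nodes

# Design matrix: He_0 ... He_order evaluated at the collocation nodes
A = np.column_stack([hermeval(nodes, np.eye(order + 1)[i]) for i in range(order + 1)])
coeffs = np.linalg.solve(A, f(nodes))       # match the model exactly at the nodes

xi_test = np.linspace(-2.0, 2.0, 5)
print(np.abs(hermeval(xi_test, coeffs) - f(xi_test)).max())  # surrogate error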

  6. Key Components for Stochastic Methods
  • Random dimensionality of underlying stochastic fields
  − How to effectively approximate the input random fields with finite dimensions?
  − Karhunen-Loeve and other expansions may be used (see the sketch after this list)
  • Trial function space
  − How to approximate the dependent random fields?
  − Perturbation series, polynomial chaos expansion, or Lagrange interpolation basis
  • Test (weighting) function space
  − How to evaluate the integration in random space?
  − Intrusive or non-intrusive schemes?
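A sketch of a truncated Karhunen-Loeve expansion of a 1D Gaussian random field with an assumed exponential covariance; the grid, correlation length, and variance are illustrative:

import numpy as np

n, corr_len, variance = 100, 2.0, 1.0
x = np.linspace(0.0, 10.0, n)
C = variance * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # covariance matrix

eigval, eigvec = np.linalg.eigh(C)          # spectral decomposition of the covariance
idx = np.argsort(eigval)[::-1]              # sort modes by decreasing energy
eigval, eigvec = eigval[idx], eigvec[:, idx]

N_kl = 10                                   # retained random dimensions
rng = np.random.default_rng(0)
xi = rng.standard_normal(N_kl)              # independent N(0,1) KL coefficients
field = eigvec[:, :N_kl] @ (np.sqrt(eigval[:N_kl]) * xi)  # one zero-mean realization
print(eigval[:N_kl].sum() / eigval.sum())   # fraction of variance captured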

  7. Stochastic collocation method (SCM)
  • Based on polynomial interpolation in random space: {ξi}, i = 1, ..., M, is a set of nodes in the N-dimensional random space [Xiu and Hesthaven, 2005]
  f(ξ) ≈ Σi f(ξi) Li(ξ), with Lagrange basis Li(ξ) = Πj≠i (ξ − ξj)/(ξi − ξj), so Li(ξj) = δij, 1 ≤ i, j ≤ M
  • Collocation points: Smolyak sparse grid algorithm [Xiu and Hesthaven, 2005]
  A(q, N) = Σ over q−N+1 ≤ |i| ≤ q of (−1)^(q−|i|) C(N−1, q−|i|) (U^i1 ⊗ ... ⊗ U^iN), where U^i is a univariate interpolation
  • Converges fast for smooth functions [Chang and Zhang, 2009; Lin and Tartakovsky, 2009]
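A one-dimensional stochastic collocation sketch: sample a hypothetical response at a small set of nodes and evaluate the Lagrange interpolant Li(θ) explicitly; the nodes and the model are illustrative:

import numpy as np

def f(theta):
    return np.exp(0.3 * theta)                # stand-in for the forward model output

def lagrange_eval(nodes, values, theta):
    """Evaluate sum_i values[i] * L_i(theta), with L_i(theta_j) = delta_ij."""
    out = np.zeros_like(theta, dtype=float)
    for i, xi in enumerate(nodes):
        L_i = np.ones_like(theta, dtype=float)
        for j, xj in enumerate(nodes):
            if j != i:
                L_i *= (theta - xj) / (xi - xj)
        out += values[i] * L_i
    return out

M = 9
k = np.arange(M)
nodes = 3.0 * np.cos(np.pi * k / (M - 1))     # Chebyshev-extrema nodes on [-3, 3]
theta_test = np.linspace(-3.0, 3.0, 7)
approx = lagrange_eval(nodes, f(nodes), theta_test)
print(np.abs(approx - f(theta_test)).max())   # interpolation error at test points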

  8. Choices of Collocation Points
  • Tensor product of one-dimensional nodal sets: with m knots in each dimension, an N-dimensional grid has M = m^N knots
  • Smolyak sparse grid (level k = q − N): for N > 1, preserves the interpolation property of the N = 1 rule with a small number of knots
  • Tensor product vs. level-2 sparse grid (figure: the two nodal sets for N = 2)
  − N = 2: 49 knots vs. 17
  − N = 6: 117,649 knots vs. 97

  9. MCS vs. PCM/SCM
  • MC: 1000 realizations, ⟨Y⟩ = 4.0, σY² = 1.0 (figure: head h vs. x for the realizations)
  • MCS:
  − Random sampling (realizations)
  − Equal weights for hj
  • 2nd-order PCM: 28 representations, ⟨Y⟩ = 4.0, σY² = 1.0 (figure: head h vs. x for the representations)
  • PCM/SCM:
  − Structured sampling (collocation points)
  − Non-equal weights for hj (representations)

  10. Stochastic collocation method
  • Inaccurate results
  − Non-physical realizations / Gibbs oscillation
  − Inaccurate statistical moments and probability density functions [Lin and Tartakovsky, 2009; Zhang et al., 2010]

  11. Stochastic collocation method
  • Inaccurate results
  − When: advection dominated (Pe = 100), i.e., low regularity
  − Why: sharp features in physical space carry over into the random space
  • Illustration
  − Unit mass instantaneously released at x = 0, t = 0
  − Input parameter: conductivity k = exp(0.3θ), θ ~ N(0,1)
  − Output response: concentration c at x = 0.3, t = 1

  12. Transformed stochastic collocation method
  • Stochastic collocation method (SCM)
  − Approximate s(θ; x, t): s as a function of θ at fixed x and t
  • Transformed stochastic collocation method (TSCM) [Liao and Zhang, WRR, 2016] (a schematic sketch follows below)
  − Approximate x(θ; s, t): x as a function of θ for a given s at fixed t
  − Approximate t(θ; s, x): t as a function of θ for a given s at fixed x
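A schematic of the transformed-variable idea on an assumed sharp-front response, not the transport solver used in the slides: interpolating the front position x(θ) stays accurate where interpolating c(θ) at a fixed location does not:

import numpy as np

# Assumed response: c(x, theta) = 1 behind a front at x_f(theta), 0 ahead of it.
# The front position is smooth in theta even though c itself is discontinuous.
def front_position(theta):
    return 0.5 * np.exp(0.3 * theta)

def c(x, theta):
    return (x < front_position(theta)).astype(float)

nodes = np.linspace(-3.0, 3.0, 9)             # collocation nodes in theta
theta_test = np.linspace(-3.0, 3.0, 201)
x_star = 0.55                                 # fixed observation location

# SCM: polynomial fit of c(x_star, theta), a step function, poorly resolved
scm = np.polynomial.Polynomial.fit(nodes, c(x_star, nodes), deg=len(nodes) - 1)
# TSCM: polynomial fit of the smooth front position x(theta), then map back to c
tscm_front = np.polynomial.Polynomial.fit(nodes, front_position(nodes), deg=len(nodes) - 1)
c_tscm = (x_star < tscm_front(theta_test)).astype(float)

truth = c(x_star, theta_test)
print("SCM max error:", np.abs(scm(theta_test) - truth).max())
print("TSCM max error:", np.abs(c_tscm - truth).max())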

  13. 1D example
  • Continuous injection
  − Input parameter: conductivity k = exp(0.3θ), θ ~ N(0,1)
  − Output response: concentration c at x = 0.3, t = 1

  14. 1D example
  • Forward solution approximation and posterior approximation
  − True parameter θ = 0.2, true observation c = 0.842, observation error e ~ N(0, 0.01)
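A sketch of evaluating the surrogate-based posterior on a grid of θ using the values quoted on this slide; the surrogate itself is a hypothetical polynomial stand-in, and N(0, 0.01) is read as variance 0.01 (an assumption):

import numpy as np

c_obs, var_e = 0.842, 0.01                  # observed value; 0.01 taken as the variance

def surrogate(theta):
    # Hypothetical cheap approximation of c(x = 0.3, t = 1; theta)
    return 0.84 + 0.05 * theta - 0.01 * theta**2

theta = np.linspace(-4.0, 4.0, 801)
log_prior = -0.5 * theta**2                 # theta ~ N(0, 1)
log_like = -0.5 * (c_obs - surrogate(theta))**2 / var_e
post = np.exp(log_prior + log_like)
post /= post.sum() * (theta[1] - theta[0])  # normalize numerically on the grid
print("MAP estimate:", theta[np.argmax(post)])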

  15. 1D example
  • Convergence rate [Marzouk and Xiu, 2009]
  − Surrogate model: sM(θ) = Σi f(θi) Li(θ); error ‖s − sM‖²(L²) = ∫ (s(θ) − sM(θ))² p(θ) dθ → 0 as M → ∞
  − Posterior with the surrogate: pM(θ|s) ∝ p(θ) pM(s|θ); Kullback-Leibler divergence D(p(θ|s) ‖ pM(θ|s)) = ∫ p(θ|s) log[p(θ|s)/pM(θ|s)] dθ → 0 as M → ∞
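A sketch of computing the Kullback-Leibler divergence between the exact-model posterior and a surrogate-based posterior on a grid; the two densities here are illustrative Gaussians, not the actual posteriors from this example:

import numpy as np

theta = np.linspace(-5.0, 5.0, 2001)
dtheta = theta[1] - theta[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

pi_exact = gaussian(theta, 0.20, 0.30)       # posterior with the exact forward model
pi_surr = gaussian(theta, 0.22, 0.31)        # posterior with an M-term surrogate

kl = np.sum(pi_exact * np.log(pi_exact / pi_surr)) * dtheta   # D(pi || pi_M)
print("D(pi || pi_M):", kl)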

  16. 2D example
  • Assume the conductivity field is known
  • Top and bottom are no-flow boundaries; right head h2 = 0
  • One instantaneous release location (circle), four observation wells (triangles)
  • Input parameters: release time t0 ∈ [0, 20], mass m0 ∈ [1, 2], left head h1 ∈ [3, 10]
  • Output responses: concentration c from t = 0 to 80, observation error e ~ N(0, 0.001)

  17. 2D example
  • Compare true concentration and approximated concentration (figure)

  18. 2D example
  • Surrogate approximation error (figure)
  • Adaptive transformed SCM (ATSCM)
  − Dimension-adaptive: automatically selects important dimensions [Klimke, 2006; Liao et al., JCP, 2016]
  − Further reduces the number of collocation points

  19. 2D example
  • Marginal PDF
  − MCMC with 10^5 model runs as a reference
  − ATSCM with 67 model runs is more accurate than SCM with 6017 model runs

  20. 2D example
  • Marginal PDF (figure; black: MCMC, red: SCM, blue: ATSCM)

  21. Inverse modeling
  • Maximizing the posterior PDF: p(m | d_obs) = p(m) p(d_obs | m) / p(d_obs)
  • For Gaussian prior and error, minimizing an objective function:
  J(m) = (1/2) (g(m) − d_obs)ᵀ CD⁻¹ (g(m) − d_obs) + (1/2) (m − m_pr)ᵀ CM⁻¹ (m − m_pr)
  • Ensemble-based methods (a minimal analysis-step sketch follows this list)
  − EnKF: sequentially assimilates the data; one-step method; moderate simulation effort (restart required)
  − ES: simultaneously assimilates all the data; one-step method; small simulation effort (no restart)
  − Iterative ES: simultaneously assimilates all the data; multi-step method; large simulation effort (no restart, iteration); suitable for highly non-linear problems
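A minimal ensemble smoother (ES) analysis step as a sketch of the ensemble-based update, one joint update of the parameter ensemble using all the data at once; the forward model, dimensions, and noise levels are hypothetical placeholders:

import numpy as np

rng = np.random.default_rng(0)
Ne, Nm, Nd = 100, 3, 5                    # ensemble size, #parameters, #data
W = np.random.default_rng(1).normal(size=(Nm, Nd))

def g(m):                                 # stand-in nonlinear forward model
    return np.tanh(m @ W)

m_prior = rng.normal(size=(Ne, Nm))       # prior parameter ensemble
d_obs = g(np.array([0.5, -0.2, 0.1]))     # synthetic observations from a "true" m
C_D = 0.01 * np.eye(Nd)                   # observation error covariance

D = g(m_prior)                            # predicted data, one row per member
dm = m_prior - m_prior.mean(axis=0)
dd = D - D.mean(axis=0)
C_MD = dm.T @ dd / (Ne - 1)               # parameter-data cross-covariance
C_DD = dd.T @ dd / (Ne - 1)               # data auto-covariance
K = C_MD @ np.linalg.inv(C_DD + C_D)      # Kalman-type gain

d_pert = d_obs + rng.multivariate_normal(np.zeros(Nd), C_D, size=Ne)  # perturbed obs
m_post = m_prior + (d_pert - D) @ K.T     # updated ensemble in a single step
print(m_post.mean(axis=0))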
