

  1. Joint state–parameter estimation for nonlinear stochastic energy balance models
Fei Lu¹, Nils Weitzel², Adam Monahan³
¹ Department of Mathematics, Johns Hopkins
² Meteorological Institute, University of Bonn, Germany
³ School of Earth and Ocean Science, University of Victoria, Canada
SIAM DS 2019, May 22

  2. Outline
1. An SPDE from paleoclimate reconstruction
   ◮ Stochastic energy balance model
   ◮ State space model representation
2. Bayesian joint state-parameter estimation
   ◮ Sampling the posterior: particle MCMC
   ◮ Ill-posedness: regularized posterior
3. Numerical results
   ◮ Parameter estimation
   ◮ State estimation

  3. An SPDE from paleoclimate reconstruction
Paleoclimate: reconstruct past climate temperature from proxy data
The temperature is a spatio-temporal process
◮ physical laws: energy balance → SPDEs
◮ discretized: a high-dimensional process with spatial correlation
Sparse and noisy data
◮ proxy data: historical records, tree rings, ice cores, fossil pollen, ocean sediments, corals, etc.
Plan: inference of SPDEs from sparse, noisy data → joint state-parameter estimation

  4. The SPDEs: stochastic energy balance models (SEBM)
Idealized atmospheric energy balance (Fanning & Weaver 1996):
  ∂_t u = Q_T + Q_SW + Q_SH + Q_LH + Q_LW − Q_LPW
          (transport, absorbed shortwave, sensible heat, latent heat, longwave surface → atmosphere, longwave into space)
        = ∇·(ν∇u) + θ_0 + θ_1 u + θ_4 u^4 + W(t, x),   with g_θ(u) := θ_0 + θ_1 u + θ_4 u^4
θ = (θ_k): unknown parameters
◮ prior: a range of physical values
◮ g_θ(u) has a stable fixed point
W(t, x): Gaussian noise, white in time, Matérn in space
Data: sparse, noisy observations

  5. State space model formulation
SEBM: ∂_t u = ∇·(ν∇u) + Σ_{k=0,1,4} θ_k u^k + W(t, x)
Observation data: y_{t_i} = H(u(t_i, x)) + V_i
Discretization (simplification): finite elements in space, semi-backward Euler in time
State space model:
  SEBM: U_n = g(θ, U_{n−1}) + W_n
  Observation data: y_n = H U_n + V_n
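As a rough illustration of the forward map U_n = g(θ, U_{n−1}) + W_n produced by this discretization, here is a minimal Python sketch of one semi-backward Euler step: the diffusion term is treated implicitly and the nonlinearity g_θ explicitly. The finite-element mass and stiffness matrices M and A, the noise factor noise_chol, and all other names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sebm_step(U_prev, theta, M, A, nu, dt, noise_chol, rng):
    """One semi-backward Euler step of the discretized SEBM (sketch).

    Diffusion is implicit, the source g_theta(u) = theta_0 + theta_1*u + theta_4*u^4
    is explicit, and W_n is Gaussian with spatial covariance factor `noise_chol`
    (e.g. a Matern covariance); all inputs are hypothetical placeholders.
    """
    th0, th1, th4 = theta
    g = th0 + th1 * U_prev + th4 * U_prev**4              # g_theta(U_{n-1})
    W = noise_chol @ rng.standard_normal(len(U_prev))     # spatially correlated noise
    rhs = M @ (U_prev + dt * g) + np.sqrt(dt) * W
    return np.linalg.solve(M + dt * nu * A, rhs)          # solve (M + dt*nu*A) U_n = rhs
```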

  6. Joint parameter-state estimation
SEBM: U_n = g(θ, U_{n−1}) + W_n
Observation data: y_n = H U_n + V_n
Goal: given y_{1:N}, jointly estimate (θ, U_{1:N})
Gaussian prior for θ
Test setting: 12 spatial nodes, 100 time steps
[Figures: the solution on the 12-node mesh at time step n = 10, and the trajectories of all 12 nodes over the 100 time steps; the state stays roughly in the range 0.92 to 1.06]

  7. Bayesian joint state-parameter estimation
Bayesian approach: p(θ, u_{1:N} | y_{1:N}) ∝ p(θ) p(u_{1:N} | θ) p(y_{1:N} | u_{1:N})
The posterior quantifies the uncertainties
Approximate the posterior by sampling: high-dimensional (> 10^3), non-Gaussian, mixed types of variables (θ, u_{1:N})
Gibbs Monte Carlo: iterate between U_{1:N} | θ and θ | U_{1:N}
◮ U_{1:N} | θ needs a high-dimensional proposal density → sequential Monte Carlo (SMC)
◮ combine SMC with Gibbs (MCMC) → particle MCMC methods based on conditional SMC
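To make the factorization above concrete, here is a minimal sketch that evaluates the joint log-posterior (up to a constant) for the Gaussian state space model of the previous slide, assuming a known initial state U_0, model-error covariance Q, observation-noise covariance R, and a Gaussian prior on θ; these argument names are placeholders, not quantities from the talk.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def log_joint_posterior(theta, U, y, g, H, Q, R, prior_mean, prior_cov):
    """log p(theta, u_{1:N} | y_{1:N}) up to a constant, for the model
    U_n = g(theta, U_{n-1}) + W_n,  y_n = H U_n + V_n  (sketch).

    U has shape (N+1, d) with U[0] the known initial state; y[n] is the
    observation at step n (y[0] unused).
    """
    lp = mvn.logpdf(theta, prior_mean, prior_cov)            # log p(theta)
    for n in range(1, len(U)):
        lp += mvn.logpdf(U[n], g(theta, U[n - 1]), Q)        # log p(u_n | u_{n-1}, theta)
        lp += mvn.logpdf(y[n], H @ U[n], R)                  # log p(y_n | u_n)
    return lp
```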

  8. Sampling: particle MCMC
Particle MCMC (Andrieu, Doucet & Holenstein 2010) combines sequential Monte Carlo with MCMC:
◮ SMC: sequential importance sampling → high-dimensional proposal density
◮ conditional SMC: keep a reference trajectory in the SMC sweep
◮ MCMC transition by conditional SMC → keeps the target distribution invariant even with a few particles
Particle Gibbs with Ancestor Sampling (Lindsten, Jordan & Schön 2014)
◮ updates the ancestor of the reference trajectory
◮ improves mixing of the chain
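The conditional SMC sweep that drives particle Gibbs can be sketched as follows: a bootstrap-proposal sweep with one particle pinned to the reference trajectory, plus the ancestor-sampling move of Lindsten, Jordan & Schön. This is a simplified illustration, not the authors' implementation; the transition sampler f_sample, the transition and observation log densities log_f and log_g, and the known initial state u0 are assumed to be supplied by the user.

```python
import numpy as np

def conditional_smc(y, theta, u_ref, u0, f_sample, log_f, log_g, J, rng):
    """One conditional SMC sweep with ancestor sampling (sketch of PGAS),
    returning an approximate draw from p(u_{1:N} | theta, y_{1:N}).

    u_ref: reference trajectory, shape (N, d);  u0: known initial state;
    f_sample(theta, Uprev, rng): samples U_n row-wise given U_{n-1};
    log_f(theta, u, Uprev): transition log density of u from each row of Uprev;
    log_g(yn, Un): observation log density of yn for each particle row.
    """
    N, d = u_ref.shape
    X = np.empty((N, J, d))                      # particle states
    A = np.zeros((N, J), dtype=int)              # ancestor indices
    X[0] = f_sample(theta, np.tile(u0, (J, 1)), rng)
    X[0, -1] = u_ref[0]                          # pin the last particle to the reference
    logw = log_g(y[0], X[0])
    for n in range(1, N):
        w = np.exp(logw - logw.max()); w /= w.sum()
        A[n] = rng.choice(J, size=J, p=w)                    # resample ancestors
        X[n] = f_sample(theta, X[n - 1][A[n]], rng)          # propagate
        X[n, -1] = u_ref[n]                                  # keep the reference path
        la = logw + log_f(theta, u_ref[n], X[n - 1])         # ancestor-sampling weights
        wa = np.exp(la - la.max()); wa /= wa.sum()
        A[n, -1] = rng.choice(J, p=wa)                       # reconnect the reference
        logw = log_g(y[n], X[n])                             # bootstrap weights
    wN = np.exp(logw - logw.max()); wN /= wN.sum()
    j = rng.choice(J, p=wN)                                  # pick one trajectory...
    traj = np.empty_like(u_ref)
    for n in range(N - 1, -1, -1):                           # ...and trace its lineage back
        traj[n] = X[n, j]
        j = A[n, j]
    return traj
```

In the particle Gibbs loop, the returned trajectory becomes the new reference and is alternated with an update of θ | U_{1:N} (for example a Metropolis step on the parameters).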

  9. Ill-posed inverse problem
For the Gaussian prior p(θ), unphysical posterior samples: the system blows up

  10. Ill-posed inverse problem
For the Gaussian prior p(θ), unphysical posterior samples: the system blows up
Parameter estimation is ill-posed:
singular Fisher information matrix → large oscillations in the samples of θ from the Gibbs update θ | U_{1:N}
[Figure: standard deviation (log10) of MLE errors from noisy observations versus data size log10 N]
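One way to see where the singularity can come from (an illustration based on an assumption, not a computation from the talk): because the solution stays close to the stable fixed point near u ≈ 1 (see the trajectories on slide 6), the regressors 1, u, and u^4 that multiply θ_0, θ_1, θ_4 are nearly collinear, so the data constrain θ only along a low-dimensional direction.

```python
import numpy as np

# Hypothetical check of the near-collinearity of the regressors 1, u, u^4
# when the state fluctuates around a fixed point near 1.
rng = np.random.default_rng(0)
u = 1.0 + 0.05 * rng.standard_normal(1000)                 # states close to 1
Phi = np.column_stack([np.ones_like(u), u, u**4])          # columns multiply theta_0, theta_1, theta_4
print(np.linalg.svd(Phi, compute_uv=False))                # singular values spread over orders of magnitude
print(np.linalg.cond(Phi.T @ Phi))                         # badly conditioned, i.e. near-singular information
```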

  11. Regularized posterior
Recall regularization in the variational approach:
Variational:  (θ̂, û_{1:N}) = argmin_{(θ, u_{1:N})} C_{λ, y_{1:N}}(θ, u_{1:N})
Bayesian:     p_λ(θ, u_{1:N} | y_{1:N}) ∝ p(θ)^λ p(y_{1:N} | u_{1:N}) p(u_{1:N} | θ)
where
  C_{λ, y_{1:N}}(θ, u_{1:N}) = λ log p(θ) + log[ p(y_{1:N} | u_{1:N}) p(u_{1:N} | θ) ]
                               (regularization)          (likelihood)
                             = λ ( log p(θ) + (1/λ) log[ p(y_{1:N} | u_{1:N}) p(u_{1:N} | θ) ] )
λ = 1: standard posterior, which as N → ∞ behaves like the likelihood (Bernstein–von Mises theorem)
λ = N: regularized posterior  p_λ(θ, u_{1:N} | y_{1:N}) ∝ p(θ) [ p(y_{1:N} | u_{1:N}) p(u_{1:N} | θ) ]^{1/N}
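In sampling terms, the only change relative to the standard posterior is that the likelihood contribution to the log-target is divided by λ. Here is a minimal sketch, reusing the same placeholder model quantities as in the earlier log-posterior sketch (g, H, Q, R, Gaussian prior on θ); setting lam = 1 recovers the standard posterior and lam = N the regularized one.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def log_regularized_posterior(theta, U, y, g, H, Q, R, prior_mean, prior_cov, lam):
    """log p_lambda(theta, u_{1:N} | y_{1:N}) up to a constant:
    log p(theta) + (1/lam) * log[ p(y_{1:N} | u_{1:N}) p(u_{1:N} | theta) ]  (sketch).
    """
    log_lik = 0.0
    for n in range(1, len(U)):
        log_lik += mvn.logpdf(U[n], g(theta, U[n - 1]), Q)   # p(u_n | u_{n-1}, theta)
        log_lik += mvn.logpdf(y[n], H @ U[n], R)             # p(y_n | u_n)
    return mvn.logpdf(theta, prior_mean, prior_cov) + log_lik / lam
```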

  12. Parameter estimation
[Figure: marginal posteriors of θ_0, θ_1, θ_4 with the true value, posterior mean, and prior marked; the posterior stays close to the prior]
Errors in 100 simulations:
                   θ_0             θ_1            θ_4
  Posterior mean   -0.44 ± 0.58    0.09 ± 0.42    0.11 ± 0.20
  MAP              -0.32 ± 0.61    0.02 ± 0.42    0.03 ± 0.21

  13. State estimation
Unobserved node: large spread
Observed node: noise filtered

  14. State estimation
Unobserved node: large spread; observed node: noise filtered
[Figures: posterior histograms of the state at time steps 20, 60, and 100, with the true value and posterior mean marked; the unobserved node has a wide posterior (state roughly 0.9 to 1.1), the observed node a narrow one (state roughly 0.95 to 1.05)]

  15. Observing more or fewer nodes
When more nodes are observed:
◮ state estimation becomes more accurate
◮ parameter estimation does not improve much: the posterior stays close to the prior because of the regularization

  16. Summary and discussion
Bayesian approach to jointly estimate parameters and states for
◮ a stochastic energy balance model
◮ sparse and noisy data
Ill-posed parameter estimation problem (the parameters are correlated on a low-dimensional manifold)
Introduced a regularized posterior:
◮ enables state estimation
◮ large uncertainty in parameter estimation due to the ill-posedness

  17. Open questions
1. Re-parametrization or a nonparametric approach to avoid ill-posedness?
2. How many nodes need to be observed (for a large mesh)? (theory of determining modes)

  18. Thank you!
