

  1. Relation between Five Data Assimilation Methods for a Simplistic Weakly Non-linear Problem. Trond Mannseth, Uni Research CIPR

  2. Methods
     - Randomized Maximum Likelihood (RML): simultaneous estimation, iterative updates
     - Ensemble Smoother (ES): simultaneous estimation, a single update
     - Ensemble Smoother with Multiple Data Assimilations (ESMDA): simultaneous estimation, multiple updates with inflated data covariance
     - Half-iteration Ensemble Kalman Filter (HIEnKF): sequential estimation, a single update
     - Half-iteration Ensemble Kalman Filter with MDA (HIEnKFMDA): sequential estimation, multiple updates with inflated data covariance
     (A code sketch of the ES/ESMDA update schemes follows below.)
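To make the ES/ESMDA distinction concrete, here is a minimal sketch of an ESMDA loop in Python/NumPy; it is my illustration, not code from the talk. ES is recovered as the special case of a single update with no covariance inflation. The forward model `g`, ensemble `X`, and data `d` are placeholders.

```python
import numpy as np

def esmda_update(X, g, d, C_d, alphas, rng):
    """Minimal ESMDA sketch: X is the (n_x, n_e) parameter ensemble,
    g maps a parameter vector to predicted data, d is the data vector,
    C_d the data-error covariance. The inflation factors alphas must
    satisfy sum(1/a for a in alphas) = 1; ES is the case alphas = [1.0]."""
    n_e = X.shape[1]
    L = np.linalg.cholesky(C_d)
    for a in alphas:
        Y = np.column_stack([g(X[:, j]) for j in range(n_e)])
        # Perturb the data with the inflated covariance a * C_d
        D = d[:, None] + np.sqrt(a) * (L @ rng.standard_normal((len(d), n_e)))
        # Ensemble estimates of C_xy and C_yy
        Xc = X - X.mean(axis=1, keepdims=True)
        Yc = Y - Y.mean(axis=1, keepdims=True)
        C_xy = Xc @ Yc.T / (n_e - 1)
        C_yy = Yc @ Yc.T / (n_e - 1)
        # Kalman-type update with inflated data covariance
        K = C_xy @ np.linalg.inv(C_yy + a * C_d)
        X = X + K @ (D - Y)
    return X
```

The sequential variants (HIEnKF, HIEnKFMDA) would instead assimilate the data one at a time; that difference in data handling is what the talk isolates.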

  3. Motivation
     - The methods are equivalent for Gauss-linear problems
     - The methods behave differently for non-linear problems
     - RML is iterative, ES is purely non-iterative, while ESMDA, HIEnKF and HIEnKFMDA 'lie somewhere in between'
     - Systematic differences for weakly non-linear problems?

  4. Investigation
     - Compare the methods on a simplistic, weakly non-linear parameter estimation problem
     - Focus on differences in data handling; remove other differences
     - Asymptotic calculations (under additional assumptions) to first order in the non-linearity strength
     - Numerical calculations with the full methods and relaxed assumptions

  5. Simplistic Weakly Non-linear Parameter Estimation Problem
     Estimate $x$ from $d$, where
     $d = (d_1 \dots d_D)^T$, $d_i = y_i(x_{\mathrm{ref}}) + \epsilon_i$, $\epsilon_i \sim N(0, \sigma_i^2)$,
     $y_i(x) = \sum_{m=1}^{M} c_{im}\, x_m^{1+n_{im}}$, with $|n_{im}|$ 'not too big'.
     (A data-generation sketch follows below.)
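As a concrete, entirely illustrative reading of this setup, the sketch below generates synthetic data of this form; the dimensions, coefficient ranges, and noise level are my assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
D, M = 10, 2                            # number of data and parameters (illustrative)
x_ref = rng.uniform(0.5, 2.0, M)        # reference parameters (assumed positive)
c = rng.uniform(0.5, 1.5, (D, M))       # coefficients c_im (assumed)
n = 0.1 * rng.standard_normal((D, M))   # exponents n_im, kept 'not too big'
sigma = 0.01 * np.ones(D)               # data-error standard deviations (assumed)

def y(x):
    # y_i(x) = sum_m c_im * x_m**(1 + n_im)
    return np.sum(c * x[None, :] ** (1.0 + n), axis=1)

d = y(x_ref) + sigma * rng.standard_normal(D)  # d_i = y_i(x_ref) + eps_i
```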

  6. Focus on Differences in Data Handling – Remove Other Differences
     - Equip ES (and ESMDA, HIEnKF, HIEnKFMDA) with local gains
     - Kalman gain (global): $K = C_{xy}(C_{yy} + C_d)^{-1}$
     - Replace $C_{xy}$ by $C_x G_e^T$ and $C_{yy}$ by $G_e C_x G_e^T$
     - $\rightarrow$ Local gains: $K_e = C_x G_e^T (G_e C_x G_e^T + C_d)^{-1}$ (see the sketch below)
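In matrix terms the substitution is a one-liner; the sketch below (my illustration, with placeholder matrices) spells it out. Here `G_e` plays the role of a per-member model sensitivity.

```python
import numpy as np

def global_gain(C_xy, C_yy, C_d):
    # K = C_xy (C_yy + C_d)^{-1}
    return C_xy @ np.linalg.inv(C_yy + C_d)

def local_gain(C_x, G_e, C_d):
    # Replace C_xy -> C_x G_e^T and C_yy -> G_e C_x G_e^T, giving
    # K_e = C_x G_e^T (G_e C_x G_e^T + C_d)^{-1}
    CG = C_x @ G_e.T
    return CG @ np.linalg.inv(G_e @ CG + C_d)
```

For a linear forward model with matrix `G_e` the two gains coincide; the point of the substitution, per the slide title, is to remove differences between the methods other than how they handle the data.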

  7. Additional Assumptions Facilitating Asymptotic Calculations
     I consider the updates of a single (arbitrary) ensemble member.
     Assumptions:
     - Univariate $x$ $\rightarrow$ $y_i(x) = x^{1+n_i}$
     - Negligible data error $\rightarrow$ $d_i = y_i(x_{\mathrm{ref}})$

  8. Additional Assumptions (continued)
     So far: $|n_1| \wedge |n_2|$ 'not too big'. Now assume $n_1 = n_2 = n$ with $|n| \ll 1$.
     [Figure: the two response curves $y_1(x)$ and $y_2(x)$ with data $d_1$, $d_2$ at $x_{\mathrm{ref}}$ collapse to a single curve $y(x)$ with datum $d$.]

  9. Results from Asymptotic Calculations to $O(n)$
     $x_{\mathrm{RML}} = d - (d \ln d)\,n + O(n^2)$ (a short derivation follows below)
     Define $\Delta_{\mathrm{method}} = |x_{\mathrm{method}} - x_{\mathrm{RML}}|$ and
     $Q = |d(\ln d - \ln x_{\mathrm{prior}}) - (d - x_{\mathrm{prior}})|$. Then:
     $\Delta_{\mathrm{ES}} = Qn + O(n^2)$
     $\Delta_{\mathrm{ESMDA}} = A^{-1}Qn + O(n^2)$
     $\Delta_{\mathrm{HIEnKF}} = D^{-1}Qn + O(n^2)$
     $\Delta_{\mathrm{HIEnKFMDA}} = (AD)^{-1}Qn + O(n^2)$
     (Here $D$ is the number of data; $A$ is, as the ESMDA2/ESMDA4 labels in the next figure suggest, the number of MDA steps.)
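The RML expression can be checked by hand. In the negligible-data-error limit assumed above, the RML estimate reduces (up to the vanishing prior weight) to the exact data match $x^{1+n} = d$, and expanding in $n$ gives the slide's formula; a sketch, under those assumptions:

```latex
x_{\mathrm{RML}} = d^{1/(1+n)}
                 = \exp\!\left(\tfrac{\ln d}{1+n}\right)
                 = \exp\!\big((1 - n + O(n^2))\ln d\big)
                 = d\,e^{-n\ln d + O(n^2)}
                 = d - (d\ln d)\,n + O(n^2).
```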

  10. Results from Asymptotic Calculations to $O(n)$ (continued)
      To leading order: $\Delta_{\mathrm{ES}} \approx Qn$, $\Delta_{\mathrm{ESMDA}} \approx A^{-1}Qn$, $\Delta_{\mathrm{HIEnKF}} \approx D^{-1}Qn$, $\Delta_{\mathrm{HIEnKFMDA}} \approx (AD)^{-1}Qn$.
      [Figure: $\Delta_{\mathrm{method}}$ versus the number of data $D$ (0 to 20) for dES, dHIEnKF, dESMDA2, dESMDA4, dHIEnKFMDA2, dHIEnKFMDA4.]

  11. Numerical Results with Full Methods
      [Figure: as above, but computed with the full methods; $\Delta_{\mathrm{method}}$ versus $D$ (0 to 20) for dES, dHIEnKF, dESMDA2, dESMDA4, dHIEnKFMDA2, dHIEnKFMDA4.]
      The 'ranking' of the methods is stable for $n \in [-0.5, 5]$.

  12. Numerical Calculations with Full Methods and Relaxed Assumptions
      $d = (d_1 \dots d_D)^T$, $d_i = y_i(x_{\mathrm{ref}}) + \epsilon_i$, $\epsilon_i \sim N(0, \sigma_i^2)$,
      $y_i(x) = \sum_{m=1}^{M} c_{im}\, x_m^{1+n_{im}}$, with $\bar{n}_{im} = 0.4$
      $M = 1, 2, 5$.
      Draw 300 realizations of $x_{\mathrm{ref}}$, $x_{\mathrm{prior}}$, $n_{im}$, $c_{im}$ (a sketch of the experiment loop follows below).
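A hedged sketch of the Monte Carlo loop as I read it. The slides do not specify the distributions for the draws, so the uniform/normal choices below are assumptions; only the 300 realizations, the mean exponent 0.4, and M in {1, 2, 5} come from the slide.

```python
import numpy as np

rng = np.random.default_rng(1)
n_real, D = 300, 20  # 300 realizations; up to 20 data, matching the plots

for M in (1, 2, 5):
    for _ in range(n_real):
        x_ref = rng.uniform(0.5, 2.0, M)                  # assumed distribution
        x_prior = rng.uniform(0.5, 2.0, M)                # assumed distribution
        c = rng.uniform(0.5, 1.5, (D, M))                 # assumed distribution
        n_exp = 0.4 + 0.1 * rng.standard_normal((D, M))   # mean n_im = 0.4 (slide)
        d = np.sum(c * x_ref[None, :] ** (1.0 + n_exp), axis=1)  # noise-free data
        # Run each method on d (e.g. esmda_update above), record
        # Delta_method = |x_method - x_RML|, and finally report both an
        # arbitrary realization and the mean over realizations versus D.
```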

  13. Numerical Results with Full Methods and Relaxed Assumptions
      [Figures for M = 1, 2, and 5, each with two panels ('Arbitrary realization' and 'Mean'): $\Delta_{\mathrm{method}}$ versus $D$ (0 to 20) for dES, dHIEnKF, dESMDA2, dESMDA4, dHIEnKFMDA2, dHIEnKFMDA4.]

  14. Summary
      Compared five different ways to assimilate data on a simplistic, weakly non-linear parameter estimation problem.
