Relation between Five Data Assimilation Methods for a Simplistic Weakly Non-linear Problem


SLIDE 1

Relation between Five Data Assimilation Methods for a Simplistic Weakly Non-linear Problem

Trond Mannseth

Uni Research CIPR

SLIDES 2-7

Methods

- Randomized Maximum Likelihood (RML): simultaneous estimation, iterative updates
- Ensemble Smoother (ES): simultaneous estimation, a single update
- Ensemble Smoother with Multiple Data Assimilations (ESMDA): simultaneous estimation, multiple updates with inflated data covariance
- Half-iteration Ensemble Kalman Filter (HIEnKF): sequential estimation, a single update
- Half-iteration Ensemble Kalman Filter with MDA (HIEnKFMDA): sequential estimation, multiple updates with inflated data covariance
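The update schemes above can be sketched in code. Below is a minimal ensemble-based sketch of ES and ESMDA (ES is the Na = 1 case); the ensemble shapes, the constant inflation factor alpha = Na (one common choice), and the example forward model are assumptions for illustration, not details from the talk.

```python
import numpy as np

def kalman_gain(X, Y, C):
    # Ensemble estimate of K = Cxy (Cyy + C)^{-1}
    Ne = X.shape[1]
    Xc = X - X.mean(axis=1, keepdims=True)
    Yc = Y - Y.mean(axis=1, keepdims=True)
    Cxy = Xc @ Yc.T / (Ne - 1)
    Cyy = Yc @ Yc.T / (Ne - 1)
    return Cxy @ np.linalg.inv(Cyy + C)

def esmda(X, forward, d, Cd, Na, rng):
    # ESMDA: Na updates, each with the data covariance inflated to Na*Cd
    # and the observed data re-perturbed accordingly; Na = 1 recovers ES.
    for _ in range(Na):
        Y = forward(X)
        D = d[:, None] + np.sqrt(Na) * rng.multivariate_normal(
            np.zeros(len(d)), Cd, X.shape[1]).T
        K = kalman_gain(X, Y, Na * Cd)
        X = X + K @ (D - Y)
    return X
```

For a gauss-linear forward model the Na inflated updates compose to the single ES update in distribution, which is the equivalence noted under Motivation below.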

SLIDES 8-12

Motivation

- Methods are equivalent for Gauss-linear problems
- Methods behave differently for non-linear problems
- RML is iterative, ES is purely non-iterative, while ESMDA, HIEnKF and HIEnKFMDA 'lie somewhere in between'
- Are there systematic differences for weakly non-linear problems?

SLIDES 13-17

Investigation

- Compare the methods on a simplistic, weakly non-linear parameter estimation problem
- Focus on differences in data handling – remove other differences
- Asymptotic calculations (under additional assumptions) to first order in the non-linearity strength
- Numerical calculations with the full methods and relaxed assumptions

SLIDES 18-22

Simplistic Weakly Non-linear Parameter Estimation Problem

Estimate x from d, where

    d = (d_1, ..., d_D)^T,
    d_i = y_i(x_ref) + ε_i;  ε_i ~ N(0, σ_i²),
    y_i(x) = Σ_{m=1}^{M} c_{im} x_m^{1+n_{im}};  |n_{im}| 'not too big'
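A minimal sketch of generating synthetic data for this problem; the dimensions, the coefficient and exponent distributions, and the noise level below are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
D, M = 10, 2                         # number of data and of parameters (assumed)
c = rng.uniform(0.5, 1.5, (D, M))    # coefficients c_im (assumed distribution)
n = rng.uniform(-0.2, 0.2, (D, M))   # exponents n_im, |n_im| 'not too big'
sigma = 0.1 * np.ones(D)             # data-error standard deviations (assumed)

def y(x):
    # forward model y_i(x) = sum_m c_im * x_m^(1 + n_im)
    return np.sum(c * x[None, :] ** (1.0 + n), axis=1)

x_ref = rng.uniform(1.0, 2.0, M)     # positive reference ('true') parameters
d = y(x_ref) + sigma * rng.standard_normal(D)   # d_i = y_i(x_ref) + eps_i
```

Keeping x positive avoids complex values from the fractional powers x^(1+n).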

SLIDES 23-27

Focus on Differences in Data Handling – Remove Other Differences

Equip ES (and ESMDA, HIEnKF, HIEnKFMDA) with local gains:

    Kalman gain (global): K = C_xy (C_yy + C_d)^{-1}

Replace C_xy by C_x G_e^T and C_yy by G_e C_x G_e^T:

    → Local gains: K_e = C_x G_e^T (G_e C_x G_e^T + C_d)^{-1}
SLIDES 28-31

Additional Assumptions Facilitating Asymptotic Calculations

I consider the updates of a single (arbitrary) ensemble member.

Assumptions:
- Univariate x → y_i(x) = x^{1+n_i}
- Negligible data error → d_i = y_i(x_ref)

SLIDES 32-34

Additional Assumptions . . . (continued)

So far: |n_1| ∧ |n_2| 'not too big'
I assume: n_1 = n_2 = n,  |n| ≪ 1

[Figure: curves y_1(x), y_2(x) and data d_1, d_2 at x_ref, before and after the assumption n_1 = n_2 = n]

SLIDES 35-43

Results from Asymptotic Calculations to O(n)

    x_RML = d − (d ln d) n + O(n²)

Define Δ_method = |x_method − x_RML| and

    Q = |d (ln d − ln x_prior) − (d − x_prior)|.

Then:

    Δ_ES        = Q n + O(n²)
    Δ_ESMDA     = A⁻¹ Q n + O(n²)
    Δ_HIEnKF    = D⁻¹ Q n + O(n²)
    Δ_HIEnKFMDA = (AD)⁻¹ Q n + O(n²)

[Figure: Δ versus D (2 to 20) for ES, HIEnKF, ESMDA2, ESMDA4, HIEnKFMDA2, HIEnKFMDA4]
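Under the stated assumptions (univariate x, negligible data error), RML matches the data exactly: x^(1+n) = d, so x_RML = d^{1/(1+n)}, whose first-order expansion is the d − (d ln d) n above. A quick numerical check (the value d = 2 is illustrative):

```python
import numpy as np

d = 2.0
for n in (0.1, 0.01, 0.001):
    exact = d ** (1.0 / (1.0 + n))       # x_RML solving x^(1+n) = d
    first_order = d - d * np.log(d) * n  # d - (d ln d) n
    # the remainder shrinks roughly like n^2, consistent with O(n^2)
    print(f"n={n}: |exact - first_order| = {abs(exact - first_order):.2e}")
```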

SLIDES 44-45

Numerical Results with Full Methods

[Figure: Δ versus D (2 to 20) for ES, HIEnKF, ESMDA2, ESMDA4, HIEnKFMDA2, HIEnKFMDA4]

'Ranking' stable for n ∈ [−0.5, 5]

SLIDES 46-48

Numerical Calculations with Full Methods and Relaxed Assumptions

    d = (d_1, ..., d_D)^T,
    d_i = y_i(x_ref) + ε_i;  ε_i ~ N(0, σ_i²),
    y_i(x) = Σ_{m=1}^{M} c_{im} x_m^{1+n_{im}};  n̄_{im} = 0.4

M = 1, 2, 5.

Draw 300 realizations of x_ref, x_prior, n_im, c_im

SLIDES 49-51

Numerical Results with Full Methods and Relaxed Assumptions

[Figures: Δ versus D (2 to 20) for ES, HIEnKF, ESMDA2, ESMDA4, HIEnKFMDA2, HIEnKFMDA4; panels for M = 1, 2 and 5, each showing an arbitrary realization and the mean over realizations]

SLIDES 52-55

Summary

- Compared five different ways to assimilate data on a simplistic, weakly non-linear parameter estimation problem
- Asymptotic calculations to first order in the non-linearity strength (relying on further simplifications) reveal the nature of the similarity with iterative methods for the (local-gain) ESMDA, HIEnKF and HIEnKFMDA methods
- Numerical results with the full (local-gain) methods and relaxed assumptions support the asymptotic calculations for low values of M