

  1. Variational approach to data assimilation: optimization aspects and adjoint method
     Eric Blayo
     University of Grenoble and INRIA

  2. Objectives
     - introduce data assimilation as an optimization problem
     - discuss the different forms of the objective functions
     - discuss their properties w.r.t. optimization
     - introduce the adjoint technique for the computation of the gradient
     Link with statistical methods: cf. lectures by E. Cosme
     Variational data assimilation algorithms, tangent and adjoint codes: cf. lectures by M. Nodet and A. Vidard
     E. Blayo - Variational approach to data assimilation

  3. Outline
     Introduction: model problem
     Definition and minimization of the cost function
     The adjoint method

  4. Introduction: model problem
     Two different available measurements of a single quantity. Which estimation of its true value?
     → least squares approach

  5. Introduction: model problem
     Two different available measurements of a single quantity. Which estimation of its true value?
     → least squares approach
     Example: 2 obs y₁ = 19 °C and y₂ = 21 °C of the (unknown) present temperature x.
     - Let J(x) = ½ [(x − y₁)² + (x − y₂)²]
     - minₓ J(x) → x̂ = (y₁ + y₂)/2 = 20 °C
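As a quick numerical check of the example on this slide (plain Python, not part of the original lecture; the observation values are the ones on the slide):

```python
y1, y2 = 19.0, 21.0  # the two temperature observations (°C)

def J(x):
    """Least-squares cost J(x) = 1/2 [(x - y1)^2 + (x - y2)^2]."""
    return 0.5 * ((x - y1) ** 2 + (x - y2) ** 2)

# Setting J'(x) = (x - y1) + (x - y2) = 0 gives the minimizer:
x_hat = (y1 + y2) / 2
print(x_hat)  # 20.0
```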

  6. Introduction: model problem
     If the units differ: y₁ = 66.2 °F and y₂ = 69.8 °F
     Observation operator: let H(x) = (9/5) x + 32
     - J(x) = ½ [(H(x) − y₁)² + (H(x) − y₂)²]
     - minₓ J(x) → x̂ = 20 °C

  7. Introduction: model problem
     If the units differ: y₁ = 66.2 °F and y₂ = 69.8 °F
     Observation operator: let H(x) = (9/5) x + 32
     - J(x) = ½ [(H(x) − y₁)² + (H(x) − y₂)²]
     - minₓ J(x) → x̂ = 20 °C
     Drawback #1: if observation units are inhomogeneous
     y₁ = 66.2 °F and y₂ = 21 °C
     - J(x) = ½ [(H(x) − y₁)² + (x − y₂)²] → x̂ = 19.47 °C !!

  8. Introduction: model problem
     If the units differ: y₁ = 66.2 °F and y₂ = 69.8 °F
     Observation operator: let H(x) = (9/5) x + 32
     - J(x) = ½ [(H(x) − y₁)² + (H(x) − y₂)²]
     - minₓ J(x) → x̂ = 20 °C
     Drawback #1: if observation units are inhomogeneous
     y₁ = 66.2 °F and y₂ = 21 °C
     - J(x) = ½ [(H(x) − y₁)² + (x − y₂)²] → x̂ = 19.47 °C !!
     Drawback #2: if observation accuracies are inhomogeneous
     If y₁ is twice as accurate as y₂, one should obtain x̂ = (2y₁ + y₂)/3 = 19.67 °C
     → J should weight each term by the inverse error variance:
       J(x) = ½ [(x − y₁)²/σ₁² + (x − y₂)²/σ₂²] with σ₁² = 1/2 and σ₂² = 1
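The weighted cost above can be checked numerically. Here "twice as accurate" is encoded, as an assumption consistent with the slide's result x̂ = 19.67 °C, by error variances in ratio 1:2:

```python
y1, y2 = 19.0, 21.0        # observations, both in °C
s1_sq, s2_sq = 0.5, 1.0    # assumed error variances: y1 twice as accurate as y2

def J(x):
    """Variance-weighted least-squares cost."""
    return 0.5 * ((x - y1) ** 2 / s1_sq + (x - y2) ** 2 / s2_sq)

# J'(x) = 0 gives the variance-weighted average, here (2*y1 + y2)/3:
x_hat = (y1 / s1_sq + y2 / s2_sq) / (1 / s1_sq + 1 / s2_sq)
print(round(x_hat, 2))  # 19.67
```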

  9. Model problem: general form
     Minimize J(x) = ½ [(H₁(x) − y₁)²/σ₁² + (H₂(x) − y₂)²/σ₂²]

 10. Model problem: general form
     Minimize J(x) = ½ [(H₁(x) − y₁)²/σ₁² + (H₂(x) − y₂)²/σ₂²]
     If H₁ = H₂ = Id: J(x) = ½ (x − y₁)²/σ₁² + ½ (x − y₂)²/σ₂²
     which leads to x̂ = (y₁/σ₁² + y₂/σ₂²) / (1/σ₁² + 1/σ₂²)  (weighted average)

 11. Model problem: general form
     Minimize J(x) = ½ [(H₁(x) − y₁)²/σ₁² + (H₂(x) − y₂)²/σ₂²]
     If H₁ = H₂ = Id: J(x) = ½ (x − y₁)²/σ₁² + ½ (x − y₂)²/σ₂²
     which leads to x̂ = (y₁/σ₁² + y₂/σ₂²) / (1/σ₁² + 1/σ₂²)  (weighted average)
     Remark: J″(x̂) = 1/σ₁² + 1/σ₂² = [Var(x̂)]⁻¹  (cf. BLUE)
     (the left-hand side measures the convexity of J, the right-hand side the accuracy of x̂)
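The remark that J″(x̂) equals the inverse variance of the estimate can be verified with a central finite-difference second derivative; the variances below are arbitrary illustrative values:

```python
y1, y2 = 19.0, 21.0
s1_sq, s2_sq = 0.3, 0.7    # arbitrary error variances for illustration

def J(x):
    return 0.5 * ((x - y1) ** 2 / s1_sq + (x - y2) ** 2 / s2_sq)

# minimizer (weighted average) and finite-difference estimate of J''
x_hat = (y1 / s1_sq + y2 / s2_sq) / (1 / s1_sq + 1 / s2_sq)
h = 1e-4
J2 = (J(x_hat + h) - 2 * J(x_hat) + J(x_hat - h)) / h ** 2
# J'' = 1/sigma1^2 + 1/sigma2^2 (exactly, since J is quadratic)
print(abs(J2 - (1 / s1_sq + 1 / s2_sq)) < 1e-3)  # True
```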

 12. Model problem: alternative formulation, background + observation
     If one considers that y₁ is a prior (or background) estimate x_b for x, and y₂ = y is an independent observation, then:
     J(x) = ½ (x − x_b)²/σ_b² + ½ (x − y)²/σ_o²  (= J_b + J_o)
     and
     x̂ = (x_b/σ_b² + y/σ_o²) / (1/σ_b² + 1/σ_o²) = x_b + [σ_b²/(σ_b² + σ_o²)] (y − x_b)
     where σ_b²/(σ_b² + σ_o²) is the gain and (y − x_b) the innovation
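The equivalence between the weighted-average form and the "background + gain × innovation" form can be checked directly; the numbers below are illustrative assumptions:

```python
xb, y = 19.0, 21.0          # background estimate and new observation (°C)
sb_sq, so_sq = 1.0, 1.0     # assumed background / observation error variances

# weighted-average form
x_avg = (xb / sb_sq + y / so_sq) / (1 / sb_sq + 1 / so_sq)

# gain / innovation form
gain = sb_sq / (sb_sq + so_sq)
innovation = y - xb
x_hat = xb + gain * innovation

print(x_avg, x_hat)  # 20.0 20.0
```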

 13. Outline
     Introduction: model problem
     Definition and minimization of the cost function
       Least squares problems
       Linear (time independent) problems
     The adjoint method

 14. Outline
     Introduction: model problem
     Definition and minimization of the cost function
       Least squares problems
       Linear (time independent) problems
     The adjoint method

 15. Least squares problems: generalization to an arbitrary number of unknowns and observations
     To be estimated: x = (x₁, …, xₙ)ᵀ ∈ ℝⁿ
     Observations: y = (y₁, …, y_p)ᵀ ∈ ℝᵖ
     Observation operator: y ≡ H(x), with H : ℝⁿ → ℝᵖ

 16. A simple example of observation operator
     If x = (x₁, x₂, x₃, x₄)ᵀ and y = (an observation of (x₁ + x₂)/2, an observation of x₄)ᵀ
     then H(x) = Hx with
     H = [ 1/2  1/2  0  0 ]
         [  0    0   0  1 ]
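Applying this H to a concrete state vector (the state values below are made up for illustration):

```python
import numpy as np

x = np.array([1.0, 3.0, 5.0, 7.0])      # state x1..x4
H = np.array([[0.5, 0.5, 0.0, 0.0],     # first row observes (x1 + x2)/2
              [0.0, 0.0, 0.0, 1.0]])    # second row observes x4
y = H @ x
print(y)  # [2. 7.]
```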

 17. Generalization: arbitrary number of unknowns and observations
     To be estimated: x = (x₁, …, xₙ)ᵀ ∈ ℝⁿ
     Observations: y = (y₁, …, y_p)ᵀ ∈ ℝᵖ
     Observation operator: y ≡ H(x), with H : ℝⁿ → ℝᵖ
     Cost function: J(x) = ½ ‖H(x) − y‖², with ‖·‖ to be chosen.

 18. Reminder: norms and scalar products
     u = (u₁, …, uₙ)ᵀ ∈ ℝⁿ
     - Euclidean norm: ‖u‖² = uᵀu = Σᵢ uᵢ²
       Associated scalar product: (u, v) = uᵀv = Σᵢ uᵢvᵢ
     - Generalized norm: let M be a symmetric positive definite matrix
       M-norm: ‖u‖²_M = uᵀMu = Σᵢ Σⱼ mᵢⱼ uᵢ uⱼ
       Associated scalar product: (u, v)_M = uᵀMv = Σᵢ Σⱼ mᵢⱼ uᵢ vⱼ
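Both norms can be evaluated numerically; u and M below are illustrative values (M is symmetric positive definite):

```python
import numpy as np

u = np.array([1.0, 2.0])
M = np.array([[2.0, 0.5],
              [0.5, 1.0]])   # symmetric, both eigenvalues > 0

eucl_sq = u @ u              # ||u||^2   = u^T u
m_sq = u @ M @ u             # ||u||_M^2 = u^T M u
print(eucl_sq, m_sq)  # 5.0 8.0
```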

 19. Generalization: arbitrary number of unknowns and observations
     To be estimated: x = (x₁, …, xₙ)ᵀ ∈ ℝⁿ
     Observations: y = (y₁, …, y_p)ᵀ ∈ ℝᵖ
     Observation operator: y ≡ H(x), with H : ℝⁿ → ℝᵖ
     Cost function: J(x) = ½ ‖H(x) − y‖², with ‖·‖ to be chosen.

 20. Generalization: arbitrary number of unknowns and observations
     To be estimated: x = (x₁, …, xₙ)ᵀ ∈ ℝⁿ
     Observations: y = (y₁, …, y_p)ᵀ ∈ ℝᵖ
     Observation operator: y ≡ H(x), with H : ℝⁿ → ℝᵖ
     Cost function: J(x) = ½ ‖H(x) − y‖², with ‖·‖ to be chosen.
     (Intuitive) necessary (but not sufficient) condition for the existence of a unique minimum: p ≥ n

 21. Formalism "background value + new observations"
     Y = (x_b, y)ᵀ, where x_b is the background and y the new observations
     The cost function becomes:
     J(x) = ½ ‖x − x_b‖²_b + ½ ‖H(x) − y‖²_o = J_b + J_o

 22. Formalism "background value + new observations"
     Y = (x_b, y)ᵀ, where x_b is the background and y the new observations
     The cost function becomes:
     J(x) = ½ ‖x − x_b‖²_b + ½ ‖H(x) − y‖²_o = J_b + J_o
          = ½ (x − x_b)ᵀ B⁻¹ (x − x_b) + ½ (H(x) − y)ᵀ R⁻¹ (H(x) − y)

 23. Formalism "background value + new observations"
     Y = (x_b, y)ᵀ, where x_b is the background and y the new observations
     The cost function becomes:
     J(x) = ½ ‖x − x_b‖²_b + ½ ‖H(x) − y‖²_o = J_b + J_o
          = ½ (x − x_b)ᵀ B⁻¹ (x − x_b) + ½ (H(x) − y)ᵀ R⁻¹ (H(x) − y)
     The necessary condition for the existence of a unique minimum (p ≥ n) is automatically fulfilled.
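A tiny numerical sketch of this cost function, assuming a linear H so that the minimizer has the closed form x̂ = x_b + K(y − H x_b) with gain K = B Hᵀ (H B Hᵀ + R)⁻¹ (cf. BLUE); all sizes and matrices below are illustrative assumptions, not from the slides:

```python
import numpy as np

# n = 2 unknowns but only p = 1 observation: alone, p < n would give no
# unique minimum; the background term restores uniqueness.
xb = np.array([1.0, 2.0])    # background
B = np.eye(2)                # background error covariance (assumed)
y = np.array([4.0])          # single observation
R = np.array([[1.0]])        # observation error covariance (assumed)
H = np.array([[1.0, 1.0]])   # linear observation operator: observes x1 + x2

def J(x):
    """J(x) = 1/2 (x - xb)^T B^-1 (x - xb) + 1/2 (Hx - y)^T R^-1 (Hx - y)."""
    db, do = x - xb, H @ x - y
    return 0.5 * db @ np.linalg.inv(B) @ db + 0.5 * do @ np.linalg.inv(R) @ do

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain
x_hat = xb + K @ (y - H @ xb)                  # analysis
print(x_hat, J(x_hat) < J(xb))                 # J decreases at the minimizer
```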
