MATH 676: Finite element methods in scientific computing


  1. MATH 676 – Finite element methods in scientific computing. Wolfgang Bangerth, Texas A&M University. http://www.dealii.org/

  2. Lecture 17.75: Generating adaptively refined meshes: “A posteriori” error estimators

  3. Adaptive mesh refinement (AMR). Adaptive mesh refinement happens in a loop: SOLVE → ESTIMATE → MARK → REFINE ● SOLVE: Assemble the linear system, solve it ● ESTIMATE: Compute a refinement indicator for each cell ● MARK: Determine which cells should be refined ● REFINE: Refine these cells (and possibly others)
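The four phases of the loop can be sketched for a 1D model problem $-u'' = f$ on $(0,1)$ with $u(0)=u(1)=0$ and piecewise-linear elements. This is a simplified illustration in Python, not deal.II code; all function names (`solve`, `estimate`, `mark`, `refine`) and the choice of right-hand side are made up for the sketch, and the indicator is a crude derivative-jump measure.

```python
# Sketch of the SOLVE -> ESTIMATE -> MARK -> REFINE loop for -u'' = f
# on (0,1) with u(0) = u(1) = 0 and P1 elements (illustration only).
import numpy as np

def solve(nodes, f):
    """Assemble and solve the P1 stiffness system on the given 1D mesh."""
    n = len(nodes)
    h = np.diff(nodes)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for k in range(n - 1):                      # loop over cells K = [x_k, x_{k+1}]
        A[k:k+2, k:k+2] += np.array([[1.0, -1.0], [-1.0, 1.0]]) / h[k]
        mid = 0.5 * (nodes[k] + nodes[k + 1])
        b[k:k+2] += f(mid) * h[k] / 2           # midpoint quadrature for (f, phi_i)
    A[0, :] = 0; A[-1, :] = 0                   # impose u(0) = u(1) = 0
    A[0, 0] = 1; A[-1, -1] = 1
    b[0] = 0; b[-1] = 0
    return np.linalg.solve(A, b)

def estimate(nodes, u):
    """Crude indicator: jumps of u_h' at the endpoints of each cell."""
    du = np.diff(u) / np.diff(nodes)            # u_h' is constant per cell
    jumps = np.abs(np.diff(du))                 # |[u_h']| at interior nodes
    eta = np.zeros(len(nodes) - 1)
    eta[:-1] += jumps                           # each interior face contributes
    eta[1:] += jumps                            # to both adjacent cells
    return eta

def mark(eta, frac=0.3):
    """Mark the fraction of cells carrying the largest indicators."""
    return np.argsort(eta)[::-1][: max(1, int(frac * len(eta)))]

def refine(nodes, marked):
    """Bisect every marked cell."""
    mids = 0.5 * (nodes[marked] + nodes[marked + 1])
    return np.sort(np.concatenate([nodes, mids]))

nodes = np.linspace(0.0, 1.0, 9)
f = lambda x: 100.0 * np.exp(-100.0 * (x - 0.5) ** 2)   # peak at x = 0.5
for _ in range(3):                              # three AMR cycles
    u = solve(nodes, f)
    eta = estimate(nodes, u)
    nodes = refine(nodes, mark(eta))
```

After a few cycles the mesh is noticeably finer near the peak of $f$ at $x = 0.5$, where the derivative jumps are largest.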

  4. The ESTIMATE phase: Compute a “refinement indicator” for each cell $K$: $\eta = \{\eta_{K_1}, \eta_{K_2}, \eta_{K_3}, \dots, \eta_{K_N}\}$. Recall from lecture 17.25: A “reasonable” approach is to estimate the interpolation error of the solution, which leads to the Kelly indicator: $\eta_K = h_K^{1/2} \left( \int_{\partial K} |[\nabla u_h]|^2 \right)^{1/2}$
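In 1D the face integral in the Kelly indicator degenerates to point evaluations of the derivative jumps at the two endpoints of each cell (zero on the domain boundary), so the indicator is easy to compute by hand. A minimal sketch, assuming P1 elements on a 1D mesh (illustration only, not the deal.II `KellyErrorEstimator` implementation):

```python
import numpy as np

def kelly_indicator(nodes, u):
    """eta_K = h_K^{1/2} ( int_{dK} |[u_h']|^2 )^{1/2}, specialized to 1D P1
    elements: the boundary integral over dK reduces to the derivative jumps
    at the two endpoints of the cell (taken as zero on the domain boundary)."""
    h = np.diff(nodes)
    du = np.diff(u) / h                 # u_h' is constant on each cell
    jump = np.abs(np.diff(du))          # |[u_h']| at interior nodes
    eta = np.zeros(len(h))
    for k in range(len(h)):
        jl = jump[k - 1] if k > 0 else 0.0
        jr = jump[k] if k < len(h) - 1 else 0.0
        eta[k] = np.sqrt(h[k]) * np.sqrt(jl ** 2 + jr ** 2)
    return eta

# A kink at x = 0.5 is flagged only by the two cells sharing that node.
nodes = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
u = np.abs(nodes - 0.5)                 # piecewise linear with one kink
eta = kelly_indicator(nodes, u)         # nonzero only for the middle cells
```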

  5. Ideas. The “ideal” refinement indicator would be the actual error $e_K$: $\eta_K := e_K = \|u - u_h\|_K = \left( \int_K |u - u_h|^2 \, dx \right)^{1/2}$. Questions/problems: 1. Would a different norm be more appropriate? 2. But we don't have the exact solution $u(x)$! 3. Does refining a cell decrease the overall error?

  6. Ideas. Question 3: Does refining a cell decrease the overall error? Answers: ● It depends on the norm in which we measure the error ● For interpolation, refining one cell only affects the error on this cell ● For the Galerkin projection, refining one cell affects the error everywhere. This is called “error pollution”.

  7. Ideas. Question 3: Does refining a cell decrease the overall error? Consider the Laplace equation: ● Let $T$ be a mesh, $T^+$ be any refinement ● Let $V_h$ be the FE space on $T$, $V_h^+$ be the FE space on $T^+$ ● Then $V_h \subset V_h^+$ ● Recall $E := \|\nabla(u - u_h)\| = \min_{v_h \in V_h} \|\nabla(u - v_h)\|$ and $E^+ := \|\nabla(u - u_h^+)\| = \min_{v_h \in V_h^+} \|\nabla(u - v_h)\|$ ● Consequently: $E^+ \le E$
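The nestedness argument can be checked numerically. A sketch under assumptions not in the slides: for $-u'' = \pi^2 \sin(\pi x)$ on $(0,1)$ with homogeneous Dirichlet data, the exact solution is $u = \sin(\pi x)$, and for P1 elements in 1D the Galerkin solution coincides with the nodal interpolant, so the energy error is the $H^1$-seminorm interpolation error, which we can evaluate by quadrature.

```python
import numpy as np

# Illustration: energy error of the P1 interpolant of u = sin(pi x),
# which equals the Galerkin energy error for this 1D model problem.
def energy_error(nodes):
    u = np.sin(np.pi * nodes)                    # nodal values of u_h
    err2 = 0.0
    for k in range(len(nodes) - 1):
        a, b = nodes[k], nodes[k + 1]
        s = (u[k + 1] - u[k]) / (b - a)          # constant u_h' on the cell
        xs = np.linspace(a, b, 401)
        xm = 0.5 * (xs[:-1] + xs[1:])            # midpoint quadrature points
        err2 += np.mean((np.pi * np.cos(np.pi * xm) - s) ** 2) * (b - a)
    return np.sqrt(err2)

T = np.linspace(0.0, 1.0, 5)                     # coarse mesh
Tp = np.sort(np.concatenate([T, [0.125]]))       # refine the first cell only
E, Ep = energy_error(T), energy_error(Tp)        # V_h subset V_h^+  =>  E^+ <= E
```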

  8. Ideas. Question 3: Does refining a cell decrease the overall error? But: ● It is not necessarily true that $E^+ \le E$ if $E := \|u - u_h\|$ ● The improvement may be marginal. To show that $E^+ \le \theta E$, $\theta < 1$, one needs to refine enough cells. Strategy: Dörfler marking / bulk refinement marks the cells with the largest errors so that $\sum_{K \in \text{marked cells}} \eta_K \ge \alpha \sum_{K \in T} \eta_K$, $0 < \alpha \le 1$
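Dörfler marking amounts to sorting the indicators and taking the smallest set of largest-indicator cells whose sum reaches the fraction $\alpha$ of the total. A minimal sketch (the function name `doerfler_mark` is made up; deal.II exposes this as a `GridRefinement` strategy):

```python
import numpy as np

def doerfler_mark(eta, alpha):
    """Return indices of a minimal set of cells, largest indicators first,
    whose indicator sum reaches the fraction alpha of the total sum."""
    order = np.argsort(eta)[::-1]                # cells by decreasing eta
    csum = np.cumsum(eta[order])
    m = np.searchsorted(csum, alpha * eta.sum()) + 1
    return order[:m]

eta = np.array([0.1, 0.5, 0.05, 0.3, 0.05])
marked = doerfler_mark(eta, 0.6)
# cells 1 and 3 together carry 0.8 of the total indicator mass, which
# already exceeds alpha = 0.6, so only those two are marked
```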

  9. A posteriori error estimation. Question 2: For evaluating the true error $\eta_K := e_K = \|u - u_h\|_K = \left( \int_K |u - u_h|^2 \, dx \right)^{1/2}$ we would need to know the exact solution $u(x)$. Answer: Yes. This is the central problem. We need to find formulas that help us estimate the error only in terms of $u_h(x)$. This is called “a posteriori” error estimation.

  10. A posteriori error estimation for Laplace's equation: Consider $-\Delta u = f$, $u|_{\partial\Omega} = 0$. Start with Galerkin orthogonality:
$(\nabla u, \nabla v) = (f, v) \quad \forall v \in V$
$(\nabla u_h, \nabla v_h) = (f, v_h) \quad \forall v_h \in V_h \subset V$
Subtracting the two gives
$(\nabla \underbrace{(u - u_h)}_{=:e}, \nabla v_h) = 0 \quad \forall v_h \in V_h \subset V$

  11. A posteriori error estimation for Laplace's equation: Then consider the energy norm of the error (using Galerkin orthogonality with $v_h = I_h e$ in the second step, then integrating by parts cell by cell):
$\|\nabla e\|_\Omega^2 = (\nabla e, \nabla e)_\Omega = (\nabla e, \nabla(e - I_h e))_\Omega = \sum_K (\nabla e, \nabla(e - I_h e))_K = \sum_K \left[ (-\Delta e, e - I_h e)_K + (n \cdot \nabla e, e - I_h e)_{\partial K} \right]$
Now use the following facts:
$-\Delta e = -\Delta(u - u_h) = \underbrace{-\Delta u}_{=f} + \Delta u_h = \underbrace{f + \Delta u_h}_{\text{cell residual}}$
$(e - I_h e)|_{\partial\Omega} = ((u - u_h) - (I_h u - I_h u_h))|_{\partial\Omega} = u|_{\partial\Omega} - I_h u|_{\partial\Omega} = 0$, using $u_h|_{\partial\Omega} = 0$.

  12. A posteriori error estimation for Laplace's equation: Thus:
$\|\nabla e\|_\Omega^2 = \sum_K (-\Delta e, e - I_h e)_K + (n \cdot \nabla e, e - I_h e)_{\partial K} = \sum_K (f + \Delta u_h, e - I_h e)_K + (n \cdot \nabla e, e - I_h e)_{\partial K \setminus \partial\Omega}$
The second term only integrates over interior faces. We visit each face twice (from $K$ and from its neighbor $K'$):
$\sum_K (n_K \cdot \nabla e|_K, e - I_h e)_{\partial K \setminus \partial\Omega} = \sum_K \tfrac12 (n_K \cdot \nabla e|_K, e - I_h e)_{\partial K \setminus \partial\Omega} + \tfrac12 (\underbrace{n_{K'}}_{= -n_K} \cdot \nabla e|_{K'}, e - I_h e)_{\partial K \setminus \partial\Omega} = \sum_K \tfrac12 (n_K \cdot (\nabla e|_K - \nabla e|_{K'}), e - I_h e)_{\partial K \setminus \partial\Omega}$

  13. A posteriori error estimation for Laplace's equation: Consider the face term:
$\sum_K (n_K \cdot \nabla e|_K, e - I_h e)_{\partial K \setminus \partial\Omega} = \sum_K \tfrac12 (n_K \cdot (\nabla e|_K - \nabla e|_{K'}), e - I_h e)_{\partial K \setminus \partial\Omega}$
Observe:
$\nabla e|_K - \nabla e|_{K'} = \underbrace{(\nabla u|_K - \nabla u|_{K'})}_{= 0 \text{ because } u \text{ is smooth}} - \underbrace{(\nabla u_h|_K - \nabla u_h|_{K'})}_{=: [\nabla u_h]} = -[\nabla u_h]$

  14. A posteriori error estimation for Laplace's equation: Thus:
$\|\nabla e\|_\Omega^2 = \sum_K \underbrace{(f + \Delta u_h, e - I_h e)_K}_{\text{cell residual}} - \tfrac12 \underbrace{(n \cdot [\nabla u_h], e - I_h e)_{\partial K \setminus \partial\Omega}}_{\text{“jump” residual}}$
Using the Cauchy-Schwarz inequality on cell/face integrals:
$\|\nabla e\|_\Omega^2 \le \sum_K \|f + \Delta u_h\|_K \, \|e - I_h e\|_K + \tfrac12 \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \, \|e - I_h e\|_{\partial K \setminus \partial\Omega}$

  15. A posteriori error estimation for Laplace's equation: Thus:
$\|\nabla e\|_\Omega^2 = \sum_K \underbrace{(f + \Delta u_h, e - I_h e)_K}_{\text{cell residual}} - \tfrac12 \underbrace{(n \cdot [\nabla u_h], e - I_h e)_{\partial K \setminus \partial\Omega}}_{\text{“jump” residual}} \le \sum_K \|f + \Delta u_h\|_K \, \|e - I_h e\|_K + \tfrac12 \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \, \|e - I_h e\|_{\partial K \setminus \partial\Omega}$
Because $e \in H^1$, we have the following basic interpolation estimates:
$\|e - I_h e\|_K \le C h_K \|\nabla e\|_K$
$\|e - I_h e\|_{\partial K \setminus \partial\Omega} \le C h_K^{1/2} \|\nabla e\|_K$

  16. A posteriori error estimation for Laplace's equation: Thus:
$\|\nabla e\|_\Omega^2 \le \sum_K \|f + \Delta u_h\|_K \underbrace{\|e - I_h e\|_K}_{\le C h_K \|\nabla e\|_K} + \tfrac12 \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \underbrace{\|e - I_h e\|_{\partial K \setminus \partial\Omega}}_{\le C h_K^{1/2} \|\nabla e\|_K} \le C \sum_K \left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right) \|\nabla e\|_K$
Recall the Cauchy-Schwarz inequality: $\sum_i x_i y_i \le \left( \sum_i x_i^2 \right)^{1/2} \left( \sum_i y_i^2 \right)^{1/2}$. Apply this to the sum over cells:
$\|\nabla e\|_\Omega^2 \le C \left[ \sum_K \left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right)^2 \right]^{1/2} \underbrace{\left[ \sum_K \|\nabla e\|_K^2 \right]^{1/2}}_{= \|\nabla e\|_\Omega}$
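The discrete Cauchy-Schwarz step used on the sum over cells is easy to sanity-check numerically. A trivial illustration, with random vectors standing in for the per-cell residual terms and $\|\nabla e\|_K$:

```python
import numpy as np

# Numerical check of  sum_i x_i y_i <= (sum_i x_i^2)^(1/2) (sum_i y_i^2)^(1/2):
# x_i plays the role of the per-cell residual term, y_i of ||grad e||_K.
rng = np.random.default_rng(42)
x = rng.random(50)
y = rng.random(50)
lhs = np.sum(x * y)
rhs = np.sqrt(np.sum(x ** 2)) * np.sqrt(np.sum(y ** 2))
```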

  17. A posteriori error estimation for Laplace's equation: Thus:
$\|\nabla e\|_\Omega \le C \left[ \sum_K \left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right)^2 \right]^{1/2}$
Interpretation: We have bounded the error in terms of only the (computable!) discrete solution!

  18. A posteriori error estimation for Laplace's equation: Last reformulation, squaring both sides:
$\|\nabla e\|_\Omega^2 \le C \sum_K \underbrace{\left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right)^2}_{=: \eta_K^2}$
Nomenclature: ● We call $\eta_K$ the “residual-based” error estimator for cell $K$. ● It consists of the norm of the “cell residual” and the norm of the “jump residual”/“face residual”.
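The (unsquared) quantity $\eta_K$ is directly computable from $u_h$ and $f$. A sketch for 1D P1 elements (illustration only, not deal.II code): inside each cell $\Delta u_h = 0$ for piecewise linears, and the face norm degenerates to the derivative jumps at the cell's interior endpoints; the cell $L^2$ norm of $f$ is approximated by the midpoint rule.

```python
import numpy as np

def residual_estimator(nodes, u, f):
    """eta_K = h_K ||f + Delta u_h||_K + 1/2 h_K^{1/2} ||[u_h']||_{dK\dOmega},
    specialized to 1D P1 elements: u_h'' = 0 inside each cell, and the face
    term reduces to point values of the derivative jumps (zero on dOmega)."""
    h = np.diff(nodes)
    du = np.diff(u) / h                          # u_h' (constant per cell)
    jump = np.abs(np.diff(du))                   # |[u_h']| at interior nodes
    eta = np.zeros(len(h))
    for k in range(len(h)):
        xm = 0.5 * (nodes[k] + nodes[k + 1])
        cell = h[k] * np.sqrt(f(xm) ** 2 * h[k])      # h_K ||f||_K, midpoint rule
        jl = jump[k - 1] if k > 0 else 0.0            # domain boundary: no jump
        jr = jump[k] if k < len(h) - 1 else 0.0
        face = 0.5 * np.sqrt(h[k]) * np.sqrt(jl ** 2 + jr ** 2)
        eta[k] = cell + face
    return eta
```

For a globally linear $u_h$ (no jumps) and $f = 0$ the estimator vanishes identically, as it should: the discrete solution then solves the equation exactly.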

  19. A posteriori error estimation. Note: If we approximate
$\|\nabla e\|_\Omega^2 \le C \sum_K \left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right)^2 \approx C' \sum_K h_K \|n \cdot [\nabla u_h]\|^2_{\partial K \setminus \partial\Omega}$
then we get exactly the “Kelly” estimator! Reason: For odd polynomial degrees the jump residual dominates the cell residual. But: For even polynomial degrees, it is the other way around!

  20. Some conclusions. Conclusion 1: It is possible to bound the (unknown) error exclusively in terms of the known, computed solution. For example, for the Laplace equation we get:
$\|\nabla e\|_\Omega^2 \le C \sum_K \left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right)^2$
The estimate involves “cell” and “jump” residuals. Note: Similar estimates can often be derived for other equations as well!

  21. Some conclusions. Conclusion 2: Estimates typically involve unknown constants. For example, for the Laplace equation we get:
$\|\nabla e\|_\Omega^2 \le C \sum_K \left( h_K \|f + \Delta u_h\|_K + \tfrac12 h_K^{1/2} \|n \cdot [\nabla u_h]\|_{\partial K \setminus \partial\Omega} \right)^2$
Here, the constant comes from the interpolation estimates. It is usually poorly known. Note: For other equations, there are also stability constants, and constants from yet other estimates.
