SLIDE 1

Finite Difference Method

SLIDE 2

Motivation

For a given smooth function $g(y)$, we want to calculate the derivative $g'(y)$ at a given value of $y$. Suppose we don't know how to compute the analytical expression for $g'(y)$, or it is computationally very expensive. However, we do know how to evaluate the function value. We know that:

$$g'(y) = \lim_{h \to 0} \frac{g(y+h) - g(y)}{h}$$

Can we just use

$$g'(y) \approx \frac{g(y+h) - g(y)}{h}$$

as an approximation? How do we choose $h$? Can we estimate the error of our approximation?
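A minimal sketch of this idea in Python (the test function and step size are illustrative choices, not from the slides):

```python
import math

def forward_difference(g, y, h):
    """One-sided difference quotient approximating g'(y)."""
    return (g(y + h) - g(y)) / h

# Illustrative check: g(y) = sin(y) has the known derivative cos(y).
y, h = 1.0, 1e-5
approx = forward_difference(math.sin, y, h)
print(approx, math.cos(y), abs(approx - math.cos(y)))
```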

SLIDE 3

For a differentiable function $g: \mathbb{R} \to \mathbb{R}$, the derivative is defined as:

$$g'(y) = \lim_{h \to 0} \frac{g(y+h) - g(y)}{h}$$

Taylor series centered at $y$, evaluated at $\bar{y} = y + h$:

$$g(y+h) = g(y) + g'(y)\,h + g''(y)\,\frac{h^2}{2} + g'''(y)\,\frac{h^3}{6} + \cdots$$

$$g(y+h) = g(y) + g'(y)\,h + O(h^2)$$

We define the Forward Finite Difference as $dg(y) = \frac{g(y+h) - g(y)}{h}$. Rearranging the Taylor expansion gives $g'(y) = dg(y) + O(h)$; therefore, the truncation error of the forward finite difference approximation is bounded by a term of order $h$.

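A quick numerical check of this first-order behavior (a sketch; the function and step sizes are illustrative): halving $h$ should roughly halve the error.

```python
import math

def forward_difference(g, y, h):
    return (g(y + h) - g(y)) / h

# g(y) = exp(y) has exact derivative exp(y); evaluate at y = 1.
y, exact = 1.0, math.exp(1.0)
for h in [0.1, 0.05, 0.025, 0.0125]:
    err = abs(forward_difference(math.exp, y, h) - exact)
    print(f"h = {h:.4f}   error = {err:.3e}")  # error shrinks roughly like h
```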

SLIDE 4

In a similar way, we can write:

$$g(y-h) = g(y) - g'(y)\,h + O(h^2) \;\Rightarrow\; g'(y) = \frac{g(y) - g(y-h)}{h} + O(h)$$

And define the Backward Finite Difference as:

$$dg(y) = \frac{g(y) - g(y-h)}{h} \;\Rightarrow\; g'(y) = dg(y) + O(h)$$

Subtracting the two Taylor approximations

$$g(y+h) = g(y) + g'(y)\,h + g''(y)\,\frac{h^2}{2} + g'''(y)\,\frac{h^3}{6} + \cdots$$

$$g(y-h) = g(y) - g'(y)\,h + g''(y)\,\frac{h^2}{2} - g'''(y)\,\frac{h^3}{6} + \cdots$$

cancels the even-order terms:

$$g(y+h) - g(y-h) = 2\,g'(y)\,h + g'''(y)\,\frac{h^3}{3} + O(h^5)$$

$$g'(y) = \frac{g(y+h) - g(y-h)}{2h} + O(h^2)$$

And define the Central Finite Difference as:

$$dg(y) = \frac{g(y+h) - g(y-h)}{2h} \;\Rightarrow\; g'(y) = dg(y) + O(h^2)$$

SLIDE 5

Forward Finite Difference: $dg(y) = \frac{g(y+h) - g(y)}{h} \;\Rightarrow\; g'(y) = dg(y) + O(h)$

Backward Finite Difference: $dg(y) = \frac{g(y) - g(y-h)}{h} \;\Rightarrow\; g'(y) = dg(y) + O(h)$

Central Finite Difference: $dg(y) = \frac{g(y+h) - g(y-h)}{2h} \;\Rightarrow\; g'(y) = dg(y) + O(h^2)$

How accurate is the finite difference approximation? How many function evaluations does it need (in addition to $g(y)$)? Our typical trade-off issue! We can get better accuracy with the Central Finite Difference, at a (possible) increase in computational cost.

How small should the value of $h$ be?

Forward: truncation error $O(h)$, cost 1 function evaluation.
Backward: truncation error $O(h)$, cost 1 function evaluation.
Central: truncation error $O(h^2)$, cost 2 function evaluations.

SLIDE 6

Example

$$g(y) = e^y - 2, \qquad g'(y) = e^y$$

$$dg_{\text{approx}} = \frac{(e^{y+h} - 2) - (e^y - 2)}{h}, \qquad \text{error}(h) = \left| g'(y) - dg_{\text{approx}} \right|$$

We want to obtain an approximation for $g'(1)$.

β„Ž 𝑓𝑠𝑠𝑝𝑠

Truncation error
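A sketch reproducing this experiment (the sequence of $h$ values is an illustrative choice):

```python
import math

def g(y):
    return math.exp(y) - 2.0

y, exact = 1.0, math.exp(1.0)        # g'(y) = e^y, so g'(1) = e
for k in range(1, 9):
    h = 10.0 ** (-k)
    dg_approx = (g(y + h) - g(y)) / h
    print(f"h = 1e-{k}   error = {abs(exact - dg_approx):.3e}")
```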

SLIDE 7

Example

Should we just keep decreasing the perturbation $h$, in order to approach the limit $h \to 0$ and obtain a better approximation for the derivative?
SLIDE 8

Uh-Oh!

What happened here? $g(y) = e^y - 2$, $g'(y) = e^y \;\Rightarrow\; g'(1) \approx 2.7$

$$dg(1) = \frac{g(1+h) - g(1)}{h}$$

Forward Finite Difference
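Continuing the same experiment down to machine precision makes the failure visible (a sketch; the $h$ values are illustrative):

```python
import math

def g(y):
    return math.exp(y) - 2.0

y, exact = 1.0, math.exp(1.0)
for k in [4, 8, 12, 16]:
    h = 10.0 ** (-k)
    dg = (g(y + h) - g(y)) / h
    print(f"h = 1e-{k:<2}   dg = {dg:.10f}   error = {abs(exact - dg):.3e}")
# The error is smallest near h = 1e-8; by h = 1e-16, 1.0 + h rounds back
# to 1.0 in double precision, the numerator cancels to 0, and dg = 0.
```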

SLIDE 9

When the difference quotient $dg(y) = \frac{g(y+h) - g(y)}{h}$ is evaluated in floating-point arithmetic, each function value carries a relative rounding error on the order of the machine epsilon $\epsilon_m$, so the resulting error in $dg(y)$ is bounded by approximately $\frac{\epsilon_m\,|g(y)|}{h}$.

When computing the finite difference approximation, we therefore have two competing sources of error: truncation error and rounding error.
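The size of that rounding term is easy to tabulate (a sketch assuming IEEE double precision, where $\epsilon_m \approx 2.2 \times 10^{-16}$):

```python
import sys

eps_m = sys.float_info.epsilon   # machine epsilon, ~2.22e-16 for doubles
for h in [1e-4, 1e-8, 1e-12]:
    # Rounding-error bound per unit of |g(y)|: it grows as h shrinks.
    print(f"h = {h:.0e}   eps_m/h = {eps_m / h:.1e}")
```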

SLIDE 10

Optimal β€œh” Loss of accuracy due to rounding

𝑓𝑠𝑠𝑝𝑠~𝑁 β„Ž

Truncation error: Rounding error:

𝑓𝑠𝑠𝑝𝑠~ πœ—&|𝑔 𝑦 | β„Ž

Minimize the total error 𝑓𝑠𝑠𝑝𝑠 ~ πœ—+|𝑔 𝑦 | β„Ž + π‘β„Ž Gives β„Ž = πœ—+|𝑔 𝑦 |/𝑁
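A sketch checking this prediction for the lecture's example $g(y) = e^y - 2$ at $y = 1$ (assuming double precision and taking $N \approx |g''(1)|/2 = e/2$ as the truncation constant):

```python
import math, sys

def g(y):
    return math.exp(y) - 2.0

eps_m = sys.float_info.epsilon
y, exact = 1.0, math.exp(1.0)
N = math.exp(1.0) / 2.0                        # ~|g''(1)|/2
h_opt = math.sqrt(eps_m * abs(g(y)) / N)
print(f"predicted optimal h ~ {h_opt:.1e}")    # ~1e-8

for k in [6, 8, 10]:
    h = 10.0 ** (-k)
    err = abs((g(y + h) - g(y)) / h - exact)
    print(f"h = 1e-{k:<2}   error = {err:.3e}")  # smallest near h = 1e-8
```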
