SLIDE 1

Nonlinear Equations and Continuous Optimization

Sanzheng Qiao

Department of Computing and Software, McMaster University

March, 2014

SLIDE 2

Outline

1. Introduction
2. Bisection Method
3. Newton's Method
4. Systems of Nonlinear Equations
5. Continuous Optimization
6. Software Packages


SLIDE 5

Problem setting

Find roots of f(x) = 0. Often, methods are iterative: in general the roots cannot be found in a finite number of steps.

Example. Compute square roots, x^2 − A = 0: find the side of the square whose area is A.

SLIDE 7

Compute square roots

Start with a rectangle whose one side is xc; then the other side is A/xc, so that its area is A. Make the rectangle "more square" by setting the new side to

x+ = (1/2)(xc + A/xc)

Then set xc = x+ and iterate.

A better form:

x+ = xc − (1/2)(xc − A/xc)
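A minimal sketch of this iteration in Python (the function name heron_sqrt, the default starting value, and the stopping rule are illustrative assumptions, not from the slides):

```python
def heron_sqrt(A, x=1.0, tol=1e-15):
    """Iterate x+ = x - (x - A/x)/2 until successive iterates agree."""
    while True:
        x_new = x - (x - A / x) / 2  # the "better form" of the update
        if abs(x_new - x) <= tol * abs(x_new):
            return x_new
        x = x_new

print(heron_sqrt(2.0))  # approximately 1.4142135623730951
```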

SLIDE 8

Compute square roots

Three issues to be addressed:
1. Initialization (choosing x0)
2. Convergence (does xk → x∗?) and its rate (how fast?)
3. Termination

SLIDE 9

Initialization

Write A in base 4: A = m × 4^e with 0.25 ≤ m < 1; then √A = √m × 2^e. So we can assume 1/4 ≤ A < 1. Linear interpolation of f(A) = √A at A = 0.25 and A = 1.0 gives the initial guess p(A) = (1 + 2A)/3.

[Figure: √A and its linear interpolant (1 + 2A)/3 on [0.25, 1].]

SLIDE 11

Initialization (cont.)

Initial error bound: differentiating √A − (1 + 2A)/3 with respect to A and setting the derivative to zero to locate the maximum, it can be shown that

|√A − (1 + 2A)/3| ≤ 0.05

Initial value: x0 = (1 + 2A)/3. Initial error: e0 ≤ 0.05.
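The base-4 scaling and the interpolated initial guess can be sketched as follows (the helper names are illustrative; the 0.05 bound is checked numerically on a grid):

```python
import math

def scale_base4(A):
    """Write A = m * 4**e with 0.25 <= m < 1, so sqrt(A) = sqrt(m) * 2**e."""
    e, m = 0, A
    while m >= 1.0:
        m /= 4.0
        e += 1
    while m < 0.25:
        m *= 4.0
        e -= 1
    return m, e

def initial_guess(m):
    """Linear interpolation of sqrt at m = 0.25 and m = 1.0."""
    return (1.0 + 2.0 * m) / 3.0

# The initial error is at most 0.05 on [0.25, 1):
worst = max(abs(math.sqrt(m) - initial_guess(m))
            for m in [0.25 + k / 1000.0 for k in range(750)])
print(worst)  # about 0.0417, within the 0.05 bound
```

The true maximum of the error is 1/24 ≈ 0.0417, attained at A = 9/16, consistent with the bound on the slide.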


SLIDE 13

Convergence

A relation between xk+1 and xk:

xk+1 = (1/2)(xk + A/xk)

Denote the error ek = |xk − √A|; then the relation between ek+1 and ek is

ek+1 = |xk+1 − √A| = (1/2) |(xk − √A)/√xk|^2 = ek^2 / (2|xk|)

It can be shown that 0.5 ≤ xk ≤ 1.0, so ek+1 ≤ ek^2.


SLIDE 15

Convergence (cont.)

Since the initial error e0 ≤ 0.05,

ek ≤ (ek−1)^2 ≤ · · · ≤ (e0)^(2^k) ≤ (0.05)^(2^k)

We have shown convergence (ek → 0 as k → ∞). How fast? The rate is quadratic, ek+1 ≤ c·ek^2: each iteration roughly doubles the number of correct digits.


SLIDE 18

Termination

Recall: ek ≤ (0.05)^(2^k) < 10^(−2^k). When k = 3, ek < 10^−8; when k = 4, ek < 10^−16. Three iterations are enough for IEEE single precision (2^−24); four iterations are enough for IEEE double precision (2^−53).


SLIDE 23

Example

Compute √3.
Scale: 3 = 0.75 × 4^1.
Initial: x0 = (1 + 2 × 0.75)/3 = 2.5/3.
Iterate: xn+1 = xn − (xn − 0.75/xn)/2

n   xn          error
0   0.8333...   3.3 × 10^−2
1   0.8667...   6.4 × 10^−4
2   0.8660...   2.4 × 10^−7
3   0.8660...   3.2 × 10^−14
4   0.8660...   < 10^−16

x5 = x4. Scale back: √3 ≈ x4 × 2^1.
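Putting the pieces together, the whole computation can be sketched in Python (the function name is illustrative; the fixed four-iteration count follows the termination analysis above):

```python
def sqrt_by_newton(A):
    """Compute sqrt(A): scale to [0.25, 1), start from (1+2m)/3, iterate 4 times."""
    # Scale A = m * 4**e so that 0.25 <= m < 1; then sqrt(A) = sqrt(m) * 2**e.
    e, m = 0, A
    while m >= 1.0:
        m, e = m / 4.0, e + 1
    while m < 0.25:
        m, e = m * 4.0, e - 1
    x = (1.0 + 2.0 * m) / 3.0       # initial guess, error <= 0.05
    for _ in range(4):              # 4 iterations suffice for double precision
        x = x - (x - m / x) / 2.0   # Newton step for x**2 - m = 0
    return x * 2.0 ** e             # scale back

print(sqrt_by_newton(3.0))  # approximately 1.7320508075688772
```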


SLIDE 25

Generic algorithm

If f(a) * f(b) ≤ 0 and f(x) is continuous on [a, b], then f(x) has a root on [a, b].

while (b-a) > tol
    m = (a+b)/2;
    if f(a)*f(m) <= 0
        b = m;
    else
        a = m;
    end;
end;
r = (a+b)/2;
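A direct Python transcription of this generic algorithm (the test function and tolerance are illustrative):

```python
def bisect_generic(f, a, b, tol=1e-12):
    """Generic bisection: assumes f(a)*f(b) <= 0 and f continuous on [a, b]."""
    while (b - a) > tol:
        m = (a + b) / 2.0
        if f(a) * f(m) <= 0:   # sign change in [a, m]
            b = m
        else:                  # sign change in [m, b]
            a = m
    return (a + b) / 2.0

r = bisect_generic(lambda x: x**2 - 2.0, 0.0, 2.0)
print(r)  # approximately 1.41421356
```

Note the repeated evaluation of f(a) inside the loop; the improved algorithm later in the deck removes it.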


SLIDE 28

Generic algorithm

Two problems in the generic algorithm:
1. The while-loop may not terminate: when a and b are two neighboring floating-point numbers and (b-a) > tol, (a+b)/2 is rounded to either a or b, so the interval never shrinks.
2. Redundant function evaluations.


SLIDE 30

An improved algorithm

fa = f(a);
while (b-a) > tol + eps*max(|a|,|b|)
    m = (a+b)/2;
    fm = f(m);
    if fa*fm <= 0
        b = m;
    else
        a = m;
        fa = fm;
    end;
end;
r = (a+b)/2;

Note: eps*max(|a|,|b|) is about the distance between two consecutive floating-point numbers near max(|a|,|b|) (one ulp).
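A Python version of the improved algorithm (sys.float_info.epsilon plays the role of eps; only one new function evaluation per iteration):

```python
import sys

def bisect(f, a, b, tol=0.0):
    """Improved bisection: floating-point-aware tolerance, one f-evaluation per step."""
    eps = sys.float_info.epsilon
    fa = f(a)                            # evaluate f(a) once, reuse below
    while (b - a) > tol + eps * max(abs(a), abs(b)):
        m = (a + b) / 2.0
        fm = f(m)
        if fa * fm <= 0:                 # sign change in [a, m]
            b = m
        else:                            # sign change in [m, b]
            a, fa = m, fm
    return (a + b) / 2.0

root = bisect(lambda x: x**3 - 2.0, 1.0, 2.0)
print(root)  # cube root of 2, about 1.2599210498948732
```

Even with tol = 0, the loop terminates once the interval is about one ulp wide.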

SLIDE 31

Convergence

Since bk − ak ≤ (b − a)/2^k, x∗ ∈ [ak, bk], and xk = (ak + bk)/2, we have

ek = |xk − x∗| ≤ (bk − ak)/2 = (b − a)/2^(k+1) → 0

In this case ek+1 ≤ 0.5 ek: the accuracy improves by one bit per iteration, or by one decimal digit every three or so iterations.


SLIDE 33

Convergence

In general, a linear convergence rate: ek+1 ≤ c·ek for some constant c < 1.

Difficulty: locating the initial interval [a, b].


SLIDE 35

Idea

The tangent line of f(x) at xc:

y = f(xc) + (x − xc) f′(xc)

Set y = 0 and solve for x: the x-intercept of the tangent line is the next iterate.

[Figure: a curve with its tangent line at xc; the intercept of the tangent with the x-axis approximates the root.]


SLIDE 37

Newton's method

Newton's method:

x+ = xc − f(xc)/f′(xc)

Example. Square root problem revisited: find a zero of f(x) = x^2 − A.

x+ = xc − (xc^2 − A)/(2xc) = xc − (1/2)(xc − A/xc)
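A generic Newton iteration in Python (the function name, the explicit derivative argument, and the stopping rule are illustrative assumptions):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x+ = xc - f(xc)/f'(xc)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) <= tol * max(1.0, abs(x)):
            return x
    return x  # may not have converged; Newton's method is only local

# Square root revisited: a zero of f(x) = x**2 - A with A = 3
root = newton(lambda x: x**2 - 3.0, lambda x: 2.0 * x, x0=1.0)
print(root)  # approximately 1.7320508075688772
```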

SLIDE 38

Complex case

Example. f(x) = x^2 + x + 1, with zeros (−1 ± i√3)/2. Starting from x0 = i:

i   xi                    error
0   i                     5.2 × 10^−1
1   −0.40000 + 0.80000i   1.2 × 10^−1
2   −0.50769 + 0.86154i   8.9 × 10^−3
3   −0.49996 + 0.86600i   4.6 × 10^−5
4   −0.50000 + 0.86603i   1.2 × 10^−9
5   −0.50000 + 0.86603i   converged; x6 = x5
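Newton's method works unchanged over the complex numbers; this table can be checked in Python (the complex literal 1j is the starting point x0 = i):

```python
import cmath

f = lambda x: x**2 + x + 1
df = lambda x: 2 * x + 1
target = (-1 + 1j * cmath.sqrt(3)) / 2   # exact zero (-1 + i*sqrt(3))/2

x = 1j                                   # start at x0 = i
for k in range(1, 6):
    x = x - f(x) / df(x)                 # same Newton update, complex arithmetic
    print(k, x, abs(x - target))         # errors shrink as in the table
```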

SLIDE 39

Convergence

There is no guarantee of convergence (unlike bisection). For example, for f(x) = atan(x) the iteration is

x+ = xc − (1 + xc^2) atan(xc)

Starting from x0 = 1.5 (> 1.3917), the iterates oscillate with growing magnitude and diverge:

x1 = −1.6941, x2 = 2.3211, x3 = −5.1141, ...

[Figure: successive tangent lines to atan(x); each Newton step overshoots the root x∗ = 0 farther than the last.]

SLIDE 43

Convergence

With the same iteration for f(x) = atan(x),

x+ = xc − (1 + xc^2) atan(xc),

starting from x0 = −1.3 (|−1.3| < 1.3917), the iterates oscillate with shrinking magnitude and converge to the root x∗ = 0:

x1 = 1.1616, x2 = −0.8589, x3 = 0.3742, ...

[Figure: successive tangent lines to atan(x); each Newton step lands closer to the root.]
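A quick numeric check of both behaviors (math.atan from the standard library; the function name and iteration count are illustrative):

```python
import math

def newton_atan(x0, steps=5):
    """Iterate x+ = x - (1 + x**2) * atan(x) and record the iterates."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - (1 + x * x) * math.atan(x))
    return xs

print(newton_atan(1.5))   # magnitudes grow: 1.5, -1.694, 2.321, -5.114, ...
print(newton_atan(-1.3))  # magnitudes shrink toward the root 0
```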


SLIDE 48

Convergence

Conditions for convergence (qualitative):
- x0 close enough to x∗
- f′(x) does not change sign near x∗
- f(x) is not too nonlinear near x∗

Newton's method is a local method. Difficulty: finding x0.

SLIDE 49

Hybrid methods

Combining the bisection and Newton's methods. Maintain a bracketing interval [a, b], with xc = a or b:

if x+ = xc − f(xc)/f′(xc) ∈ [a, b]
    new bracketing interval [a, x+] or [x+, b]
else
    m = (a + b)/2; new bracketing interval [a, m] or [m, b]

Termination criteria, any one of:
- (bk − ak) < δ
- |f(xc)| < δ
- too many function evaluations
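A sketch of such a hybrid in Python: a simplified safeguarded Newton, not the exact method of any particular library. The sign-based bracket update, the choice of starting endpoint, and the evaluation cap are assumptions:

```python
def hybrid_newton_bisect(f, fprime, a, b, delta=1e-12, max_evals=100):
    """Take a Newton step when it stays inside [a, b]; bisect otherwise."""
    fa, fb = f(a), f(b)
    assert fa * fb <= 0, "need a bracketing interval"
    x, fx = (a, fa) if abs(fa) < abs(fb) else (b, fb)   # start at an endpoint
    evals = 2
    while (b - a) >= delta and abs(fx) >= delta and evals < max_evals:
        xp = x - fx / fprime(x)
        if not (a < xp < b):          # Newton left the bracket:
            xp = (a + b) / 2.0        # fall back to a bisection step
        fxp = f(xp)
        evals += 1
        if fa * fxp <= 0:             # keep the sign change: [a, xp]
            b, fb = xp, fxp
        else:                         # keep the sign change: [xp, b]
            a, fa = xp, fxp
        x, fx = xp, fxp
    return x

root = hybrid_newton_bisect(lambda x: x**3 - 2.0, lambda x: 3.0 * x * x, 0.5, 2.0)
print(root)  # about 1.259921 (cube root of 2)
```

The bracket guarantees progress even when the Newton step is wild, while accepted Newton steps give the fast local convergence.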


SLIDE 51

Avoiding derivatives

Approximation:

f′(xc) ≈ (f(xc + δc) − f(xc)) / δc

Choice of δc. Example: the secant method takes δc = x− − xc, so that

f′(xc) ≈ (f(xc) − f(x−)) / (xc − x−)


SLIDE 53

Secant method

x+ = xc − f(xc) · (xc − x−) / (f(xc) − f(x−))

[Figure: secant-line steps approaching a root.]

Usually the convergence rate (if it converges) is (1 + √5)/2 ≈ 1.6:

ek+1 ≤ c · ek^1.6

superlinear, between linear and quadratic.
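A secant-method sketch in Python (the two starting points and the stopping rule are illustrative assumptions):

```python
def secant(f, x_prev, x, tol=1e-12, max_iter=50):
    """Secant method: replace f'(xc) with the slope through the last two iterates."""
    f_prev, fx = f(x_prev), f(x)
    for _ in range(max_iter):
        x_new = x - fx * (x - x_prev) / (fx - f_prev)
        x_prev, f_prev = x, fx
        x, fx = x_new, f(x_new)
        if abs(x - x_prev) <= tol * max(1.0, abs(x)):
            return x
    return x

root = secant(lambda x: x**2 - 3.0, 1.0, 2.0)
print(root)  # approximately 1.7320508075688772
```

Each iteration costs only one new function evaluation, which often makes the secant method cheaper per digit than Newton's method despite the lower rate.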


SLIDE 55

Zeros of a polynomial

Finding the zeros of a polynomial

p = x^n + cn−1 x^(n−1) + ... + c1 x + c0

Many methods have been proposed. The zeros are the eigenvalues of the companion matrix

C(p) = [ 0 0 · · · 0 −c0
         1 0 · · · 0 −c1
         0 1 · · · 0 −c2
         . . · · · .  .
         0 0 · · · 1 −cn−1 ]

since det(xI − C(p)) = p(x).
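The companion matrix and the identity det(xI − C(p)) = p(x) can be checked directly in pure Python (the helper names companion and det are illustrative; production code would hand C(p) to an eigenvalue routine, as MATLAB's roots does):

```python
def companion(coeffs):
    """Companion matrix of p(x) = x^n + c_{n-1} x^{n-1} + ... + c_1 x + c_0.

    coeffs = [c0, c1, ..., c_{n-1}]; ones on the subdiagonal, -c in the last column.
    """
    n = len(coeffs)
    C = [[0.0] * n for _ in range(n)]
    for i in range(1, n):
        C[i][i - 1] = 1.0
    for i in range(n):
        C[i][n - 1] = -coeffs[i]
    return C

def det(M):
    """Determinant by Gaussian elimination with partial pivoting."""
    A = [row[:] for row in M]
    n, d = len(A), 1.0
    for j in range(n):
        p = max(range(j, n), key=lambda i: abs(A[i][j]))
        if abs(A[p][j]) == 0.0:
            return 0.0
        if p != j:
            A[j], A[p] = A[p], A[j]
            d = -d
        d *= A[j][j]
        for i in range(j + 1, n):
            r = A[i][j] / A[j][j]
            for k in range(j, n):
                A[i][k] -= r * A[j][k]
    return d

# Check det(xI - C(p)) = p(x) for p(x) = x^3 - 1 at the sample point x = 2:
C = companion([-1.0, 0.0, 0.0])   # p = x^3 + 0x^2 + 0x - 1
xI_minus_C = [[(2.0 if i == j else 0.0) - C[i][j] for j in range(3)]
              for i in range(3)]
print(det(xI_minus_C))  # p(2) = 2**3 - 1 = 7
```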


SLIDE 57

Example

The zeros of the polynomial x^3 − 1 are the eigenvalues of

[ 0 0 1
  1 0 0
  0 1 0 ]

one real eigenvalue and a complex conjugate pair.


SLIDE 59

Note

How are the eigenvalues of a matrix computed? Finding the zeros of a polynomial used to be the way of finding the eigenvalues of a matrix A. Textbook method: the eigenvalues of a matrix A are the zeros of its characteristic polynomial det(λI − A).

SLIDE 60

Note

Now we have efficient and reliable methods for computing the eigenvalues of a matrix directly: the QR method (John G.F. Francis and Vera N. Kublanovskaya, late 1950s). So the roles are reversed: we find the zeros of a polynomial by computing the eigenvalues of its companion matrix.


SLIDE 62

Problem setting

f1(x1, ..., xn) = 0
f2(x1, ..., xn) = 0
...
fn(x1, ..., xn) = 0

Denote this f(x) = 0, where f is a vector-valued function and x is a vector.

SLIDE 63

Newton's method

x+ = xc + sc, where sc is the solution of f(xc) + J(xc)sc = 0, i.e., sc = −J(xc)^−1 f(xc), and J(xc) is the Jacobian of f at xc:

J(x) = [∂fi/∂xj] =
[ ∂f1/∂x1 · · · ∂f1/∂xn
  ...
  ∂fn/∂x1 · · · ∂fn/∂xn ]

SLIDE 64

Example

A system of nonlinear equations:

x1^2 − x2^2 = 0
2x1x2 = 1

with starting point x0 = (0, 1)^T. Solution: x1 = x2 = 1/√2.

SLIDE 65

Example

f(x) = [ f1 ] = [ x1^2 − x2^2 ]
       [ f2 ]   [ 2x1x2 − 1   ]

The Jacobian is

J(x) = [ 2x1 −2x2 ]
       [ 2x2  2x1 ]

and

J(x0) = [ 0 −2 ]
        [ 2  0 ]


SLIDE 67

Example

Step 1: x1 = x0 − J(x0)^−1 f(x0). Solving for d0 in J(x0)d0 = f(x0) gives d0 = (−0.5, 0.5)^T. Thus

x1 = (0, 1)^T − (−0.5, 0.5)^T = (0.5, 0.5)^T

SLIDE 68

Example

Step 2: x2 = x1 − J(x1)^−1 f(x1), where

J(x1) = [ 1 −1 ]
        [ 1  1 ]

and f(x1) = (0, −0.5)^T.
SLIDE 69

Example

Solving for d1 in J(x1)d1 = f(x1) gives d1 = (−0.25, −0.25)^T. Thus

x2 = (0.5, 0.5)^T − (−0.25, −0.25)^T = (0.75, 0.75)^T
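These steps can be reproduced with a small pure-Python Newton solver for 2×2 systems (the helper solve2, Cramer's rule, and the iteration count are illustrative choices):

```python
def solve2(J, b):
    """Solve a 2x2 linear system J d = b by Cramer's rule."""
    (a11, a12), (a21, a22) = J
    det = a11 * a22 - a12 * a21
    return [(b[0] * a22 - a12 * b[1]) / det,
            (a11 * b[1] - b[0] * a21) / det]

def f(x):
    return [x[0] ** 2 - x[1] ** 2, 2 * x[0] * x[1] - 1]

def J(x):
    return [[2 * x[0], -2 * x[1]],
            [2 * x[1], 2 * x[0]]]

x = [0.0, 1.0]                       # x0 = (0, 1)^T
for k in range(6):
    d = solve2(J(x), f(x))           # solve J(xk) dk = f(xk)
    x = [x[0] - d[0], x[1] - d[1]]   # xk+1 = xk - dk
    print(k + 1, x)                  # x1 = [0.5, 0.5], x2 = [0.75, 0.75], ...
```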

SLIDE 70

Avoiding derivatives

The jth column of J(x),

[ ∂f1/∂xj
  ...
  ∂fn/∂xj ] = ∂f/∂xj,

can be approximated by the difference quotient

( f(x1, ..., xj + δ, ..., xn) − f(x1, ..., xn) ) / δ
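A finite-difference Jacobian sketch (the step-size choice, the square root of machine epsilon, is a common heuristic and an assumption not stated on the slide):

```python
import sys

def fd_jacobian(f, x, delta=None):
    """Approximate the Jacobian of f at x, one column per perturbed coordinate."""
    if delta is None:
        delta = sys.float_info.epsilon ** 0.5   # common step-size heuristic
    fx = f(x)
    n, m = len(x), len(fx)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x)
        xp[j] += delta
        fxp = f(xp)
        for i in range(m):
            J[i][j] = (fxp[i] - fx[i]) / delta  # jth column of J
    return J

# For f = (x1^2 - x2^2, 2*x1*x2 - 1) the exact Jacobian at (0.5, 0.5)
# is [[1, -1], [1, 1]]:
J = fd_jacobian(lambda x: [x[0] ** 2 - x[1] ** 2, 2 * x[0] * x[1] - 1], [0.5, 0.5])
print(J)
```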


SLIDE 73

Problem setting

min over x ∈ S of f(x),  or  max over x ∈ S of f(x)

x: vector; f(x): real-valued objective function; S: support.

Find a zero of the gradient:

∇f(x) = [ ∂f(x)/∂x1
          ...
          ∂f(x)/∂xn ]

SLIDE 74

Newton's method

View the gradient ∇f(x) as a vector-valued function and apply Newton's method for solving nonlinear systems. At the current xc, find the correction sc: x+ = xc + sc, where sc is the solution of

∇f(xc) + H(xc)sc = 0

The matrix H(xc), the Jacobian of the gradient at xc, is called the Hessian of f at xc, written ∇^2 f(xc): Hi,j = ∂^2 f / ∂xi∂xj.

SLIDE 75

Example

Minimizing f : R^2 → R with

f(x) = x1^3/3 − x1x2^2 + x2

Perform one iteration of Newton's method for minimizing f using the starting point x0 = (0, 1)^T.
SLIDE 76

Example

Apply Newton's method to find a zero of the gradient

∇f(x) = [ x1^2 − x2^2 ]
        [ −2x1x2 + 1  ]

The Hessian:

H(x) = ∇^2 f(x) = [ 2x1  −2x2 ]
                  [ −2x2 −2x1 ]

SLIDE 77

Example

Step 1:

x1 = x0 − ∇^2 f(x0)^−1 ∇f(x0) = (0, 1)^T − (−1/2, 1/2)^T = (1/2, 1/2)^T
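This step, and the continued iteration, can be sketched in Python with a small 2×2 solve (names are illustrative; note that Newton's method as used here finds a zero of the gradient, i.e., a stationary point, which need not be a minimum):

```python
def solve2(H, b):
    """Solve a 2x2 linear system H s = b by Cramer's rule."""
    (h11, h12), (h21, h22) = H
    det = h11 * h22 - h12 * h21
    return [(b[0] * h22 - h12 * b[1]) / det,
            (h11 * b[1] - b[0] * h21) / det]

def grad(x):   # gradient of f(x) = x1^3/3 - x1*x2^2 + x2
    return [x[0] ** 2 - x[1] ** 2, -2 * x[0] * x[1] + 1]

def hess(x):   # Hessian of f
    return [[2 * x[0], -2 * x[1]],
            [-2 * x[1], -2 * x[0]]]

x = [0.0, 1.0]                                    # x0 = (0, 1)^T
for k in range(8):
    s = solve2(hess(x), [-g for g in grad(x)])    # H(xk) sk = -grad(xk)
    x = [x[0] + s[0], x[1] + s[1]]                # xk+1 = xk + sk
print(x)  # first step gives (0.5, 0.5); the iterates approach (1/sqrt(2), 1/sqrt(2))
```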


SLIDE 79

Software Packages

IMSL: zporc, zplrc, zpocc
MATLAB: roots, fzero
NAG: c02agf, c02aff
NAPACK: czero
Octave: fsolve

SLIDE 80

Summary

- Issues in an iterative method: initialization, convergence and rate of convergence, termination
- The example of computing square roots
- Bisection method: the numerical termination problem
- Newton's method: initial value and convergence problems
- Newton's method for systems of nonlinear equations: the Jacobian matrix
- Newton's method for minimization: the gradient and Hessian

SLIDE 81

References

[1] George E. Forsythe, Michael A. Malcolm, and Cleve B. Moler. Computer Methods for Mathematical Computations. Prentice-Hall, Inc., 1977. Ch. 7.