

  1. Computational Optimization Convergence Rates Lecture 3 1/24/08

  2. Golden Section Search. Basic idea: economize the number of function evaluations by trapping a minimizer in a set of nested intervals. Begin by evaluating f(a_0), f(left_0), f(right_0), f(b_0), where a_0 < left_0 < right_0 < b_0.

  3. Which interval contains x*? If f(left) < f(right), then x* ∈ [a, right]. If f(left) > f(right), then x* ∈ [left, b]. If f(left) = f(right), then x* ∈ [left, right]. (Each case was illustrated on the slide with a sketch of the points a < left < right < b.)

  4. Choice of τ. Iteration 0 uses the points a_0 < l_0 < r_0 < b_0; iteration 1 uses a_1 < l_1 < r_1 < b_1 on the new interval [a_1, b_1] = [a_0, r_0]. We want to reuse the old left point as the new right point, l_0 = r_1:
     r_1 = a_1 + τ(b_1 - a_1) = a_0 + τ(r_0 - a_0) = a_0 + τ(a_0 + τ(b_0 - a_0) - a_0) = a_0 + τ^2(b_0 - a_0).

  5. Choice of τ, Continued. We have l_0 = b_0 - τ(b_0 - a_0) = a_0 + τ^2(b_0 - a_0). Therefore 0 = (τ^2 + τ - 1)(b_0 - a_0), which implies τ = (-1 + √5)/2 ≈ 0.618. This τ is the golden section ratio.
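
A quick numerical sanity check (not from the slides; the printed values are illustrative): the golden section ratio τ = (√5 - 1)/2 satisfies τ^2 + τ - 1 = 0, which is exactly the condition that lets one interior point be reused each iteration.

```python
import math

tau = (math.sqrt(5.0) - 1.0) / 2.0    # golden section ratio
print(tau)                            # 0.6180339887498949
print(tau ** 2 + tau - 1.0)           # ~0, up to floating-point rounding
print(1.0 - tau, tau ** 2)            # equal: l_0 = b_0 - τ(b_0 - a_0) lands at a_0 + τ^2(b_0 - a_0)
```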

  6. Golden Section Search Algorithm. Given a_0, b_0, f, tol; set k = 0.
     1. r_0 = a_0 + τ(b_0 - a_0), l_0 = b_0 - τ(b_0 - a_0)
     2. fl = f(l_0), fr = f(r_0), fa = f(a_0), fb = f(b_0)
     3. While (b_k - a_k) > tol:
        if fl < fr, then a_{k+1} = a_k, b_{k+1} = r_k, r_{k+1} = l_k, l_{k+1} = b_{k+1} - τ(b_{k+1} - a_{k+1}), fb = fr, fr = fl, fl = f(l_{k+1});
        else (fl ≥ fr), a_{k+1} = l_k, b_{k+1} = b_k, l_{k+1} = r_k, r_{k+1} = a_{k+1} + τ(b_{k+1} - a_{k+1}), fa = fl, fl = fr, fr = f(r_{k+1});
        k = k + 1.
     4. x* = (a_k + b_k)/2.
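
A minimal Python sketch of this algorithm (the course itself uses Matlab; the function name and test problem below are illustrative, not from the slides):

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b]; return the midpoint of the final interval."""
    tau = (math.sqrt(5.0) - 1.0) / 2.0        # golden section ratio, ~0.618
    l = b - tau * (b - a)                     # left interior point
    r = a + tau * (b - a)                     # right interior point
    fl, fr = f(l), f(r)
    while (b - a) > tol:
        if fl < fr:                           # minimizer lies in [a, r]: reuse l as the new r
            b, r, fr = r, l, fl
            l = b - tau * (b - a)
            fl = f(l)
        else:                                 # minimizer lies in [l, b]: reuse r as the new l
            a, l, fl = l, r, fr
            r = a + tau * (b - a)
            fr = f(r)
    return (a + b) / 2.0

# Example: minimize (x - 2)^2 on [0, 5]; prints a value very close to 2.0.
print(golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0))
```

Each pass of the loop evaluates f only once, which is what makes the method economical; the sketch also skips the f(a_0) and f(b_0) evaluations from the slide, since they are never compared.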

  7. Challenges. Define an algorithm to optimize the problem. Does the algorithm converge? What does it converge to? Under what assumptions is the algorithm valid? How fast does the algorithm work in theory? How well does the algorithm work in practice?

  8. Analysis of GS Search. The length of the interval of uncertainty, b_k - a_k, is reduced by the factor τ at each iteration: (b_1 - a_1) = τ(b_0 - a_0), (b_2 - a_2) = τ(b_1 - a_1) = τ^2(b_0 - a_0), ..., (b_k - a_k) = τ(b_{k-1} - a_{k-1}) = τ^k(b_0 - a_0).

  9. What Is the Quality of the Solution? The program halts when (b_k - a_k) < acc = tol(b_0 - a_0), so we know |x_k - x*| < acc/2. What other measures of the quality of a solution are there? |f(x_k) - f(x*)|, and |f'(x_k)| if f is differentiable, since f'(x*) = 0 at a minimum of a differentiable function.

  10. Error Analysis. The midpoint x_k = (a_k + b_k)/2 of the interval of uncertainty is the best current estimate of x*. Why? Error(0) = |x_0 - x*| ≤ (b_0 - a_0)/2. Error(1) = |x_1 - x*| ≤ (b_1 - a_1)/2 = τ(b_0 - a_0)/2. ... Error(k) = |x_k - x*| ≤ (b_k - a_k)/2 = τ^k(b_0 - a_0)/2. The error bound is reduced by the factor τ each iteration; this is called linear or geometric convergence.
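
A small illustration of this bound (the initial interval length 5 is an arbitrary example value, matching the [0, 5] test interval used in the sketch above):

```python
import math

tau = (math.sqrt(5.0) - 1.0) / 2.0
b0_minus_a0 = 5.0                     # example initial interval length

for k in range(6):
    bound = tau ** k * b0_minus_a0 / 2.0
    print(f"after {k} iterations: |x_k - x*| <= {bound:.4f}")
# Each bound is tau (~0.618) times the previous one: linear / geometric decrease.
```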

  11. Convergence Analysis. The interval shrinks by the factor τ each iteration, so the algorithm must terminate. The algorithm halts when (b_k - a_k) < acc. For acc = tol(b_0 - a_0): tol(b_0 - a_0) ≥ (b_k - a_k) = τ^k(b_0 - a_0), so tol ≥ τ^k, which implies log(tol) ≥ k log(τ); since log(τ) < 0, the algorithm halts when k ≥ ceil(log(tol)/log(τ)).

  12. How Efficient? For acc = tol(b_0 - a_0), 1/log(τ) ≈ -4.78 (base-10 logs), so the algorithm halts when k = ceil(-4.78 log(tol)). The dominant computational cost is function evaluations: the method requires 4 + k of them. Does golden section search perform like this in practice?
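
A short computation of this iteration count for a few tolerances (plain Python; the specific tolerance values are illustrative):

```python
import math

tau = (math.sqrt(5.0) - 1.0) / 2.0

def iterations_needed(tol):
    """Smallest k with tau**k <= tol, i.e. ceil(log(tol) / log(tau))."""
    return math.ceil(math.log10(tol) / math.log10(tau))

for tol in (1e-2, 1e-4, 1e-6):
    k = iterations_needed(tol)
    print(f"tol = {tol:g}: k = {k} iterations, {4 + k} function evaluations")
# For tol = 1e-6 this gives k = 29, consistent with ceil(-4.78 * log10(1e-6)) = 29.
```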

  13. Rates of Convergence. Local analysis: how does the algorithm behave near the solution? Assume x_k → x* and f(x_k) → f(x*). Define e_k = ||x_k - x*|| or e_k = |f(x_k) - f(x*)|. In GS search, e_{k+1} ≤ τ e_k, so e_k ≤ τ^k e_0.

  14. Linear Convergence (see pages 619-621 in MW). A sequence converges linearly (or geometrically) if there exist q > 0 and β ∈ [0, 1) such that e_k ≤ q β^k for k sufficiently large, or equivalently lim_{k→∞} e_{k+1}/e_k ≤ β.

  15. General Convergence. We say {x_k} converges to x* with rate r and rate constant c if lim_{k→∞} e_{k+1}/(e_k)^r = c. If r = 1 and 0 < c < 1, the rate is linear. If (r = 1 and c = 0) or 1 < r < 2, the rate is superlinear. If r = 2, the rate is quadratic.
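
One way to see this definition in action is to estimate r and c numerically from a tail of errors. The helper below and its formula, r ≈ log(e_{k+1}/e_k)/log(e_k/e_{k-1}), are a standard heuristic rather than something from the slides:

```python
import math

def estimate_rate(errors):
    """Estimate rate r and constant c from the last three errors:
    r ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1}), then c ~ e_{k+1} / e_k**r."""
    e0, e1, e2 = errors[-3:]
    r = math.log(e2 / e1) / math.log(e1 / e0)
    c = e2 / e1 ** r
    return r, c

# Linear sequence e_k = 0.5**k: estimates r ~ 1, c ~ 0.5.
print(estimate_rate([0.5 ** k for k in range(1, 10)]))
# Quadratic sequence e_k = 0.1**(2**k): estimates r ~ 2, c ~ 1.
print(estimate_rate([0.1 ** (2 ** k) for k in range(1, 6)]))
```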

  16. Sample convergence rates. Linear: 1, .1, .01, .001, .0001, .00001, ... (one more correct digit each iteration). Quadratic: .1, .01, .0001, 1e-8, 1e-16, ... (the number of correct digits roughly doubles each iteration).

  17. How fast does it converge? Let x_{k+1} = (x_k + 1)/2 (so x* = 1). Define e_{k+1} = |x_{k+1} - 1| = |(x_k + 1)/2 - 1| = |(x_k - 1)/2|. Thus for r = 1, lim_{k→∞} e_{k+1}/(e_k)^r = lim_{k→∞} (|x_k - 1|/2)/|x_k - 1| = 1/2, so the convergence is linear. The limit blows up for r > 1, so the convergence is not superlinear.

  18. You try. Consider x_k = a^(2^k) for a ∈ (0, 1). What does {x_k} converge to? What is the convergence rate?

  19. Quadratic Convergence. Here e_k = a^(2^k) and e_{k+1} = a^(2^(k+1)), so
     lim_{k→∞} e_{k+1}/(e_k)^r = lim_{k→∞} a^(2^(k+1))/[a^(2^k)]^r = lim_{k→∞} a^(2^k (2 - r)) < ∞ provided 2 - r ≥ 0, i.e. 2 ≥ r.
     At r = 2 the limit is a^0 = 1, so {x_k} converges (to 0) quadratically with rate constant c = 1.
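
A quick numerical check of this (the value a = 0.9 is an arbitrary illustration):

```python
# For x_k = a**(2**k) with a in (0, 1), the limit is 0, so e_k = x_k.
# The ratio e_{k+1} / e_k**2 equals 1: quadratic convergence with c = 1.
a = 0.9
errors = [a ** (2 ** k) for k in range(8)]
for e_k, e_next in zip(errors, errors[1:]):
    print(e_next / e_k ** 2)   # prints 1.0 (up to floating-point rounding) each time
```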

  20. Matlab Practice. Please do the tutorial: http://www.math.ufl.edu/help/matlab-tutorial/matlab-tutorial.html
