solving minimax problems with feasible sequential quadratic programming

  1. Solving minimax problems with feasible sequential quadratic programming. Qianli Deng (Shally), E-mail: dqianli@umd.edu. Advisor: Dr. Mark A. Austin, E-mail: austin@isr.umd.edu. Department of Civil and Environmental Engineering, Institute for Systems Research, University of Maryland, College Park, MD 20742. 12/13/2013

  2. Background
  * Feasible sequential quadratic programming (FSQP) refers to a class of sequential quadratic programming methods.
  * Engineering applications: the number of variables is not large, and evaluations of the objective or constraint functions and of their gradients are time-consuming.
  * Advantages:
    1) Generates feasible iterates.
    2) Reduces the amount of computation.
    3) Enjoys the same global and fast local convergence properties.

  3. Background: the constrained minimax problem

  $\min_x \max_{i \in I^f} f_i(x)$

  subject to
  * Bounds: $bl \le x \le bu$
  * Nonlinear inequality: $g_j(x) \le 0,\ j = 1, \ldots, n_i$
  * Linear inequality: $g_j(x) \equiv \langle c_{j-n_i}, x \rangle - d_{j-n_i} \le 0,\ j = n_i + 1, \ldots, t_i$
  * Nonlinear equality: $h_j(x) = 0,\ j = 1, \ldots, n_e$
  * Linear equality: $h_j(x) \equiv \langle a_{j-n_e}, x \rangle - b_{j-n_e} = 0,\ j = n_e + 1, \ldots, t_e$
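The ingredients of this formulation can be illustrated with a small, hypothetical two-objective instance; the functions `f1`, `f2` and the bounds below are invented for illustration and are not from the presentation.

```python
def minimax_objective(fs, x):
    """Worst-case objective max_i f_i(x) over the list of objectives fs."""
    return max(f(x) for f in fs)

def is_feasible(x, bl, bu, g_ineq=(), h_eq=(), tol=1e-8):
    """Check bounds bl <= x <= bu, inequalities g_j(x) <= 0 and
    equalities h_j(x) = 0, each up to a small tolerance."""
    in_bounds = all(l - tol <= xi <= u + tol for xi, l, u in zip(x, bl, bu))
    return (in_bounds and all(g(x) <= tol for g in g_ineq)
            and all(abs(h(x)) <= tol for h in h_eq))

# Hypothetical two-objective instance on R^2 with simple bounds.
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2
```

At $x = (0.5, 0.5)$ both objectives equal $0.5$, which is in fact the minimax solution of this toy instance by symmetry.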

  4. Algorithm - FSQP
  * Step 1. Initialization
  * Step 2. A search arc
  * Step 3. Arc search
  * Step 4. Updates
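The four steps above can be sketched as a driver loop. The three callables are hypothetical stand-ins for Steps 2-4 (the test below exercises them with plain gradient-descent surrogates, not the actual FSQP subproblems):

```python
def fsqp(x0, compute_arc, arc_search, update, max_iter=50, tol=1e-8):
    """Skeleton of the four FSQP steps.  compute_arc returns
    (d, d_tilde, delta) for the search arc, arc_search returns a step
    length t, and update forms the next iterate; all three are
    placeholders for the subproblems described on later slides."""
    x = x0                                    # Step 1: initialization
    for _ in range(max_iter):
        d, d_tilde, delta = compute_arc(x)    # Step 2: search arc
        if abs(delta) <= tol:                 # near-stationary: stop
            return x
        t = arc_search(x, d, d_tilde, delta)  # Step 3: arc search
        x = update(x, t, d, d_tilde)          # Step 4: updates
    return x
```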

  5. Initialization $(x, H, p, k)$
  Initial guess $x_0$; positive penalty parameters $p_j$. The penalized maximum:
  $f_m(x, p) = \max_{i \in I^f} f_i(x) - \sum_{j=1}^{n_e} p_j h_j(x)$
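The penalized maximum used here, $\max_i f_i(x) - \sum_j p_j h_j(x)$, is straightforward to evaluate; a minimal sketch:

```python
def merit(fs, hs, p, x):
    """Penalized maximum f_m(x, p) = max_i f_i(x) - sum_j p_j * h_j(x),
    combining the minimax objectives fs with the equality penalties hs."""
    return max(f(x) for f in fs) - sum(pj * h(x) for pj, h in zip(p, hs))
```

For example, with a single objective $x^2$, one equality $h(x) = x - 1$ and penalty $p = 2$, the value at $x = 3$ is $9 - 2 \cdot 2 = 5$.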

  6. Computation of a search arc

  7. Computation of a search arc

  Step 2a. Compute the descent direction $d_k^0$ from the standard QP:

  $\min_{d^0 \in \mathbb{R}^n} \ \tfrac{1}{2}\langle d^0, H_k d^0 \rangle + f'(x_k, d^0, p_k)$
  s.t. $bl \le x_k + d^0 \le bu$
       $g_j(x_k) + \langle \nabla g_j(x_k), d^0 \rangle \le 0,\ j = 1, \ldots, n_i$
       $\langle c_j, x_k + d^0 \rangle \le d_j,\ j = 1, \ldots, t_i - n_i$
       $h_j(x_k) + \langle \nabla h_j(x_k), d^0 \rangle \le 0,\ j = 1, \ldots, n_e$
       $\langle a_j, x_k + d^0 \rangle = b_j,\ j = 1, \ldots, t_e - n_e$

  Step 2b. Compute the tilted direction $d_k^1$ from:

  $\min_{(d^1, \gamma) \in \mathbb{R}^n \times \mathbb{R}} \ \tfrac{\eta}{2}\langle d^0 - d^1, d^0 - d^1 \rangle + \gamma$
  s.t. $bl \le x_k + d^1 \le bu$
       $f'(x_k, d^1, p_k) \le \gamma$
       $g_j(x_k) + \langle \nabla g_j(x_k), d^1 \rangle \le \gamma,\ j = 1, \ldots, n_i$
       $\langle c_j, x_k + d^1 \rangle \le d_j,\ j = 1, \ldots, t_i - n_i$
       $h_j(x_k) + \langle \nabla h_j(x_k), d^1 \rangle \le \gamma,\ j = 1, \ldots, n_e$
       $\langle a_j, x_k + d^1 \rangle = b_j,\ j = 1, \ldots, t_e - n_e$

  Step 2c. Mix the two directions using
  $\rho_k = \|d_k^0\|^\kappa / (\|d_k^0\|^\kappa + v_k), \qquad v_k = \max(0.5, \|d_k^1\|^\tau)$
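A minimal sketch of the mixing rule, assuming the convention from the FSQP literature that the resulting direction is the convex combination $(1 - \rho_k) d_k^0 + \rho_k d_k^1$; the default exponents `kappa` and `tau` are placeholders, not values from the presentation.

```python
import numpy as np

def arc_mixing(d0, d1, kappa=2.0, tau=3.0):
    """Compute rho_k = ||d0||^kappa / (||d0||^kappa + v_k) with
    v_k = max(0.5, ||d1||^tau), and the mixed direction
    (1 - rho_k) * d0 + rho_k * d1."""
    v = max(0.5, np.linalg.norm(d1) ** tau)
    n0 = np.linalg.norm(d0) ** kappa
    rho = n0 / (n0 + v)
    return rho, (1.0 - rho) * d0 + rho * d1
```

As $d^0 \to 0$ (near a solution) we get $\rho_k \to 0$, so the mixed direction falls back to the pure descent direction.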

  8. Quadratic programming
  Strictly convex quadratic programming has a unique global minimum.
  * Matrix C needs to be positive definite.
  * m = number of constraints; n = number of variables.
  $X = [x_1, x_2, \ldots, x_n]'$; $A_{m \times n}$; $B_{m \times 1}$; $C_{n \times n}$; $D_{n \times 1}$
  * Extended Wolfe's simplex method: no derivatives required.
  Reference: Wolfe, P. (1959). The simplex method for quadratic programming. Econometrica: Journal of the Econometric Society, 382-398.
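Wolfe's simplex method itself is more involved; as a sanity check on the strictly convex case, a QP with only equality constraints can be solved directly from its KKT linear system. This is a sketch for verification purposes, not the method used in the presentation.

```python
import numpy as np

def eq_qp(C, d, A, b):
    """Solve min 1/2 x'Cx + d'x  s.t.  Ax = b by solving the KKT
    system [[C, A'], [A, 0]] [x; lam] = [-d; b]; assumes C positive
    definite and A full row rank."""
    n, m = C.shape[0], A.shape[0]
    K = np.block([[C, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-d, b]))
    return sol[:n], sol[n:]  # primal solution, equality multipliers
```

For example, minimizing $x_1^2 + x_2^2$ subject to $x_1 + x_2 = 1$ gives $x = (0.5, 0.5)$.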

  9. Quadratic programming
  Lagrangian function:
  $L = \tfrac{1}{2} X'CX + D'X + \lambda'(AX - B) - \mu'X$
  Karush-Kuhn-Tucker conditions:
  * $\partial L / \partial \lambda = AX - B \le 0$, with slacks: $AX + \upsilon = B$
  * $\partial L / \partial X = CX + D + A'\lambda - \mu \ge 0$, with surplus and artificial variables: $CX + A'\lambda - \mu + s = -D$
  * Complementary slackness: $\lambda'(AX - B) = 0$, i.e. $\lambda'\upsilon = 0$; $X'(CX + D + A'\lambda) = 0$, i.e. $\mu'X = 0$
  * $X \ge 0,\ \lambda \ge 0,\ \mu \ge 0,\ \upsilon \ge 0,\ s \ge 0$
  where $\upsilon$ = slack variables; $\mu$ = surplus variables; $s$ = artificial variables; $(\lambda, \upsilon)$ and $(\mu, X)$ = complementary slack variable pairs.
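These KKT conditions can be checked numerically for a candidate point. A minimal sketch for the QP $\min \tfrac{1}{2}X'CX + D'X$ s.t. $AX \le B$, $X \ge 0$, with $\lambda$ and $\mu$ the multipliers of the two constraint groups:

```python
import numpy as np

def kkt_holds(C, D, A, B, X, lam, mu, tol=1e-8):
    """Verify stationarity, primal/dual feasibility and complementary
    slackness for  min 1/2 X'CX + D'X  s.t.  AX <= B, X >= 0."""
    upsilon = B - A @ X                              # slacks on AX <= B
    stat = np.allclose(C @ X + D + A.T @ lam - mu, 0.0, atol=tol)
    feas = np.all(upsilon >= -tol) and np.all(X >= -tol)
    dual = np.all(lam >= -tol) and np.all(mu >= -tol)
    comp = abs(lam @ upsilon) <= tol and abs(mu @ X) <= tol
    return stat and feas and dual and comp
```

For instance, minimizing $x^2 - 2x$ subject to $x \le 0.5$, $x \ge 0$ has solution $x = 0.5$ with $\lambda = 1$, $\mu = 0$.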

  10. Linear programming
  Quadratic programming => conditioned linear programming => simplex tableau, in which the basis variables are tracked.

  11. Test example

  12. Arc search
  $\delta_k = f'(x_k, d_k, p_k)$
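An Armijo-style backtracking search along the arc $x_k + t d_k + t^2 \tilde{d}_k$ can be sketched as follows, assuming $\delta_k = f'(x_k, d_k, p_k) < 0$ is the directional derivative of the merit function; the constants `alpha` and `beta` are hypothetical defaults.

```python
def arc_search(phi, phi0, delta, alpha=0.1, beta=0.5, t=1.0, max_iter=30):
    """Backtracking Armijo search: phi(t) evaluates the merit function
    along the arc, phi0 = phi(0), delta < 0 is the directional
    derivative.  Accept t once phi(t) <= phi0 + alpha * t * delta,
    otherwise shrink t by the factor beta."""
    for _ in range(max_iter):
        if phi(t) <= phi0 + alpha * t * delta:
            return t
        t *= beta
    return t
```

With $\phi(t) = (1 - t)^2$, $\phi(0) = 1$ and $\delta = -2$, the full step $t = 1$ is accepted immediately.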

  13. Updates $(x, H, p, k)$
  $x_{k+1} = x_k + t_k d_k + t_k^2 \tilde{d}_k$
  $p_{k+1,j} = \begin{cases} p_{k,j}, & p_{k,j} + \mu_{k,j} \ge \varepsilon \\ \max\{\varepsilon_1 - \mu_{k,j},\ \delta p_{k,j}\}, & \text{otherwise} \end{cases}$
  $k = k + 1$
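The penalty-parameter update keeps $p_j$ unchanged while the multiplier estimate $\mu_j$ is large enough, and otherwise increases it. A minimal sketch; the thresholds `eps`, `eps1` and the growth factor `delta_scale` are placeholder values, not those used in the presentation.

```python
def update_penalty(p, mu, eps=1.0, eps1=2.0, delta_scale=2.0):
    """p_{k+1,j} = p_{k,j}                       if p_{k,j} + mu_j >= eps
                 = max(eps1 - mu_j, delta_scale * p_{k,j})  otherwise."""
    return [pj if pj + mj >= eps else max(eps1 - mj, delta_scale * pj)
            for pj, mj in zip(p, mu)]
```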

  14. Updates
  BFGS formula with Powell's modification, where $H_{k+1}$ approximates the Hessian of the Lagrangian function of $f_m(x_k, p_k)$:
  $\eta_{k+1} = x_{k+1} - x_k$
  $\gamma_{k+1} = \nabla_x f_m(x_{k+1}, p_k) - \nabla_x f_m(x_k, p_k)$
  $\theta_{k+1} = \begin{cases} 1, & \eta_{k+1}^T \gamma_{k+1} \ge 0.2\, \eta_{k+1}^T H_k \eta_{k+1} \\ \dfrac{0.8\, \eta_{k+1}^T H_k \eta_{k+1}}{\eta_{k+1}^T H_k \eta_{k+1} - \eta_{k+1}^T \gamma_{k+1}}, & \text{otherwise} \end{cases}$
  $\xi_{k+1} = \theta_{k+1} \gamma_{k+1} + (1 - \theta_{k+1}) H_k \eta_{k+1}$
  $H_{k+1} = H_k - \dfrac{H_k \eta_{k+1} \eta_{k+1}^T H_k}{\eta_{k+1}^T H_k \eta_{k+1}} + \dfrac{\xi_{k+1} \xi_{k+1}^T}{\eta_{k+1}^T \xi_{k+1}}$
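Powell's damping replaces the gradient difference $\gamma$ by a combination $\xi$ that keeps $\eta^T \xi > 0$, so the updated matrix stays positive definite. A minimal sketch of one damped BFGS update:

```python
import numpy as np

def powell_bfgs(H, eta, gamma, cut=0.2):
    """One BFGS update of H with Powell's modification: damp gamma to
    xi = theta*gamma + (1-theta)*H@eta whenever eta'gamma is too small
    relative to eta'H eta, then apply the standard rank-two formula."""
    He = H @ eta
    eHe = eta @ He
    eg = eta @ gamma
    theta = 1.0 if eg >= cut * eHe else 0.8 * eHe / (eHe - eg)
    xi = theta * gamma + (1.0 - theta) * He
    return H - np.outer(He, He) / eHe + np.outer(xi, xi) / (eta @ xi)
```

When the curvature condition already holds ($\eta^T\gamma \ge 0.2\,\eta^T H \eta$), $\theta = 1$ and this reduces to the plain BFGS update.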

  15. Project Schedule
  October:
  * Literature review
  * Specify the implementation module details
  * Structure the implementation
  November:
  * Develop the quadratic programming module (unconstrained quadratic program; strictly convex quadratic program)
  * Validate the quadratic programming module
  December:
  * Develop the gradient and Hessian matrix calculation module
  * Validate the gradient and Hessian matrix calculation module
  * Midterm project report and presentation

  16. January:
  * Develop the Armijo line search module
  * Validate the Armijo line search module
  February:
  * Develop the feasible initial point module
  * Validate the feasible initial point module
  * Integrate the program
  March:
  * Debug and document the program
  * Validate and test the program with a case application
  April:
  * Add the arc search variable $\tilde{d}$
  * Compare the calculation efficiency of the line search and arc search methods
  May:
  * Develop the user interface if time is available
  * Final project report and presentation

  17. Bibliography
