Solving minimax problems with feasible sequential quadratic programming


SLIDE 1

Solving minimax problems with feasible sequential quadratic programming

12/13/2013 Qianli Deng (Shally) E-mail: dqianli@umd.edu Advisor: Dr. Mark A. Austin Department of Civil and Environmental Engineering Institute for Systems Research University of Maryland, College Park, MD 20742 E-mail: austin@isr.umd.edu

SLIDE 2

Background

Feasible sequential quadratic programming (FSQP) refers to a class of sequential quadratic programming (SQP) methods in which every iterate remains feasible.

Engineering applications: the number of variables is not large, and evaluations of the objective or constraint functions and of their gradients are time-consuming.

Advantages: 1) generates feasible iterates; 2) reduces the amount of computation; 3) enjoys the same global and fast local convergence properties as standard SQP methods.

SLIDE 3

Background

The constrained minimax problem:

$$ \min_{x}\ \max_{i \in I_f} f_i(x) $$

subject to

* Bounds: $bl \le x \le bu$
* Nonlinear inequality: $g_j(x) \le 0, \quad j = 1, \dots, t_i$
* Linear inequality: $g_j(x) \equiv \langle c_j, x \rangle - d_j \le 0, \quad j = t_i + 1, \dots, n_i$
* Nonlinear equality: $h_j(x) = 0, \quad j = 1, \dots, t_e$
* Linear equality: $h_j(x) \equiv \langle a_j, x \rangle - b_j = 0, \quad j = t_e + 1, \dots, n_e$
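To make the minimax objective concrete, here is a toy unconstrained instance (an assumed example, not from the slides): minimize $\max\{(x-2)^2, (x+1)^2\}$ over $x \in \mathbb{R}$. The pointwise maximum of two convex parabolas is minimized where they intersect, at $x = 0.5$; since the max function is convex but nondifferentiable there, a derivative-free golden-section search is a simple way to recover the minimizer.

```python
# Toy minimax instance (illustrative, not from the slides):
#   min_x max{ f1(x), f2(x) },  f1(x) = (x - 2)^2,  f2(x) = (x + 1)^2
# The pointwise max of the two parabolas is minimized where they
# intersect: (x - 2)^2 = (x + 1)^2  =>  x = 0.5.

def fmax(x):
    """Minimax objective: the pointwise maximum of the component functions."""
    return max((x - 2.0) ** 2, (x + 1.0) ** 2)

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal scalar function on [a, b] by golden-section search."""
    invphi = (5 ** 0.5 - 1) / 2  # 1/phi ~ 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

x_star = golden_section(fmax, -1.0, 2.0)
print(x_star)  # ~0.5, where the minimax value is 2.25
```

The kink at the crossing point is exactly why minimax problems need specialized machinery such as FSQP: gradient-based methods cannot be applied directly to the nonsmooth max.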

SLIDE 4

Algorithm - FSQP

* Step 1. Initialization * Step 2. A search arc * Step 3. Arc search * Step 4. Updates
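The four steps above can be sketched as a loop. The sketch below is drastically simplified and purely structural: it runs on an unconstrained 1-D quadratic, so the search arc degenerates to a quasi-Newton direction and the arc search to Armijo backtracking, and the Hessian update is trivial. All names are illustrative, not FSQP's actual interface.

```python
# Structural sketch of the four FSQP steps on f(x) = (x - 3)^2.
# In the real algorithm Step 2 solves quadratic programs and iterates
# stay feasible; none of that machinery is reproduced here.

def f(x):
    return (x - 3.0) ** 2

def grad(x):
    return 2.0 * (x - 3.0)

# Step 1. Initialization: initial guess x0, Hessian estimate H0, counter k.
x, H, k = 0.0, 1.0, 0

while abs(grad(x)) > 1e-10 and k < 100:
    # Step 2. Search arc: reduces here to the quasi-Newton direction.
    d = -grad(x) / H

    # Step 3. Arc search: Armijo backtracking on the step length t.
    t, alpha = 1.0, 0.1
    while f(x + t * d) > f(x) + alpha * t * grad(x) * d:
        t *= 0.5

    # Step 4. Updates: new iterate, Hessian estimate, counter.
    x = x + t * d
    H = 2.0   # exact Hessian of the quadratic, standing in for BFGS
    k += 1

print(x)  # converges to 3.0
```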

SLIDE 5

Initialization (x, H, p, k)

* Initial guess $x_0$
* Positive penalty parameters $p_j > 0$

Objective function with a penalty on the nonlinear equality constraints:

$$ f_m(x, p) = \max_{i \in I_f} f_i(x) \;-\; \sum_{j=1}^{n_e} p_j\, h_j(x) $$
SLIDE 6

Computation of a search arc

SLIDE 7

Computation of a search arc

QP subproblem for the SQP direction $d^0_k$:

$$
\begin{aligned}
\min_{d^0}\quad & \tfrac{1}{2}\langle d^0, H_k d^0 \rangle + f'(x_k, d^0, p_k)\\
\text{s.t.}\quad & bl \le x_k + d^0 \le bu,\\
& g_j(x_k) + \langle \nabla g_j(x_k), d^0 \rangle \le 0, && j = 1, \dots, n_i,\\
& h_j(x_k) + \langle \nabla h_j(x_k), d^0 \rangle = 0, && j = 1, \dots, t_e,\\
& \langle a_j, x_k + d^0 \rangle = b_j, && j = t_e + 1, \dots, n_e.
\end{aligned}
$$

QP subproblem for the tilted direction $(d^1_k, \gamma_k)$:

$$
\begin{aligned}
\min_{(d^1, \gamma) \in \mathbb{R}^n \times \mathbb{R}}\quad & \tfrac{\eta}{2}\langle d^1 - d^0_k,\, d^1 - d^0_k \rangle + \gamma\\
\text{s.t.}\quad & bl \le x_k + d^1 \le bu,\\
& f'(x_k, d^1, p_k) \le \gamma,\\
& g_j(x_k) + \langle \nabla g_j(x_k), d^1 \rangle \le \gamma, && j = 1, \dots, t_i,\\
& \langle c_j, x_k + d^1 \rangle - d_j \le \gamma, && j = t_i + 1, \dots, n_i,\\
& h_j(x_k) + \langle \nabla h_j(x_k), d^1 \rangle \le \gamma, && j = 1, \dots, t_e,\\
& \langle a_j, x_k + d^1 \rangle = b_j, && j = t_e + 1, \dots, n_e.
\end{aligned}
$$

The search direction is the convex combination

$$ d_k = (1 - \rho_k)\, d^0_k + \rho_k\, d^1_k, \qquad \rho_k = \frac{\|d^0_k\|^{\kappa}}{\|d^0_k\|^{\kappa} + v_k}, \qquad v_k = \max\bigl(0.5,\ \|d^1_k\|^{\tau}\bigr). $$
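The combination rule can be sketched directly (the vectors and the $\kappa$, $\tau$ values below are illustrative; in FSQP they are fixed algorithm parameters):

```python
# Convex combination of the two QP directions, following the slide:
#   v_k   = max(0.5, ||d1||^tau)
#   rho_k = ||d0||^kappa / (||d0||^kappa + v_k)
#   d_k   = (1 - rho_k) * d0 + rho_k * d1
# Vectors are plain lists; kappa and tau are illustrative choices.
import math

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def blend_directions(d0, d1, kappa=2.0, tau=2.0):
    v = max(0.5, norm(d1) ** tau)
    rho = norm(d0) ** kappa / (norm(d0) ** kappa + v)
    return [(1 - rho) * a + rho * b for a, b in zip(d0, d1)], rho

d, rho = blend_directions([1.0, 0.0], [0.0, 1.0])
print(rho, d)  # 0.5 [0.5, 0.5]
```

Note the design of $\rho_k$: as $\|d^0_k\| \to 0$ near a solution, $\rho_k \to 0$, so $d_k$ collapses onto the pure SQP direction $d^0_k$ and the fast local convergence of SQP is retained.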

SLIDE 8

Quadratic programming

Strictly convex quadratic programming:

  • Unique global minimum
  • Matrix C needs to be positive definite

m = number of constraints; n = number of variables

$$ \min_{X}\ \tfrac{1}{2} X' C X + D' X \qquad \text{s.t.} \qquad AX \le B,\quad X \ge 0, $$

with $A \in \mathbb{R}^{m \times n}$, $B \in \mathbb{R}^{m \times 1}$, $C \in \mathbb{R}^{n \times n}$, $D \in \mathbb{R}^{n \times 1}$, and $X = [x_1, x_2, \dots, x_n]'$.

Extended Wolfe's simplex method

  • No derivative evaluations required

Reference: Wolfe, P. (1959). The simplex method for quadratic programming. Econometrica: Journal of the Econometric Society, 382-398.
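To see why positive definiteness of $C$ matters: in the unconstrained case the strictly convex QP has the unique minimizer $X^* = -C^{-1}D$, the solution of the stationarity system $CX + D = 0$. A hand-rolled $2 \times 2$ illustration with hypothetical data:

```python
# Unconstrained strictly convex QP:  min_X (1/2) X'CX + D'X
# has the unique minimizer X* = -C^{-1} D when C is positive definite.
# 2x2 illustration with hypothetical data, solved by Cramer's rule.

def solve2x2(C, rhs):
    """Solve the 2x2 linear system C x = rhs by Cramer's rule."""
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    assert det != 0, "C must be nonsingular"
    return [(rhs[0] * C[1][1] - rhs[1] * C[0][1]) / det,
            (C[0][0] * rhs[1] - C[1][0] * rhs[0]) / det]

C = [[2.0, 0.0], [0.0, 4.0]]      # positive definite: leading minors 2, 8 > 0
D = [-2.0, -8.0]

X = solve2x2(C, [-d for d in D])  # stationarity: C X* + D = 0
print(X)  # [1.0, 2.0]
```

If $C$ were indefinite, the objective would be unbounded below along some direction and no such unique minimizer would exist.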

SLIDE 9

Quadratic programming

Lagrangian function:

$$ L(X, \lambda) = \tfrac{1}{2} X' C X + D' X + \lambda'(AX - B) $$

Karush-Kuhn-Tucker conditions:

$$
\begin{aligned}
& \frac{\partial L}{\partial \lambda} = AX - B \le 0, \qquad
  \frac{\partial L}{\partial X} = CX + D + A'\lambda \ge 0,\\
& \lambda' \frac{\partial L}{\partial \lambda} = \lambda'(AX - B) = 0, \qquad
  X' \frac{\partial L}{\partial X} = X'(CX + D + A'\lambda) = 0, \qquad
  X \ge 0,\ \lambda \ge 0.
\end{aligned}
$$

Introducing slack, surplus, and artificial variables:

$$
\begin{aligned}
& AX + \upsilon = B\\
& CX + A'\lambda - \mu + s = -D\\
& \lambda' \upsilon = 0, \qquad \mu' X = 0\\
& X,\ \lambda,\ \mu,\ \upsilon,\ s \ge 0
\end{aligned}
$$

$\upsilon$ = slack variables; $\mu$ = surplus variables; $s$ = artificial variables; $(\lambda, \upsilon)$ and $(\mu, X)$ = complementary slack variable pairs.
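The slack-variable system can be checked numerically. The 1-D QP below is a hypothetical example (not from the slides): $\min x^2 - 4x$ subject to $x \le 1$, $x \ge 0$, whose constrained optimum sits on the boundary $x^* = 1$ with multiplier $\lambda = 2$.

```python
# The slide's slack-variable KKT system, verified on a hypothetical QP:
#   min x^2 - 4x  s.t.  x <= 1, x >= 0
# i.e. C = [2], D = [-4], A = [1], B = [1]. The constraint x <= 1 is
# active at the optimum x* = 1, with lambda = 2 and mu = 0.

C, D, A, B = 2.0, -4.0, 1.0, 1.0
x, lam = 1.0, 2.0

upsilon = B - A * x            # slack:   A x + upsilon = B
mu = C * x + D + A * lam       # surplus: C x + A'lambda - mu = -D
s = 0.0                        # artificial variables vanish at a KKT point

# Complementary slackness holds for the pairs (lambda, upsilon) and (mu, x).
print(upsilon, mu, lam * upsilon, mu * x)  # 0.0 0.0 0.0 0.0
assert min(x, lam, mu, upsilon, s) >= 0
```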

SLIDE 10

Linear programming

Quadratic programming ⇒ conditioned linear programming: the KKT system is linear, so it can be solved by the simplex method, with the complementarity conditions restricting which variables may enter the basis.

Simplex tableau: (shown on slide); the artificial variables $s$ are the initial basis variables.

SLIDE 11

Test example

SLIDE 12

Arc search

$$ \delta_k = f'(x_k, d_k, p_k) $$
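With $\delta_k$ as the expected first-order descent, an Armijo-type backtracking search along the arc $x_k + t\, d_k + t^2\, \tilde{d}_k$ can be sketched as follows (simplified to a scalar unconstrained case; the objective and step data are hypothetical, and the real FSQP test also enforces feasibility of the trial point):

```python
# Backtracking search along the arc x_k + t d_k + t^2 dtilde_k using the
# Armijo-type acceptance test  f(arc(t)) <= f(x) + alpha * t * delta,
# where delta is the expected descent (delta_k = f'(x_k, d_k, p_k) on
# the slide). Scalar sketch without constraints.

def arc_search(f, x, d, dtilde, delta, alpha=0.1, beta=0.5, t=1.0):
    """Return the first t in {1, beta, beta^2, ...} passing the Armijo test."""
    assert delta < 0, "d must be a descent direction"
    while f(x + t * d + t * t * dtilde) > f(x) + alpha * t * delta:
        t *= beta
    return t

f = lambda x: (x - 3.0) ** 2          # illustrative objective
x, d = 0.0, 6.0                       # descent direction at x = 0
delta = 2.0 * (x - 3.0) * d           # directional derivative = -36
t = arc_search(f, x, d, dtilde=0.0, delta=delta)
print(t, x + t * d)  # 0.5 3.0
```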

SLIDE 13

Updates (x, H, p, k)

Iterate update:

$$ x_{k+1} = x_k + t_k\, d_k + t_k^2\, \tilde{d}_k $$

Penalty parameter update (for the nonlinear equality constraints):

$$
p_{j,k+1} =
\begin{cases}
p_{j,k}, & \text{if } p_{j,k} \ge \mu_{j,k+1} + \varepsilon,\\[4pt]
\max\{\mu_{j,k+1} + \varepsilon,\ \delta\, p_{j,k}\}, & \text{otherwise,}
\end{cases}
$$

where $\mu_{j,k+1}$ is the QP multiplier estimate for the $j$-th equality constraint and $\varepsilon, \delta > 0$ are fixed parameters.

Iteration counter: $k \leftarrow k + 1$

SLIDE 14

Updates

$H_{k+1}$ estimates the Hessian of the Lagrangian function of $f_m(x, p)$. Define

$$ \eta_{k+1} = x_{k+1} - x_k, \qquad \gamma_{k+1} = \nabla_x f_m(x_{k+1}, p_{k+1}) - \nabla_x f_m(x_k, p_k). $$

BFGS formula with Powell's modification:

$$ \xi_{k+1} = \theta_{k+1}\, \gamma_{k+1} + (1 - \theta_{k+1})\, H_k\, \eta_{k+1} $$

$$ H_{k+1} = H_k - \frac{H_k\, \eta_{k+1} \eta_{k+1}^T H_k}{\eta_{k+1}^T H_k\, \eta_{k+1}} + \frac{\xi_{k+1}\, \xi_{k+1}^T}{\eta_{k+1}^T \xi_{k+1}} $$

$$
\theta_{k+1} =
\begin{cases}
1, & \text{if } \eta_{k+1}^T \gamma_{k+1} \ge 0.2\, \eta_{k+1}^T H_k\, \eta_{k+1},\\[6pt]
\dfrac{0.8\, \eta_{k+1}^T H_k\, \eta_{k+1}}{\eta_{k+1}^T H_k\, \eta_{k+1} - \eta_{k+1}^T \gamma_{k+1}}, & \text{otherwise.}
\end{cases}
$$

SLIDE 15

Project Schedule

  • October
  • Literature review;
  • Specify the implementation module details;
  • Structure the implementation;

November

  • Develop the quadratic programming module;
  • Unconstrained quadratic program;
  • Strictly convex quadratic program;
  • Validate the quadratic programming module;

December

  • Develop the Gradient and Hessian matrix calculation module;
  • Validate the Gradient and Hessian matrix calculation module;
  • Midterm project report and presentation;
SLIDE 16
  • January
  • Develop Armijo line search module;
  • Validate Armijo line search module;

February

  • Develop the feasible initial point module;
  • Validate the feasible initial point module;
  • Integrate the program;

March

  • Debug and document the program;
  • Validate and test the program with case application;

April

  • Add the arc search variable $\tilde{d}$ in;

  • Compare the computational efficiency of the line search and arc search methods;

May

  • Develop the user interface, if time allows;
  • Final project report and presentation;


SLIDE 17

Bibliography