Computational Optimization: Constrained Optimization


SLIDE 1

Computational Optimization

Constrained Optimization

SLIDE 2

Easiest Problem

Linear equality constraints

min f(x)   s.t.  Ax = b

where f : R^n → R, A ∈ R^(m×n), b ∈ R^m.

SLIDE 3

Null Space Representation

Let x* be a feasible point, Ax* = b. Any other feasible point can be written as x = x* + p, where Ap = 0. The feasible region is {x : x = x* + p, p ∈ N(A)}, where N(A) is the null space of A.

SLIDE 4

Example

Solve by substitution:

min x1^2 + x2^2 + x3^2   s.t.  x1 + 3x2 + 4x3 = 4

Substituting x1 = 4 - 3x2 - 4x3 from the constraint, the problem becomes the unconstrained problem

min (4 - 3x2 - 4x3)^2 + x2^2 + x3^2
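The substitution above can be checked numerically. The sketch below (plain Python, no libraries) minimizes the reduced objective by hand-rolled gradient descent; the step size 0.02 and iteration count are choices made here, not from the slides. Solving the stationarity conditions by hand gives x = (2/13, 6/13, 8/13), and the descent iterates should land there.

```python
# Minimize the reduced objective g(x2, x3) = (4 - 3*x2 - 4*x3)^2 + x2^2 + x3^2
# by plain gradient descent, then recover x1 from the constraint.

def grad_g(x2, x3):
    t = 4 - 3*x2 - 4*x3          # the substituted expression for x1
    return (-6*t + 2*x2, -8*t + 2*x3)

x2, x3 = 0.0, 0.0
for _ in range(2000):
    g2, g3 = grad_g(x2, x3)
    x2 -= 0.02 * g2              # step size chosen by hand for this problem
    x3 -= 0.02 * g3

x1 = 4 - 3*x2 - 4*x3             # recover x1 from x1 + 3*x2 + 4*x3 = 4
print(x1, x2, x3)                # -> approximately 2/13, 6/13, 8/13
```

The constraint is satisfied by construction, so the minimizer of the reduced problem is the minimizer of the original constrained problem.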

SLIDE 5

Null Space Method

x* = [4 0 0]'. Writing x = x* + Zv with

Z = [ -3  -4 ]
    [  1   0 ]
    [  0   1 ]

the problem

min x1^2 + x2^2 + x3^2   s.t.  x1 + 3x2 + 4x3 = 4

becomes

min (4 - 3v1 - 4v2)^2 + v1^2 + v2^2

since

x* + Zv = [ 4 ]   [ -3  -4 ] [ v1 ]   [ 4 - 3v1 - 4v2 ]
          [ 0 ] + [  1   0 ] [ v2 ] = [      v1       ]
          [ 0 ]   [  0   1 ]          [      v2       ]
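A quick sanity check of the parameterization, in plain Python: every point x* + Zv should satisfy the constraint exactly, because AZ = 0. The grid of v values below is arbitrary; any v works.

```python
# Verify that x* + Zv is feasible for A = [1 3 4], b = 4, for a grid of v.
A = [1, 3, 4]
b = 4
x_star = [4, 0, 0]
Z = [[-3, -4],
     [ 1,  0],
     [ 0,  1]]

def feasible(v1, v2):
    x = [x_star[i] + Z[i][0]*v1 + Z[i][1]*v2 for i in range(3)]
    return sum(A[i]*x[i] for i in range(3)) == b

print(all(feasible(v1, v2) for v1 in range(-3, 4) for v2 in range(-3, 4)))  # -> True
```

With integer data the check is exact: A(x* + Zv) = Ax* + (AZ)v = b + 0.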

SLIDE 6

Variable Reduction Method

Let A = [B N], where x* is a basic feasible solution with at most m nonzero variables corresponding to the columns of B; assume m < n and that B is nonsingular. Then

A_r = [ B^(-1) ]   ⇒   A A_r = [B N] [ B^(-1) ] = I
      [   0    ]                     [   0    ]

and

Z = [ -B^(-1) N ]
    [     I     ]

is a basis matrix for the null space of A, where

A ∈ R^(m×n),  B ∈ R^(m×m),  N ∈ R^(m×(n-m)),  I ∈ R^((n-m)×(n-m)).

SLIDE 7

Where did Z come from?

A = [1 3 4], x* = [4 0 0]'. Take A = [B N] with B = [1], N = [3 4]. Then

Z = [ -B^(-1) N ] = [ -1·[3 4] ] = [ -3  -4 ]
    [     I     ]   [    I     ]   [  1   0 ]
                                   [  0   1 ]
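The variable-reduction construction is easy to code for the single-constraint case (m = 1) shown here. The sketch below is a minimal plain-Python version; the function name is mine, and it handles only a 1×n constraint matrix with a nonzero leading entry playing the role of B.

```python
# Build Z = [-B^(-1)N; I] for a 1 x n constraint row a, taking B = [a[0]].
def variable_reduction_z(a):
    b, rest = a[0], a[1:]
    n = len(a)
    Z = [[-ni / b for ni in rest]]                       # first row: -B^(-1) N
    for i in range(n - 1):                               # remaining rows: identity
        Z.append([1.0 if j == i else 0.0 for j in range(n - 1)])
    return Z

A = [1, 3, 4]
Z = variable_reduction_z(A)
print(Z[0])                                              # -> [-3.0, -4.0]
# AZ should be zero, column by column:
print([sum(A[i] * Z[i][j] for i in range(3)) for j in range(2)])  # -> [0.0, 0.0]
```

For m > 1 the same idea applies with a matrix inverse of B in place of the scalar division.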

SLIDE 8

General Method

There exists a null space matrix Z ∈ R^(n×r) with r ≥ n - m. The feasible region is

{x | x = x* + Zv}

which gives the equivalent "reduced" problem

min_v f(x* + Zv)

SLIDE 9

Practice Problem

min x1^2 + x2^2 + x3^2   s.t.  x1 + 3x2 + 4x3 = 4,   x2 - x3 = 0

SLIDE 10

Optimality Conditions

Assume a feasible point x* and convert to the null space formulation. Let y = x* + Zv and define g(v) = f(x* + Zv). Then

∇g(v) = Z' ∇f(x* + Zv) = Z' ∇f(y)

∇²g(v) = Z' ∇²f(x* + Zv) Z = Z' ∇²f(y) Z
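The chain rule above can be verified numerically. The sketch below uses the running example f(x) = x1^2 + x2^2 + x3^2 with x* = (4, 0, 0) and the Z from the earlier slides, and compares the formula Z'∇f(x* + Zv) against central finite differences of g; the test point v and step h are arbitrary choices.

```python
# Compare the reduced gradient Z' grad f(x* + Zv) with finite differences of g.
x_star = [4.0, 0.0, 0.0]
Z = [[-3, -4], [1, 0], [0, 1]]

def x_of(v):
    return [x_star[i] + Z[i][0]*v[0] + Z[i][1]*v[1] for i in range(3)]

def g(v):
    return sum(xi*xi for xi in x_of(v))          # g(v) = f(x* + Zv)

def reduced_grad(v):                             # Z' * grad f, grad f(x) = 2x
    x = x_of(v)
    return [sum(Z[i][j] * 2*x[i] for i in range(3)) for j in range(2)]

v, h = [0.3, -0.7], 1e-6
fd = [(g([v[0]+h, v[1]]) - g([v[0]-h, v[1]])) / (2*h),
      (g([v[0], v[1]+h]) - g([v[0], v[1]-h])) / (2*h)]
print(fd, reduced_grad(v))                       # the two should agree closely
```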

SLIDE 11

Lemma 14.1 Necessary Conditions (Nash + Sofer)

If x* is a local minimizer of f over {x | Ax = b}, and Z is a null space matrix, then

Z' ∇f(x*) = 0   and   Z' ∇²f(x*) Z is positive semidefinite.

Or, equivalently, use the KKT conditions:

∇f(x*) - A'λ = 0 has a solution λ*,   Ax* = b,   and   Z' ∇²f(x*) Z is positive semidefinite.

SLIDE 12

Lemma 14.2 Sufficient Conditions (Nash + Sofer)

If x* satisfies (where Z is a basis matrix for Null(A))

Ax* = b,   Z' ∇f(x*) = 0,   Z' ∇²f(x*) Z positive definite,

then x* is a strict local minimizer.

SLIDE 13

Lemma 14.2 Sufficient Conditions (KKT form)

If (x*, λ*) satisfies (where Z is a basis matrix for Null(A))

Ax* = b,   ∇f(x*) - A'λ* = 0,   Z' ∇²f(x*) Z positive definite,

then x* is a strict local minimizer.

SLIDE 14

Lagrange Multiplier

λ* is called the Lagrange multiplier. It measures the sensitivity of the optimal value to small perturbations of the constraints:

f(x̂) ≈ f(x*) + (x̂ - x*)' ∇f(x*)
     = f(x*) + (x̂ - x*)' A' λ*        (by the KKT optimality conditions)

Now let A x̂ = b + δ. Then

f(x̂) ≈ f(x*) + δ' λ* = f(x*) + Σ_{i=1}^m λ_i* δ_i
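This sensitivity interpretation can be illustrated with the example min (x² + 4y²)/2 s.t. x − y = b used on the next slides. Working the KKT conditions by hand for general b gives x = 4b/5, y = −b/5, λ* = 4b/5; the closed-form optimal value used below is that hand calculation, not something stated on the slide. Perturbing b by δ should change the optimal value by about λ*δ.

```python
# Optimal value of min (x^2 + 4y^2)/2 s.t. x - y = b, from the closed-form
# solution x = 4b/5, y = -b/5.
def f_opt(b):
    x, y = 4*b/5, -b/5
    return (x*x + 4*y*y) / 2

b, delta = 10.0, 0.001
lam = 4*b/5                                # lambda* = 8 at b = 10
change = f_opt(b + delta) - f_opt(b)
print(change, lam * delta)                 # both approximately 0.008
```

The first-order prediction λ*δ matches the actual change up to O(δ²).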

SLIDE 15

Optimality conditions

Consider

min (x^2 + 4y^2)/2   s.t.  x - y = 10

The FONC ∇f(x) - A'λ = 0, Ax = b read

[ x  ]     [  1 ]
[ 4y ] = λ [ -1 ] ,      x - y = 10

with solution x* = 8, y* = -2, λ* = 8.

SLIDE 16

Optimality conditions

Find the KKT point, then check the SOSC.

[ x  ]     [  1 ]
[ 4y ] = λ [ -1 ] ,      x - y = 10   ⇒   x* = 8, y* = -2, λ* = 8

SOSC: Z = [1; 1] spans Null(A), and

Z' ∇²f(x) Z = [1 1] [ 1  0 ] [ 1 ]
                    [ 0  4 ] [ 1 ]  = 5 > 0  (positive definite)

so the SOSC is satisfied. Or we could just observe that this is a convex program, so the FONC are sufficient.

SLIDE 17

Linear Equality Constraints - I

min (1/2)(x1^2 + 4x2^2)   s.t.  x1 - x2 = 10

∇f(x) = [ x1 ]        A = [1 -1],  b = 10
        [ 4x2 ]

FONC:  ∇f(x) = A'λ   ⇒   [ x1 ]     [  1 ]
                         [ 4x2 ] = λ [ -1 ] ,    x1 - x2 = 10

SLIDE 18

Linear Equality Constraints - II

Solve: x1 = λ and 4x2 = -λ  ⇒  x2 = -x1/4.
Then x1 - x2 = 10  ⇒  x1 + x1/4 = 10  ⇒  (5/4)x1 = 10  ⇒  x1 = 8, x2 = -2, λ = 8.

x* = [8; -2],  λ* = 8   ← KKT point
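The elimination on this slide is short enough to script directly; the sketch below just repeats the substitution in code as a check.

```python
# From x1 = lambda and 4*x2 = -lambda we get x2 = -x1/4; the constraint
# x1 - x2 = 10 then gives (5/4)*x1 = 10.
x1 = 10 / (5/4)
x2 = -x1 / 4
lam = x1
print(x1, x2, lam)    # -> 8.0 -2.0 8.0
```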

SLIDE 19

Linear Equality Constraints - III

SOSC: Z = [1; 1] is a basis for Null(A) with A = [1 -1], and

Z' ∇²f(x) Z = [1 1] [ 1  0 ] [ 1 ]
                    [ 0  4 ] [ 1 ]  = 5 > 0

so the SOSC is satisfied, and x* is a strict local minimum. The objective is convex, so the KKT conditions are sufficient.

SLIDE 20

Handy ways to compute Null Space

- Variable Reduction Method
- Orthogonal Projection Matrix
- QR factorization (best numerically)
- Z = null(A) in MATLAB

SLIDE 21

Orthogonal Projection Method

Use optimization. Minimize distance between given point c and null space of A.

min_p (1/2) ||p - c||^2   s.t.  Ap = 0

FONC:  ∇f(p*) = p* - c = A'λ*,  or equivalently  p* = c + A'λ*,  with  Ap* = 0.

SLIDE 22

Orthogonal Projection Method

Optimality conditions give us the solution

FONC:  p* - c = A'λ*,   Ap* = 0

⇒  0 = Ap* = Ac + AA'λ*   ⇒   λ* = -(AA')^(-1) A c

⇒  p* = c + A'λ* = c - A'(AA')^(-1) A c = (I - A'(AA')^(-1) A) c

SLIDE 23

Orthogonal Projection Method

Final result:

I - A'(AA')^(-1) A  ∈  { null space matrices of A }

Note that the null space matrix is not unique.

Try it in MATLAB for A = [1 3 5; 2 4 -1]. Compare with null(A) and null(A,'r').
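For readers without MATLAB, here is a plain-Python stand-in for the same experiment: build P = I − A'(AA')^(−1)A for A = [1 3 5; 2 4 −1] and check the two defining properties, AP = 0 (columns of P lie in Null(A)) and P² = P (P is a projection). The helper functions are hand-rolled for this 2×3 case.

```python
# Orthogonal projection onto Null(A) for A = [1 3 5; 2 4 -1].
A = [[1, 3, 5], [2, 4, -1]]

def matmul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

At = [list(col) for col in zip(*A)]        # A' (3 x 2)
AAt = matmul(A, At)                        # AA' (2 x 2)
det = AAt[0][0]*AAt[1][1] - AAt[0][1]*AAt[1][0]
inv = [[ AAt[1][1]/det, -AAt[0][1]/det],   # explicit 2 x 2 inverse
       [-AAt[1][0]/det,  AAt[0][0]/det]]
M = matmul(matmul(At, inv), A)             # A'(AA')^(-1) A
P = [[(1.0 if i == j else 0.0) - M[i][j] for j in range(3)] for i in range(3)]

AP = matmul(A, P)
P2 = matmul(P, P)
print(max(abs(v) for row in AP for v in row) < 1e-12)        # -> True (AP = 0)
print(max(abs(P2[i][j] - P[i][j])
          for i in range(3) for j in range(3)) < 1e-12)      # -> True (P^2 = P)
```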

SLIDE 24

Get Lagrange Multipliers for Free!

The matrix A_r = A'(AA')^(-1) is a right inverse of A:

A A_r = AA'(AA')^(-1) = I

For general problems

min f(x)  s.t.  Ax = b:      λ* = A_r' ∇f(x*)

SLIDE 25

Let’s try it

For

min f(x) = (1/2)[x1^2 + x2^2 + x3^2 + x4^2]   s.t.  x1 + x2 + x3 + x4 = 1

we have A = [1 1 1 1] and AA' = 4, so the projection matrix is

Z = I - A'(AA')^(-1) A = I - (1/4) [1;1;1;1][1 1 1 1]

  = [  3/4  -1/4  -1/4  -1/4 ]
    [ -1/4   3/4  -1/4  -1/4 ]
    [ -1/4  -1/4   3/4  -1/4 ]
    [ -1/4  -1/4  -1/4   3/4 ]

SLIDE 26

Solve FONC for Optimal Point

FONC:

∇f(x) - A'λ = [ x1 ]     [ 1 ]
              [ x2 ]  -  [ 1 ] λ = 0,      x1 + x2 + x3 + x4 = 1
              [ x3 ]     [ 1 ]
              [ x4 ]     [ 1 ]

so x1 = x2 = x3 = x4 = λ, and the constraint gives x* = [1 1 1 1]'/4 with λ* = 1/4.

SLIDE 27

Check Optimality Conditions

For x* = [1 1 1 1]'/4 we have ∇f(x*) = [1 1 1 1]'/4 and Ax* = b. Using the Lagrangian conditions:

Z' ∇f(x*) = (I - A'(AA')^(-1) A) · (1/4)[1; 1; 1; 1] = 0

A_r = A'(AA')^(-1) = [1 1 1 1]'/4

λ* = A_r' ∇f(x*) = 1/4

Clearly ∇f(x*) = A'λ*.
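The whole worked example fits in a few lines of plain Python: build the projection matrix Z for A = [1 1 1 1], then check Z'∇f(x*) = 0 and λ* = A_r'∇f(x*) = 1/4 at x* = (1/4, 1/4, 1/4, 1/4).

```python
# Verify the optimality conditions for min (1/2)||x||^2 s.t. sum(x) = 1.
n = 4
x_star = [0.25] * n
grad = x_star[:]                           # grad f(x) = x for f = (1/2)||x||^2

# Z = I - A'(AA')^(-1)A with A = ones(1, 4): 3/4 on the diagonal, -1/4 off it.
Z = [[(1.0 if i == j else 0.0) - 0.25 for j in range(n)] for i in range(n)]

Zt_grad = [sum(Z[i][j] * grad[i] for i in range(n)) for j in range(n)]
lam = sum(0.25 * g for g in grad)          # A_r' grad f, with A_r = [1;1;1;1]/4
print(Zt_grad)                             # -> [0.0, 0.0, 0.0, 0.0]
print(lam)                                 # -> 0.25
```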

SLIDE 28

You try it

For

min f(x) = (1/2) x'Cx   s.t.  Ax = b

with C ∈ R^(4×4) symmetric, A = [ 2 1 2 1 ; 1 1 3 -1 ] ∈ R^(2×4), and b ∈ R^2:

- Find the projection matrix
- Confirm that the optimality conditions are Z'Cx* = 0, Ax* = b
- Find x*
- Compute the Lagrange multipliers
- Check the Lagrangian form of the multipliers

SLIDE 29

Variable Reduction Method

Let A = [B N], where A is m×n and B is m×m and nonsingular; assume m < n. Then

A_r = [ B^(-1) ]   ⇒   A A_r = [B N] [ B^(-1) ] = I
      [   0    ]                     [   0    ]

and

Z = [ -B^(-1) N ]
    [     I     ]

is a basis matrix for the null space of A.

SLIDE 30

Try on our example

Take, for example, the first two columns of A for B:

A = [ 2  1  2  1 ]    B = [ 2  1 ]    N = [ 2   1 ]
    [ 1  1  3 -1 ]        [ 1  1 ]        [ 3  -1 ]

Then

Z = [ -B^(-1) N ] = [  1  -2 ]      A_r = [ B^(-1) ] = [  1  -1 ]
    [     I     ]   [ -4   3 ]            [   0    ]   [ -1   2 ]
                    [  1   0 ]                         [  0   0 ]
                    [  0   1 ]                         [  0   0 ]

The condition number of Z'CZ is 158: better, but not great.

SLIDE 31

QR Factorization

Use the Gram-Schmidt algorithm to factorize A' = QR with Q orthogonal and R upper triangular:

A' = QR = [Q1 Q2] [ R1 ]
                  [  0 ]

where A is m×n, Q1 ∈ R^(n×m), Q2 ∈ R^(n×(n-m)), R1 ∈ R^(m×m). Then

Z = Q2,     A_r = Q1 R1^(-T)
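The idea can be sketched in plain Python with Gram-Schmidt: orthonormalize the columns of A' and then extend to a full orthonormal basis of R^n; the vectors added in the extension are orthogonal to the rows of A, so they play the role of Q2 = Z. The 2×4 matrix below is the one from the variable-reduction example; the helper function and the 1e-10 dependence threshold are choices made here. (In practice Householder QR, as in MATLAB's qr, is preferred numerically.)

```python
import math

A = [[2, 1, 2, 1], [1, 1, 3, -1]]
n = 4

def orthonormalize(vectors):
    """Sequential Gram-Schmidt; silently drops (nearly) dependent vectors."""
    basis = []
    for v in vectors:
        w = [float(vi) for vi in v]
        for q in basis:
            dot = sum(wi*qi for wi, qi in zip(w, q))
            w = [wi - dot*qi for wi, qi in zip(w, q)]
        norm = math.sqrt(sum(wi*wi for wi in w))
        if norm > 1e-10:
            basis.append([wi / norm for wi in w])
    return basis

# Columns of A' (= rows of A) first, then standard basis vectors to fill R^4.
cols = [list(row) for row in A]
cols += [[1.0 if j == i else 0.0 for j in range(n)] for i in range(n)]
Q = orthonormalize(cols)
Z = Q[2:]                                   # the last n - m vectors: Q2

# Each z in Z is orthogonal to both rows of A, so AZ vanishes.
residual = max(abs(sum(a[i]*z[i] for i in range(n))) for a in A for z in Z)
print(len(Z), residual < 1e-9)              # -> 2 True
```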

SLIDE 32

QR on problem

Use the MATLAB qr command: [Q, R] = qr(A'); Q2 = Q(:, 3:4); then cond(Q2'*C*Q2) = 9.79.

SLIDE 33

In Class Practice

Find the optimal solution and verify the FONC and SOSC for each of the following:
- Let the perimeter of a rectangle be fixed at 4. Find the shape of the rectangle with the largest area.
- Solve the problem max x1x2 + x2x3 + x1x3 s.t. x1 + x2 + x3 = 3.
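For the second problem, symmetry suggests the candidate x* = (1, 1, 1) with λ* = 2; that candidate is my guess, not stated on the slide. The sketch below checks the FONC at that point (the gradient of the objective is (x2+x3, x1+x3, x1+x2)) and probes random feasible directions to confirm the objective does not increase, consistent with x* being a constrained maximizer.

```python
import random

def obj(x):
    return x[0]*x[1] + x[1]*x[2] + x[0]*x[2]

x = [1.0, 1.0, 1.0]
grad = [x[1] + x[2], x[0] + x[2], x[0] + x[1]]
print(grad)            # -> [2.0, 2.0, 2.0], i.e. lambda * (1, 1, 1) with lambda = 2

# Probe feasible directions p with p1 + p2 + p3 = 0 (so x + t*p stays feasible).
random.seed(0)
worse = True
for _ in range(100):
    p = [random.uniform(-1, 1) for _ in range(2)]
    p.append(-p[0] - p[1])
    xt = [xi + 0.1*pi for xi, pi in zip(x, p)]
    worse = worse and obj(xt) <= obj(x) + 1e-12
print(worse)           # -> True: no feasible perturbation improved the objective
```

For the rectangle problem, the same KKT machinery with perimeter 2(x + y) = 4 points to the square x = y = 1.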