Pointwise second-order necessary optimality conditions and sensitivity relations - PowerPoint PPT Presentation



SLIDE 1

Pointwise second-order necessary optimality conditions and sensitivity relations

Nonlinear Partial Differential Equations and Applications

Daniel Hoehener (MIT), joint work with Hélène Frankowska

Paris, June 20, 2016


SLIDE 4

OPTIMAL CONTROL PROBLEM

Problem: Minimize ϕ(x(1)) subject to

  ẋ(t) = f(t, x(t), u(t))  ∀ t,
  u(t) ∈ U(t)  ∀ t,
  x(0) ∈ K0.

Assumptions:

◮ f satisfies standard assumptions and is twice differentiable in x with Lipschitz derivative
◮ ϕ is differentiable
◮ K0 and U(t) are closed and nonempty

Value function:

  V(t, x0) = inf { ϕ(x(1)) | x solution of the problem starting at (t, x0) }.
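For intuition, the problem can be discretized and the cost of any admissible control evaluated directly. A minimal sketch, assuming an illustrative scalar instance chosen here (dynamics f(t, x, u) = u − x, cost ϕ(x) = x², x(0) = 1; none of these are data from the talk):

```python
import math

# Hypothetical instance of the abstract problem above (f, phi, x0 are
# illustrative choices, not taken from the talk): minimize phi(x(1)) = x(1)^2
# for the scalar dynamics x'(t) = u(t) - x(t), x(0) = 1.
def cost(u, x0=1.0, n=1000):
    """Euler discretization of x' = f(t, x, u) = u(t) - x(t); returns phi(x(1))."""
    h = 1.0 / n
    x = x0
    for k in range(n):
        t = k * h
        x += h * (u(t) - x)        # x_{k+1} = x_k + h * f(t_k, x_k, u(t_k))
    return x * x                   # phi(x(1)) = x(1)^2

# With the open-loop control u ≡ 0 the state decays like e^{-t},
# so phi(x(1)) is close to e^{-2}.
print(cost(lambda t: 0.0))
```

Replacing the hand-rolled Euler loop by a stiff-aware integrator would be the natural next step, but the point is only that V(0, x0) is an infimum over such control-dependent costs.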


SLIDE 9

INTRODUCTION

Maximum Principle. Let (x̄, ū) be an optimal solution. The solution p̄ of

  −˙p̄(t) = Hx[t],   −p̄(1) = ∇ϕ(x̄(1)),

satisfies

  1. p̄(0) ∈ N♭_{K0}(x̄(0))
  2. H[t] = max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u).

◮ Pointwise necessary condition
◮ H(t, x, p, u) = ⟨p, f(t, x, u)⟩
◮ [t] = (t, x̄(t), p̄(t), ū(t))
◮ N♭_K(x) = { q | ⟨q, k⟩ ≤ 0  ∀ k ∈ T♭_K(x) }

Sensitivity Relations. Let x̄ be an optimal solution. The solution p̄ of the same adjoint system satisfies

  −p̄(t) ∈ ∂⁺_x V(t, x̄(t))  ∀ t.

◮ ∂⁺_x is the superdifferential in x.
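The adjoint system can be integrated backward numerically. A sketch for an illustrative instance assumed here (not from the talk): dynamics ẋ = u with U = [−1, 1] and ϕ(x) = x, so fx ≡ 0, p̄ is constant, and the maximality condition picks the control that steers x(1) as low as possible.

```python
# Illustrative instance (not from the talk): x' = u, U = [-1, 1], phi(x) = x.
# Adjoint: -p'(t) = f_x^T p(t) with f_x = 0, terminal condition -p(1) = phi'(x(1)) = 1.
def adjoint(n=100):
    """Integrate -p' = f_x p backward by Euler; f_x = 0 here, so p is constant."""
    h = 1.0 / n
    p = -1.0                       # p(1) = -grad phi(x(1)) = -1
    trajectory = [p]
    for _ in range(n):
        p -= h * 0.0               # p' = -f_x p = 0
        trajectory.append(p)
    return trajectory

p = adjoint()
# Maximality condition: u maximizing <p(t), f(t, x, u)> = p * u over U = [-1, 1].
U = [-1.0, 0.0, 1.0]               # extreme and interior candidates suffice (p*u is linear)
u_star = max(U, key=lambda u: p[0] * u)
print(p[0], u_star)
```

Here the maximizer is u = −1, which indeed minimizes x(1) = x(0) + ∫u, matching the pointwise character of the condition: it is checked separately at each t.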

SLIDE 12

INTRODUCTION II

2nd-order pointwise conditions

◮ Goh conditions
  ◮ U is time independent
  ◮ requires structural assumptions on U

◮ Jacobson conditions
  ◮ continuous optimal control
  ◮ optimal control in the interior of U
  ◮ U is time independent

2nd-order sensitivity relations

◮ Investigated using matrix Riccati equations
◮ Relationship with the regularity of the value function

◮ Link to higher-order necessary conditions?


SLIDE 14

PRELIMINARIES

Tangent cone:       T♭_K(x) = Liminf_{h→0+} (K − x)/h

2nd-order tangent:  T♭(2)_K(x, u) = Liminf_{h→0+} (K − x − hu)/h²

Normal cone:        N♭_K(x) = { q | ⟨q, u⟩ ≤ 0  ∀ u ∈ T♭_K(x) }

2nd-order normal:   N♭(2)_K(x, q) = { Q ∈ S | ⟨q, v⟩ + ½⟨Qu, u⟩ ≤ 0  ∀ u ∈ T♭_K(x) ∩ {q}⊥, v ∈ T♭(2)_K(x, u) }

Superjet:  J²,⁺f(x) = { (q, Q) | f(y) − f(x) ≤ ⟨q, y − x⟩ + ½⟨Q(y − x), y − x⟩ + o(|y − x|²)  ∀ y }

Subjet:    J²,⁻f(x) = { (q, Q) | f(y) − f(x) ≥ ⟨q, y − x⟩ + ½⟨Q(y − x), y − x⟩ + o(|y − x|²)  ∀ y }
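To make the superjet definition concrete: for the smooth function f(x) = −x² at x = 0 (an illustrative choice made here, not an example from the talk), the Taylor pair (q, Q) = (f′(0), f″(0)) = (0, −2) belongs to J²,⁺f(0), since the second-order upper bound holds with equality. A quick numerical check of the defining inequality:

```python
# Check (q, Q) = (0, -2) ∈ J^{2,+} f(0) for f(x) = -x^2:
# the inequality f(y) - f(0) <= q*y + (1/2)*Q*y^2 + o(|y|^2) must hold near 0.
def f(x):
    return -x * x

q, Q = 0.0, -2.0
for k in range(1, 200):
    y = (-1) ** k * k / 1000.0         # sample points on both sides of 0
    lhs = f(y) - f(0.0)
    rhs = q * y + 0.5 * Q * y * y      # here both sides equal -y^2
    assert lhs <= rhs + 1e-12
print("superjet inequality verified on sampled points")
```

For a C² function the superjet and subjet both contain the Taylor pair; the definitions matter precisely when f (here, the value function V) is not twice differentiable.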
SLIDE 15

SECOND-ORDER MAXIMUM PRINCIPLE

Theorem. Let (x̄, ū) be an optimal solution with adjoint state p̄. Then for every Ψ ∈ S such that (∇ϕ(x̄(1)), Ψ) ∈ J²,⁺ϕ(x̄(1)), the solution W of

  Ẇ(t) = −Hpx[t]W(t) − W(t)Hxp[t] − Hxx[t],   W(1) = −Ψ,

satisfies

  i)  W(0) ∈ N♭(2)_{K0}(x̄(0); p̄(0))
  ii) max_{(v,M) ∈ F̄(t)} ⟨Mᵀp̄(t) + W(t)v, v⟩ = 0  a.e. in [0, 1].

Notation:

  F̄(t) = co { (f(t, x̄(t), u), fx(t, x̄(t), u)) − (f[t], fx[t]) | u ∈ Ū(t) }
  Ū(t) = { z ∈ U(t) | z ∈ arg max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u) }


SLIDE 17

SECOND-ORDER SENSITIVITY RELATIONS

Theorem (Backward propagation). Let x̄ be optimal with x̄(t0) = x0 and let p̄, W and Ψ be as in the maximum principles. Then

  (−p̄(t), −W(t)) ∈ J²,⁺_x V(t, x̄(t))  ∀ t ∈ [t0, 1].

Theorem (Forward propagation). Let x̄ and p̄ be as above, and suppose that (−p̄(t0), −W0) ∈ J²,⁻_x V(t0, x0) for some W0 ∈ S. Then the solution W of

  Ẇ(t) + Hpx[t]W(t) + W(t)Hxp[t] + Hxx[t] = 0,   W(t0) = W0,

satisfies the sensitivity relation

  (−p̄(t), −W(t)) ∈ J²,⁻_x V(t, ¯x(t))  ∀ t ∈ [t0, 1].


SLIDE 19

JACOBSON TYPE NECESSARY CONDITIONS I

Relaxed assumptions:

◮ f(t, ·, ·) is twice differentiable with Lipschitz derivatives
◮ ϕ is twice differentiable

Objectives:

1. Replace Ū(t) = { z ∈ U(t) | z ∈ arg max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u) } with a more "explicit" set (tangents)
2. Pass to the limit in the expression

  max_{(v,M) ∈ F̄(t)} ⟨Mᵀp̄(t) + W(t)v, v⟩ = 0  a.e. in [0, 1]

SLIDE 20

JACOBSON TYPE NECESSARY CONDITIONS II

Theorem. Let x̄, ū, p̄ and W be as in the second-order maximum principle with Ψ := ϕ″(x̄(1)). For a.e. t ∈ [0, 1] and for every u ∈ T♭_{U(t)}(ū(t)) such that either

  (i)  Hu[t] ≠ 0, Hu[t]u = 0 and Hu[t]v + ½ uᵀHuu[t]u = 0 for some v ∈ T♭(2)_{U(t)}(ū(t), u), or
  (ii) Hu[t] = 0 and uᵀHuu[t]u = 0,

we have

  ⟨fu[t]ᵀ(Hux[t] + W(t)fu[t])u, u⟩ ≤ 0.

SLIDE 22

INEQUALITY CONSTRAINTS

Assumptions:

◮ U(t) = ∩_{j=1}^{s} { u | cj(t, u) ≤ 0 }
◮ cj is twice differentiable in u
◮ {∇ucj(t, ū(t))}_{j=1}^{s} are linearly independent

Corollary. Let x̄, ū, p̄ and W be as in the Jacobson conditions. Then there exist measurable, uniquely defined αj : [0, 1] → R⁺ such that for a.e. t,

  (i)   αj(t)cj(t, ū(t)) = 0 for all j ∈ {1, …, s};
  (ii)  Hu[t] = Σ_{j=1}^{s} αj(t)∇ucj(t, ū(t));
  (iii) max_{u ∈ U0(t)} ⟨fu[t]ᵀ(Hux[t] + W(t)fu[t])u, u⟩ = 0,

where

  U0(t) := { u ∈ T♭_{U(t)}(ū(t)) | Hu[t]u = 0 and uᵀ(Huu[t] − Σ_{j=1}^{s} αj(t)cj,uu(t, ū(t)))u = 0 }.


SLIDE 25

EXAMPLE PART I

◮ f(x, u) = (u1, u1 + u2, −x1x2 + 9u2²)ᵀ
◮ ϕ(x) = x3
◮ K0 = {(0, 0, 0)}
◮ U = { u ∈ R² | 0 ≤ u2 ≤ u1 ≤ 1 }

Let (x̄, ū) be a candidate for optimality.

Observations I:

◮ x̄1 and x̄2 are nonnegative and increasing
◮ p̄3 ≡ −1
◮ p̄1 and p̄2 are nonnegative and decreasing

Hamiltonian:

  H[t] = (p̄1(t) + p̄2(t))ū1(t) + (p̄2(t) − 9ū2(t))ū2(t) + x̄1(t)x̄2(t)
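The example's dynamics can be simulated directly from K0 = {(0, 0, 0)}. A small Euler sketch (the step count and the two constant test controls are arbitrary choices made here): for ū ≡ (1, 0) one has x1(t) = x2(t) = t, so x3(1) = −∫₀¹ t² dt = −1/3, while ū ≡ (0, 0) leaves x3(1) = 0.

```python
def simulate(u1, u2, n=2000):
    """Euler integration of f(x, u) = (u1, u1 + u2, -x1*x2 + 9*u2^2) from x(0) = 0;
    returns the cost phi(x(1)) = x3(1)."""
    h = 1.0 / n
    x1 = x2 = x3 = 0.0
    for _ in range(n):
        dx1 = u1
        dx2 = u1 + u2
        dx3 = -x1 * x2 + 9.0 * u2 * u2
        x1, x2, x3 = x1 + h * dx1, x2 + h * dx2, x3 + h * dx3
    return x3

# Both controls are admissible: 0 <= u2 <= u1 <= 1.
print(simulate(1.0, 0.0))          # close to -1/3
print(simulate(0.0, 0.0))          # exactly 0
```

So among constant admissible controls the cost is sensitive to u1, which is what the case analysis on the next slides exploits.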

SLIDE 28

EXAMPLE PART II

◮ f(x, u) = (u1, u1 + u2, −x1x2 + 9u2²)ᵀ
◮ ϕ(x) = x3
◮ K0 = {(0, 0, 0)}
◮ U = { u ∈ R² | 0 ≤ u2 ≤ u1 ≤ 1 }

Case 1: ū1 ≡ 0

◮ (x̄, ū) ≡ 0, (p̄1, p̄2) ≡ 0, Hu[t] = 0

Critical directions:

◮ uᵀHuu[t]u = 0 for all u ∈ { v ∈ T♭_U(0) | v2 = 0 }

Second-order maximality condition:

◮ The condition 0 ≥ ⟨W(t)fu[t]u, fu[t]u⟩ = 2u1²(1 − t) fails for u1 ≠ 0, so ū1 ≡ 0 is not optimal.
slide-29
SLIDE 29

EXAMPLE PART III

◮ f(x, u) =

  • u1

u1 + u2 −x1x2 + 9u2

2

T

◮ ϕ(x) = x3 ◮ K0 = {(0, 0, 0)} ◮ U =

  • u ∈ R2

0 ≤ u2 ≤ u1 ≤ 1

  • Case 2: ¯

u2 > 0 on set of positive measure

◮ For all t, only one tuple (¯

u1, ¯ u2) satisfies the (first-oder) maximality condition.

SLIDE 30

EXAMPLE PART III

◮ f(x, u) = (u1, u1 + u2, −x1x2 + 9u2²)ᵀ
◮ ϕ(x) = x3
◮ K0 = {(0, 0, 0)}
◮ U = { u ∈ R² | 0 ≤ u2 ≤ u1 ≤ 1 }

Case 2: ū2 > 0 on a set of positive measure

◮ For all t, only one tuple (ū1, ū2) satisfies the (first-order) maximality condition.

Conclusion: the only candidate satisfying the first- and second-order conditions is a global optimum.

SLIDE 33

CONCLUSIONS AND FUTURE WORK

Main results

◮ Second-order maximum principle for general control constraints
◮ Jacobson type optimality conditions without structural assumptions on the optimal control
◮ Second-order sensitivity relations analogous to the first-order case

Future work

◮ General terminal constraints
◮ Sufficient conditions
◮ State constraints

Thank you