SLIDE 1 Pointwise second-order necessary optimality conditions and sensitivity relations
Nonlinear Partial Differential Equations and Applications
Daniel Hoehener (MIT)
joint work with Hélène Frankowska
Paris, June 20, 2016
SLIDE 2
OPTIMAL CONTROL PROBLEM
Problem: Minimize ϕ(x(1)) subject to
    ẋ(t) = f(t, x(t), u(t)) ∀ t,
    u(t) ∈ U(t) ∀ t,
    x(0) ∈ K0.
SLIDE 3
OPTIMAL CONTROL PROBLEM
Assumptions:
◮ f satisfies standard assumptions and is twice differentiable in x with Lipschitz derivative
◮ ϕ is differentiable
◮ K0 and U(t) are closed and nonempty
SLIDE 4
OPTIMAL CONTROL PROBLEM
Value function
    V(t, x0) = inf {ϕ(x(1)) | x solution of the problem starting at (t, x0)}.
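The value function can be approximated numerically by brute-force discretization: fix a time grid, enumerate piecewise-constant controls, and take the best terminal cost. A minimal sketch on a toy instance (scalar dynamics ẋ = u, U = {−1, 0, 1}, ϕ(x) = x²; none of this data is from the talk):

```python
import itertools

# Toy instance (illustrative, not from the talk): minimize phi(x(1)) = x(1)^2
# subject to x'(t) = u(t), u(t) in U = {-1, 0, 1}, x(0) = x0.

def simulate(x0, controls, dt):
    """Forward Euler for x' = f(t, x, u) with f(t, x, u) = u."""
    x = x0
    for u in controls:
        x += dt * u
    return x

def value(t0, x0, n=6):
    """Approximate V(t0, x0) = inf phi(x(1)) over piecewise-constant
    controls on [t0, 1] with n equal steps (exhaustive enumeration)."""
    dt = (1.0 - t0) / n
    return min(simulate(x0, controls, dt) ** 2
               for controls in itertools.product([-1.0, 0.0, 1.0], repeat=n))

print(value(0.0, 0.5))   # ≈ 0.0: the origin is reachable from x0 = 0.5
print(value(0.0, 2.0))   # ≈ 1.0: at most one unit of decrease is available
```

Real instances would replace the enumeration with dynamic programming; the point here is only the definition of V as an infimum over admissible controls.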
SLIDE 5 INTRODUCTION
Maximum Principle
Let (x̄, ū) be an optimal solution. The solution p̄ of
    −ṗ̄(t) = fx(t, x̄(t), ū(t))ᵀ p̄(t),   −p̄(1) = ∇ϕ(x̄(1)),
satisfies for all t
i) ⟨p̄(0), k0⟩ ≤ 0 ∀ k0 ∈ T♭_K0(x̄(0)),
ii) ⟨p̄(t), f(t, x̄(t), ū(t))⟩ = max_{u ∈ U(t)} ⟨p̄(t), f(t, x̄(t), u)⟩.
SLIDE 6 INTRODUCTION
Maximum Principle (as on Slide 5)
◮ Pointwise necessary condition
SLIDE 7 INTRODUCTION
Maximum Principle
Let (x̄, ū) be an optimal solution. The solution p̄ of
    −ṗ̄(t) = Hx[t],   −p̄(1) = ∇ϕ(x̄(1)),
satisfies
    ⟨p̄(0), k0⟩ ≤ 0 ∀ k0 ∈ T♭_K0(x̄(0)),
    H[t] = max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u).
◮ Pointwise necessary condition
◮ H(t, x, p, u) = ⟨p, f(t, x, u)⟩
◮ [t] = (t, x̄(t), p̄(t), ū(t))
SLIDE 8 INTRODUCTION
Maximum Principle
Let (x̄, ū) be an optimal solution. The solution p̄ of
    −ṗ̄(t) = Hx[t],   −p̄(1) = ∇ϕ(x̄(1)),
satisfies
    p̄(0) ∈ N♭_K0(x̄(0)),
    H[t] = max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u).
◮ Pointwise necessary condition
◮ H(t, x, p, u) = ⟨p, f(t, x, u)⟩
◮ N♭_K(x) = {p | ⟨p, k⟩ ≤ 0 ∀ k ∈ T♭_K(x)}
SLIDE 9 INTRODUCTION
Maximum Principle (as on Slide 8)
Let x̄ be an optimal solution. The solution p̄ of
    −ṗ̄(t) = Hx[t],   −p̄(1) = ∇ϕ(x̄(1)),
satisfies
    −p̄(t) ∈ ∂+_x V(t, x̄(t)) ∀ t.
◮ ∂+_x is the superdifferential in x.
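The adjoint arc p̄ is obtained by integrating a linear ODE backward from the transversality condition. A sketch for a hypothetical scalar instance (f(t, x, u) = a·x + u and ϕ(x) = x are assumptions of this illustration, not from the talk), where the exact solution p̄(t) = −e^{a(1−t)} is available for comparison:

```python
import math

# Hypothetical scalar instance: f(t, x, u) = a*x + u (so f_x = a), phi(x) = x.
# The adjoint system of the Maximum Principle reads
#   -p'(t) = f_x * p(t),   -p(1) = phi'(x(1)) = 1,
# i.e. p' = -a*p with p(1) = -1, whose exact solution is p(t) = -exp(a*(1-t)).

def adjoint(a, n=100000):
    """Integrate the adjoint equation backward from t = 1 by explicit Euler."""
    dt = 1.0 / n
    p = -1.0                      # terminal condition: -p(1) = 1
    traj = [p]
    for _ in range(n):            # step backward: p(t-dt) = p(t) + dt*a*p(t)
        p = p + dt * a * p
        traj.append(p)
    traj.reverse()                # traj[0] is now p(0)
    return traj

p = adjoint(0.5)
print(p[0], -math.exp(0.5))      # p(0) ≈ -exp(0.5) ≈ -1.6487
```

For this toy instance H(t, x, p, u) = p(ax + u), so the maximality condition selects u(t) = sign(p̄(t)) on U = [−1, 1]; since p̄ < 0 throughout, the candidate control is ū ≡ −1.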
SLIDE 10 INTRODUCTION II
2nd-order pointwise conditions
◮ Goh conditions
  ◮ U is time independent
  ◮ Requires structural assumptions on U
◮ Jacobson conditions
  ◮ continuous optimal control
  ◮ optimal control in interior
  ◮ U is time independent
SLIDE 11 INTRODUCTION II
2nd-order pointwise conditions (as on Slide 10)
2nd-order sensitivity relations
◮ Investigated using matrix Riccati equations
◮ Relationship with regularity
SLIDE 12 INTRODUCTION II
2nd-order pointwise conditions and sensitivity relations (as on Slides 10–11)
◮ Link to higher order necessary conditions?
SLIDE 13 PRELIMINARIES
Tangent cone
    T♭_K(x) = Liminf_{h→0+} (K − x)/h
2nd-order tangent
    T♭(2)_K(x, u) = Liminf_{h→0+} (K − x − hu)/h²
Normal cone
    N♭_K(x) = {p | ⟨p, u⟩ ≤ 0 ∀ u ∈ T♭_K(x)}
2nd-order normal
    N♭(2)_K(x, q) = {Q ∈ S | ⟨q, v⟩ + ½⟨Qu, u⟩ ≤ 0 ∀ u ∈ T♭_K(x) ∩ {q}⊥, ∀ v ∈ T♭(2)_K(x, u)}
SLIDE 14 PRELIMINARIES
Tangent cone T♭
K(x) = Liminf h→0+
K − x h 2nd-order tangent T♭(2)
K
(x, u) = Liminf
h→0+
K − hu − x h2 Normal cone N♭
K(x) =
∀u ∈ T♭
K(x)
N♭(2)
K
(x, q) =
2 Qu, u ≤ 0 ∀u ∈ T♭
K(x) ∩ {q}⊥, v ∈ T♭(2) K
(x, u)
J2,+f(x) =
- (q, Q)
- f(y) − f(x) ≤ q, y − x + 1
2 Q(y − x), y − x + o(|y − x|2), ∀y
J2,−f(x) =
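The jet definitions can be sanity-checked numerically: for a C² function f, the pair (f′(x), f″(x)) belongs to both J2,+f(x) and J2,−f(x), since the Taylor remainder is o(|y − x|²). A small sketch (the choice f = sin is arbitrary, for illustration only):

```python
import math

# For a C^2 function, (f'(x), f''(x)) lies in both jets J^{2,+}f(x) and
# J^{2,-}f(x): the second-order Taylor remainder is o(|y - x|^2).

def taylor_remainder_ratio(f, q, Q, x, h):
    """(f(x+h) - f(x) - q*h - 0.5*Q*h*h) / h^2, which must tend to 0 as h -> 0."""
    return (f(x + h) - f(x) - q * h - 0.5 * Q * h * h) / (h * h)

x = 0.3
q, Q = math.cos(x), -math.sin(x)   # first and second derivatives of sin at x
for h in (1e-1, 1e-2, 1e-3):
    print(h, taylor_remainder_ratio(math.sin, q, Q, x, h))
# The printed ratios shrink roughly linearly in h (the next Taylor term is O(h)).
```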
SLIDE 15 SECOND-ORDER MAXIMUM PRINCIPLE
Theorem.
Let (x̄, ū, p̄) be an optimal solution and adjoint state. Then for every Ψ ∈ S such that (∇ϕ(x̄(1)), Ψ) ∈ J2,+ϕ(x̄(1)), the solution W of
    Ẇ(t) = −Hpx[t]W(t) − W(t)Hxp[t] − Hxx[t],   W(1) = −Ψ,
satisfies
i) W(0) ∈ N♭(2)_K0(x̄(0); p̄(0)),
ii) max_{(v,M) ∈ F̄(t)} ⟨Mᵀp̄(t) + W(t)v, v⟩ ≤ 0   a.e. in [0, 1].
Notation
    F̄(t) = co {(f(t, x̄(t), u), fx(t, x̄(t), u)) − (f[t], fx[t]) | u ∈ Ū(t)}
    Ū(t) = {z ∈ U(t) | z ∈ arg max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u)}
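The matrix W in the theorem solves a linear Lyapunov-type equation and can be integrated backward from W(1) = −Ψ exactly like the adjoint. A sketch with constant 2×2 placeholder data A (standing for Hpx[t]) and C (standing for Hxx[t]), chosen here purely for illustration:

```python
# Backward integration of  W'(t) = -A W(t) - W(t) A^T - C,  W(1) = -Psi,
# with constant placeholder matrices (not data from the talk).

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def lin(*terms):
    """Linear combination of 2x2 matrices, given as (coefficient, matrix) pairs."""
    return [[sum(c * M[i][j] for c, M in terms) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def solve_W(A, C, Psi, n=20000):
    dt = 1.0 / n
    W = lin((-1.0, Psi))                 # terminal condition W(1) = -Psi
    for _ in range(n):                   # explicit Euler, stepping t -> t - dt
        dW = lin((-1.0, mul(A, W)), (-1.0, mul(W, transpose(A))), (-1.0, C))
        W = lin((1.0, W), (-dt, dW))
    return W                             # approximation of W(0)

A = [[0.0, 1.0], [0.0, 0.0]]
C = [[1.0, 0.0], [0.0, 0.0]]
Psi = [[0.0, 0.0], [0.0, 0.0]]
W0 = solve_W(A, C, Psi)
print(W0)   # for this data the exact solution gives W(0) = [[1, 0], [0, 0]]
```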
SLIDE 16
SECOND-ORDER SENSITIVITY RELATIONS
Theorem (Backward propagation).
Let x̄ be optimal with x̄(t0) = x0 and p̄, W and Ψ be as in the maximum principles. Then
    (−p̄(t), −W(t)) ∈ J2,+_x V(t, x̄(t)) ∀ t ∈ [t0, 1].
SLIDE 17
SECOND-ORDER SENSITIVITY RELATIONS
Theorem (Backward propagation) as on Slide 16.
Theorem (Forward propagation).
Let x̄ and p̄ be as above. If for some W0 ∈ S we have (−p̄(t0), −W0) ∈ J2,−_x V(t0, x0), then for W solving
    Ẇ(t) + Hpx[t]W(t) + W(t)Hxp[t] + Hxx[t] = 0,   W(t0) = W0,
the following sensitivity relation holds true:
    (−p̄(t), −W(t)) ∈ J2,−_x V(t, x̄(t)) ∀ t ∈ [t0, 1].
SLIDE 18
JACOBSON TYPE NECESSARY CONDITIONS I
Relaxed Assumptions:
◮ f(t, ·, ·) is twice differentiable with Lipschitz derivatives
◮ ϕ is twice differentiable
SLIDE 19 JACOBSON TYPE NECESSARY CONDITIONS I
Relaxed Assumptions (as on Slide 18)
Objectives:
1. Replace Ū(t) = {z ∈ U(t) | z ∈ arg max_{u ∈ U(t)} H(t, x̄(t), p̄(t), u)} with a more “explicit” set (tangents)
2. Pass to the limit in the expression
    max_{(v,M) ∈ F̄(t)} ⟨Mᵀp̄(t) + W(t)v, v⟩   a.e. in [0, 1]
SLIDE 20 JACOBSON TYPE NECESSARY CONDITIONS II
Theorem.
Let x̄, ū, p̄ and W be as in the second-order maximum principle with Ψ := ϕ″(x̄(1)). For a.e. t ∈ [0, 1] and for every u ∈ T♭_U(t)(ū(t)) such that either
(i) ⟨Hu[t], u⟩ = 0 and ⟨Hu[t], v⟩ + ½ uᵀHuu[t]u = 0 for some v ∈ T♭(2)_U(t)(ū(t), u), or
(ii) Hu[t] = 0 and uᵀHuu[t]u = 0,
we have
    ⟨fu[t]ᵀ(Hux[t] + W(t)fu[t])u, u⟩ ≤ 0.
SLIDE 21 INEQUALITY CONSTRAINTS
Assumptions:
◮ U(t) = ∩_{j=1}^s {u | cj(t, u) ≤ 0}
◮ cj twice differentiable in u
◮ {∇u cj(t, ū(t))}_{j=1}^s are linearly independent
SLIDE 22 INEQUALITY CONSTRAINTS
Assumptions (as on Slide 21)
Corollary.
Let x̄, ū, p̄ and W be as in the Jacobson conditions. Then there exist measurable, uniquely defined αj : [0, 1] → R+ such that for a.e. t,
(i) αj(t) cj(t, ū(t)) = 0 for all j ∈ {1, …, s};
(ii) Hu[t] = Σ_{j=1}^s αj(t) ∇u cj(t, ū(t));
(iii) max_{u ∈ U0(t)} ⟨fu[t]ᵀ(Hux[t] + W(t)fu[t])u, u⟩ = 0,
where
    U0(t) := {u ∈ T♭_U(t)(ū(t)) | (Huu[t] − Σ_{j=1}^s αj(t) cj_uu(t, ū(t))) u = 0}.
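In the corollary, complementarity (i) forces αj(t) = 0 for inactive constraints, and stationarity (ii) then determines the remaining multipliers from the linearly independent active gradients. A pointwise sketch in R² (the gradients below match two of the example's constraints; the vector Hu is made up for illustration):

```python
# Solve Hu = sum over active j of alpha_j * grad_u c_j in R^2, with
# alpha_j = 0 for inactive constraints (complementarity).

def multipliers(Hu, grads, active):
    """Return the multiplier vector for the 2-dimensional case (Cramer's rule)."""
    alpha = [0.0] * len(grads)
    if len(active) == 1:
        j = active[0]
        g = grads[j]
        # projection onto the single active gradient direction
        alpha[j] = (Hu[0] * g[0] + Hu[1] * g[1]) / (g[0] ** 2 + g[1] ** 2)
    elif len(active) == 2:
        j, k = active
        g, h = grads[j], grads[k]
        det = g[0] * h[1] - g[1] * h[0]     # nonzero by linear independence
        alpha[j] = (Hu[0] * h[1] - Hu[1] * h[0]) / det
        alpha[k] = (g[0] * Hu[1] - g[1] * Hu[0]) / det
    return alpha

# c1(u) = u2 - u1 <= 0 and c2(u) = u1 - 1 <= 0 (two of the example's
# constraints), both active, with a made-up Hu = (1, 1):
grads = [(-1.0, 1.0), (1.0, 0.0)]
print(multipliers((1.0, 1.0), grads, [0, 1]))   # [1.0, 2.0]
```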
SLIDE 23 EXAMPLE PART I
◮ f(x, u) = (u1, u1 + u2, −x1x2 + 9u2²)ᵀ
◮ ϕ(x) = x3
◮ K0 = {(0, 0, 0)}
◮ U = {(u1, u2) | 0 ≤ u2 ≤ u1 ≤ 1}
SLIDE 24 EXAMPLE PART I
Setup as on Slide 23. Let (x̄, ū) be a candidate for optimality.
Observations I
◮ x̄1 and x̄2 are nonnegative and increasing
◮ p̄3 ≡ −1
◮ p̄1 and p̄2 are nonnegative and decreasing
SLIDE 25 EXAMPLE PART I
Setup and Observations I as on Slides 23–24.
Hamiltonian
◮ H[t] = (p̄1(t) + p̄2(t)) ū1(t) + (p̄2(t) − 9ū2(t)) ū2(t) + x̄1(t)x̄2(t)
SLIDE 26 EXAMPLE PART II
Setup as on Slide 23. Case: ū1 ≡ 0.
◮ (x̄, ū) ≡ 0, (p̄1, p̄2) ≡ 0, Hu[t] = 0
SLIDE 27 EXAMPLE PART II
Case ū1 ≡ 0 (as on Slide 26).
Critical Directions:
◮ u ∈ T♭_U(0) with uᵀHuu[t]u = 0
SLIDE 28 EXAMPLE PART II
Case ū1 ≡ 0 (as on Slides 26–27).
Critical Directions:
◮ u ∈ T♭_U(0) with uᵀHuu[t]u = 0, i.e. u2 = 0
Second-order maximality condition:
◮ required: 0 ≥ ⟨W(t)fu[t]u, fu[t]u⟩ = 2u1²(1 − t), which fails for u1 ≠ 0 and t < 1, so the candidate ū ≡ 0 is ruled out
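The example's second-order computation can be checked numerically. Assuming the dynamics f(x, u) = (u1, u1 + u2, −x1x2 + 9u2²)ᵀ, the form consistent with the Hamiltonian on Slide 25, along the candidate ū ≡ 0 one has fx = 0 and constant Hxx, so W(t) = (1 − t)Hxx, and for a critical direction u = (u1, 0):

```python
# Check of the example's second-order quantity for the candidate u1 ≡ 0
# (so x ≡ 0, (p1, p2) ≡ 0, p3 ≡ -1).  Dynamics assumed to be
# f(x, u) = (u1, u1 + u2, -x1*x2 + 9*u2^2), matching the stated Hamiltonian.
# Then Hpx = f_x = 0 and Hxx is constant, hence W(t) = (1 - t) * Hxx.

def quad_form(t, u1):
    """<W(t) f_u u, f_u u> for the critical direction u = (u1, 0)."""
    Hxx = [[0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
    W = [[(1.0 - t) * Hxx[i][j] for j in range(3)] for i in range(3)]
    fu = [[1.0, 0.0], [1.0, 1.0], [0.0, 0.0]]    # f_u at x = 0, u = 0
    v = [fu[i][0] * u1 for i in range(3)]        # v = f_u (u1, 0)
    Wv = [sum(W[i][j] * v[j] for j in range(3)) for i in range(3)]
    return sum(Wv[i] * v[i] for i in range(3))

# <W(t) f_u u, f_u u> = 2*u1^2*(1 - t) > 0 for u1 != 0 and t < 1, violating
# the second-order condition, so the candidate u ≡ 0 is excluded:
print(quad_form(0.25, 3.0))   # 2 * 9 * 0.75 = 13.5
```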
SLIDE 29 EXAMPLE PART III
Setup as on Slide 23. Case: ū2 > 0 on a set of positive measure.
◮ For all t, only one tuple (ū1, ū2) satisfies the (first-order) maximality condition.
SLIDE 30 EXAMPLE PART III
Case ū2 > 0 on a set of positive measure (as on Slide 29).
Conclusion: The only candidate that satisfies the first- and second-order conditions is a global optimum.
SLIDE 31
CONCLUSIONS AND FUTURE WORK
Main results
◮ Second-order maximum principle for general control constraints
◮ Jacobson type optimality conditions without structural assumptions on the optimal control
◮ Second-order sensitivity relations analogous to the first-order case
SLIDE 32
CONCLUSIONS AND FUTURE WORK
Main results (as on Slide 31)
Future work
◮ General terminal constraints
◮ Sufficient conditions
◮ State constraints
SLIDE 33
CONCLUSIONS AND FUTURE WORK
Main results and future work (as on Slides 31–32)
Thank you