An introduction to shape and topology optimization. Éric Bonnetier. PowerPoint PPT Presentation


slide-1
SLIDE 1

An introduction to shape and topology optimization

Éric Bonnetier∗ and Charles Dapogny†

∗ Institut Fourier, Université Grenoble-Alpes, Grenoble, France † CNRS & Laboratoire Jean Kuntzmann, Université Grenoble-Alpes, Grenoble, France

Fall, 2020

1 / 72

slide-2
SLIDE 2

Foreword

  • In this lecture, we focus on parametric optimization, or optimal control:
  • The shape is described by a set h of parameters, lying in a fixed vector space.
  • The state equations, accounting for the physical behavior of the shape, depend on h in a “simple” way.
  • Many key concepts and methods of the course can be presented in this framework, with a minimum amount of technicality.

[Figure: an elastic plate with cross-section S and height h(x) at the point x.]

An elastic plate can be described by its height h : S → R with respect to a fixed cross-section S.

2 / 72

slide-3
SLIDE 3

Part II Optimal control and parametric optimization problems

1 Parametric optimization problems

Presentation of the model problem Non existence of optimal design Calculation of the derivative of the objective function The formal method of Céa

2 Numerical algorithms

3 / 72

slide-4
SLIDE 4

A model problem involving the conductivity equation (I)

  • We return to the problem of optimizing the thermal conductivity h : D → R.
  • The temperature uh is the solution in H1(D) to the “state” conductivity equation:

    −div(h∇uh) = f in D,
    uh = 0 on ΓD,
    h ∂uh/∂n = g on ΓN,

where f ∈ L2(D) and g ∈ L2(ΓN).

[Figure: the considered cavity D, with boundary regions ΓD and ΓN and boundary flux g.]

  • The set Uad of design variables is:

    Uad = {h ∈ L∞(D), α ≤ h(x) ≤ β a.e. x ∈ D} ⊂ L∞(D),

where 0 < α < β are fixed bounds.

4 / 72

slide-5
SLIDE 5

A model problem involving the conductivity equation (II)

  • We consider a problem of the form:

    min_{h ∈ Uad} J(h), where J(h) = ∫D j(uh) dx,

and j : R → R is a smooth function satisfying the growth conditions:

    ∀s ∈ R, |j(s)| ≤ C(1 + |s|2) and |j′(s)| ≤ C(1 + |s|).

  • Many variants are possible, e.g. featuring constraints on h or uh.
  • In this simple setting,
  • the state uh is evaluated on the same domain D, regardless of the actual value of the design variable h ∈ Uad;
  • the design variable h acts as a parameter in the coefficients of the state equation.
  • Even in this simple case, the optimization problem has no (global) solution in general...

5 / 72

slide-6
SLIDE 6

Part II Optimal control and parametric optimization problems

1 Parametric optimization problems

Presentation of the model problem Non existence of optimal design Calculation of the derivative of the objective function The formal method of Céa

2 Numerical algorithms

6 / 72

slide-7
SLIDE 7

Non existence of optimal design (I)

  • This counter-example is discussed in detail in [All] §5.2.
  • The defining domain is the unit square D = (0, 1)2.
  • We consider two physical situations:

    −div(h∇uh,1) = 0 in D,
    h ∂uh,1/∂n = e1 · n on ΓN,1,
    h ∂uh,1/∂n = 0 on ΓN,2,

and

    −div(h∇uh,2) = 0 in D,
    h ∂uh,2/∂n = 0 on ΓN,1,
    h ∂uh,2/∂n = e2 · n on ΓN,2.

(Left) Boundary conditions on ΓN,1 and ΓN,2; (middle) boundary data for uh,1; (right) boundary data for uh,2.

7 / 72

slide-8
SLIDE 8

Non existence of optimal design (II)

The optimization problem of interest is:

    min_{h ∈ Uad} J(h),

where the considered objective function is:

    J(h) = ∫ΓN,1 e1 · n uh,1 ds − ∫ΓN,2 e2 · n uh,2 ds,

and the set Uad of admissible designs is augmented with a volume constraint:

    Uad = { h ∈ L∞(D), α < h(x) < β a.e. x ∈ D, ∫D h dx = VT }.

In other words, one aims to

  • Minimize the temperature difference between the left and right sides in Case 1.
  • Maximize the temperature difference between the top and bottom sides in Case 2.

8 / 72

slide-9
SLIDE 9

Non existence of optimal design (III)

Theorem 1.

The parametric optimization problem min_{h ∈ Uad} J(h) does not have a global solution.

Sketch of the proof: The proof unfolds in three steps:

Step 1: One calculates a lower bound m on the values of J(h) for h ∈ Uad: ∀h ∈ Uad, J(h) ≥ m.
Step 2: One proves that this lower bound is not attained by an element in Uad: ∀h ∈ Uad, J(h) > m.
Step 3: One constructs a minimizing sequence of designs hn ∈ Uad: J(hn) → m as n → ∞.

Hence, m is the infimum of J(h) over Uad, but it is not attained by any h ∈ Uad.

9 / 72

slide-10
SLIDE 10

Non existence of optimal design (IV)

The minimizing sequence is constructed as a laminate, i.e. a succession of layers of width 1/n alternating between the minimum and maximum conductivities α and β.

Two elements in a minimizing sequence hn of conductivities.

Homogenization effect: to improve their performance, designs tend to create very thin structures at the microscopic level.

10 / 72

slide-11
SLIDE 11

Non existence of optimal design (V)

  • In general, shape optimization problems, even in their simplest forms, do not have global solutions, for deep physical reasons.
  • See [Mu] for many such examples of non existence of optimal designs in optimal control problems.
  • To ensure existence of an optimal shape, two techniques are usually employed:
  • Relaxation: the set Uad of admissible designs is enlarged so that it contains “microscopic designs”: this is the essence of the Homogenization method for optimal design [All2].
  • Restriction: the set Uad is restricted to, e.g., more regular designs.
  • In practice, we shall be interested in the search for local minimizers of such problems, which are e.g. “close” to an initial design inspired by intuition.

11 / 72

slide-12
SLIDE 12

Part II Optimal control and parametric optimization problems

1 Parametric optimization problems

Presentation of the model problem Non existence of optimal design Calculation of the derivative of the objective function The formal method of Céa

2 Numerical algorithms

12 / 72

slide-13
SLIDE 13

Derivative of the objective function (I)

Let us return to our (further simplified) problem:

    min_{h ∈ Uad} J(h), where J(h) = ∫D j(uh) dx,

the set of admissible designs is:

    Uad = {h ∈ L∞(D), α ≤ h(x) ≤ β a.e. x ∈ D},

and the temperature uh is the solution in H1_0(D) to:

    −div(h∇uh) = f in D, uh = 0 on ∂D.

Remark: Again, for simplicity, we omit constraints on h or uh.

13 / 72

slide-14
SLIDE 14

Derivative of the objective function (II)

To solve this program numerically, we intend to apply a gradient-based algorithm of the form:

Initialization: Start from an initial design h0.
For n = 0, ... until convergence:
  ❶ Calculate the derivative J′(hn) of the mapping h → J(h) at h = hn;
  ❷ Select an appropriate time step τn > 0;
  ❸ Update the design as: hn+1 = hn − τn J′(hn).

The cornerstone in the practice of any such method is the calculation of the derivative of J(h).

14 / 72

slide-15
SLIDE 15

Derivative of the objective function (III)

Theorem 2.

The objective function J(h) = ∫D j(uh) dx is Fréchet differentiable at any h ∈ Uad, and its derivative reads:

    ∀ĥ ∈ L∞(D), J′(h)(ĥ) = ∫D (∇uh · ∇ph) ĥ dx,

where the adjoint state ph ∈ H1_0(D) is the unique solution to the system:

    −div(h∇ph) = −j′(uh) in D, ph = 0 on ∂D.

15 / 72

slide-16
SLIDE 16

Derivative of the objective function (IV)

Proof: The proof is divided into three steps:

❶ Using the implicit function theorem, we prove that the state mapping Uad ∋ h → uh ∈ H1_0(D) is Fréchet differentiable, with derivative ĥ → u′h(ĥ). (Here the fact that all the uh belong to a fixed functional space is handy.)
❷ We calculate the derivative of J(h) by using the chain rule.
❸ We give a more convenient structure to this derivative, introducing an adjoint state ph to eliminate the occurrence of u′h(ĥ).

Step 1: Differentiability of h → uh: For any h ∈ Uad, uh is the unique solution in H1_0(D) to the variational problem:

    ∀v ∈ H1_0(D), ∫D h∇uh · ∇v dx = ∫D fv dx.

16 / 72

slide-17
SLIDE 17

Derivative of the objective function (V)

Let F : Uad × H1_0(D) → H−1(D) be the mapping defined by:

    F(h, u) : v → ∫D h∇u · ∇v dx − ∫D fv dx.

One verifies that:

  • F is a mapping of class C1;
  • For given h ∈ Uad, uh is the unique solution u to the equation F(h, u) = 0.
  • The differential of the partial mapping u → F(h, u) reads:

    H1_0(D) ∋ û → ( v → ∫D h∇û · ∇v dx ) ∈ H−1(D);

it is an isomorphism, owing to the Lax-Milgram theorem: for all g ∈ H−1(D), there exists a unique u ∈ H1_0(D) s.t.

    ∀v ∈ H1_0(D), ∫D h∇u · ∇v dx = ⟨g, v⟩H−1(D),H1_0(D).

17 / 72

slide-18
SLIDE 18

Derivative of the objective function (VI)

The implicit function theorem guarantees that the mapping h → uh is of class C1. To calculate the derivative ĥ → u′h(ĥ), we return to the variational formulation for uh:

    ∀v ∈ H1_0(D), ∫D h∇uh · ∇v dx = ∫D fv dx.

Differentiating with respect to h in a direction ĥ ∈ L∞(D) yields:

    ∫D ĥ∇uh · ∇v dx + ∫D h∇u′h(ĥ) · ∇v dx = 0,

and so, for all ĥ ∈ L∞(D), u′h(ĥ) is the unique solution in H1_0(D) to:

    ∀v ∈ H1_0(D), ∫D h∇u′h(ĥ) · ∇v dx = −∫D ĥ∇uh · ∇v dx.

18 / 72

slide-19
SLIDE 19

Derivative of the objective function (VII)

Step 2: Calculation of the derivative of J(h): Since h → uh is of class C1, the chain rule yields immediately:

    ∀ĥ ∈ L∞(D), J′(h)(ĥ) = ∫D j′(uh) u′h(ĥ) dx.

  • This expression is awkward: the dependence ĥ → J′(h)(ĥ) is not explicit, and it is difficult to find a descent direction, i.e. a vector ĥ ∈ L∞(D) such that: J′(h)(ĥ) < 0.
  • Fortunately, the expression of J′(h) can be simplified thanks to the introduction of the adjoint state ph.

19 / 72

slide-20
SLIDE 20

Derivative of the objective function (VIII)

Step 3: Reformulation of J′(h) using an adjoint state: The adjoint state ph is the unique solution in H1_0(D) to the variational problem:

    ∀v ∈ H1_0(D), ∫D h∇ph · ∇v dx = −∫D j′(uh) v dx,

to be compared with the variational formulation for u′h(ĥ) ∈ H1_0(D):

    ∀v ∈ H1_0(D), ∫D h∇u′h(ĥ) · ∇v dx = −∫D ĥ∇uh · ∇v dx.

Then, we calculate:

    J′(h)(ĥ) = ∫D j′(uh) u′h(ĥ) dx
             = −∫D h∇ph · ∇u′h(ĥ) dx
             = −∫D h∇u′h(ĥ) · ∇ph dx
             = ∫D ĥ∇uh · ∇ph dx,

where the second line uses the variational formulation of ph with u′h(ĥ) as test function, and the last line uses the variational formulation of u′h(ĥ) with ph as test function.

20 / 72

slide-21
SLIDE 21

About the adjoint state

  • The adjoint state ph satisfies:

    −div(h∇ph) = −j′(uh) in D, ph = 0 on ∂D.

It is therefore a “virtual temperature”, driven by a source (or sink) equal to the rate of change of the integrand of J(h) at the state described by uh.

  • From the last expression, one obviously obtains a descent direction:

    ĥ = −∇uh · ∇ph ⇒ J′(h)(ĥ) < 0,

which can be interpreted as the power induced by the “virtual temperature” ph.

  • We shall soon see a second interpretation of ph, as the Lagrange multiplier associated to the PDE constraint if we formulate our optimization problem as:

    min_{(h,u)} ∫D j(u) dx s.t. −div(h∇u) = f in D, u = 0 on ∂D.

21 / 72

slide-22
SLIDE 22

Part II Optimal control and parametric optimization problems

1 Parametric optimization problems

Presentation of the model problem Non existence of optimal design Calculation of the derivative of the objective function The formal method of Céa

2 Numerical algorithms

22 / 72

slide-23
SLIDE 23

The formal method of Céa

The method of Céa is a formal way to calculate the derivative of J(h). It assumes that the mapping h → uh is differentiable. Let the Lagrangian L : Uad × H1_0(D) × H1_0(D) → R be defined by:

    L(h, u, p) = ∫D j(u) dx + ∫D h∇u · ∇p dx − ∫D fp dx,

where the first term is the objective function at stake, and the last two enforce the PDE constraint −div(h∇u) = f with a Lagrange multiplier p. In particular, for any p ∈ H1_0(D),

    J(h) = L(h, uh, p).

For a given h ∈ Uad, we search for the saddle points (u, p) of L(h, ·, ·).

23 / 72

slide-24
SLIDE 24

The formal method of Céa

  • Imposing that the partial derivative of L with respect to p vanishes amounts to:

    ∀p̂ ∈ H1_0(D), ∫D h∇u · ∇p̂ dx − ∫D f p̂ dx = 0;

this is the variational formulation for u = uh.

  • Imposing that the partial derivative of L with respect to u vanishes amounts to:

    ∀û ∈ H1_0(D), ∫D h∇p · ∇û dx = −∫D j′(u) û dx;

since u = uh, we recognize the variational formulation for p = ph.

24 / 72

slide-25
SLIDE 25

The formal method of Céa

  • Recall that, for arbitrary p ∈ H1_0(D),

    J(h) = L(h, uh, p).

  • Since we have assumed that h → uh is differentiable, the chain rule yields:

    J′(h)(ĥ) = ∂L/∂h (h, uh, p)(ĥ) + ∂L/∂u (h, uh, p)(u′h(ĥ)).

  • Now taking p = ph, the last term in the above right-hand side vanishes:

    J′(h)(ĥ) = ∂L/∂h (h, uh, ph)(ĥ).

  • The above derivative is simply that of the mapping h → ∫D h∇u · ∇p dx for given u, p. Hence:

    J′(h)(ĥ) = ∫D ĥ∇uh · ∇ph dx.

25 / 72

slide-26
SLIDE 26

The formal method of Céa: intuition

[Figure: graphs of u → J(u) and of the Lagrangian L(α, u, p), whose saddle point (uα, pα) satisfies L(α, uα, pα) = J(uα).]

  • Physical intuition: The function J(h) is “twisted” into the value L(h, uh, ph) at the saddle point of the parametrized Lagrangian L(h, ·, ·), which is easy to differentiate with respect to the parameter.

26 / 72

slide-27
SLIDE 27

Part II Optimal control and parametric optimization problems

1 Parametric optimization problems 2 Numerical algorithms

A refresher about the finite element method A refresher about basic optimization methods Numerical algorithms for parametric optimization

27 / 72

slide-28
SLIDE 28

The finite element method: variational formulations (I)

  • As a model problem, we consider the Laplace equation:

    Search for u ∈ H1_0(D) s.t. −∆u = f in D, u = 0 on ∂D,

where f ∈ L2(D) is a given source.

  • The associated variational formulation reads:

    Search for u ∈ V s.t. ∀v ∈ V, a(u, v) = ℓ(v),

where

  • the Hilbert space V is the Sobolev space H1_0(D);
  • a(·, ·) is the bilinear form on V given by: a(u, v) = ∫D ∇u · ∇v dx;
  • ℓ(·) is the linear form on V defined by: ℓ(v) = ∫D fv dx.

  • The above variational problem has a unique solution u ∈ V owing to the Lax-Milgram theorem.

28 / 72

slide-29
SLIDE 29

The finite element method: variational formulations (II)

  • The finite element method consists in searching for an approximation uh to u inside a finite-dimensional subspace Vh ⊂ V.
  • The exact variational problem is replaced by:

    Search for uh ∈ Vh s.t. ∀vh ∈ Vh, a(uh, vh) = ℓ(vh),

which is also well-posed owing to the Lax-Milgram theorem.

  • The subscript h refers to the fineness of the approximation: as h → 0, it is expected that Vh ≈ V and uh ≈ u.

29 / 72

slide-30
SLIDE 30

Meshing the physical domain (I)

In practice, the domain D is discretized by means of a mesh T , i.e. a covering by simplices (triangles in 2d, tetrahedra in 3d).

30 / 72

slide-31
SLIDE 31

Meshing the physical domain (II)

A mesh T is defined by the datum of:

  • A set of vertices {ai}i=1,...,NV ;
  • A set of (open) simplices {Tj}j=1,...,NT , with vertices in {ai}.

We also require that the mesh T be:

  • Valid: For all simplices Ti, Tj with i ≠ j, Ti ∩ Tj = ∅.
  • Conforming: For all simplices Ti, Tj, the intersection Ti ∩ Tj is either a vertex, or an edge, or a triangle (or a tetrahedron in 3d) of T.

[Figure: a valid, conforming mesh; a non conforming mesh; an invalid mesh.]

31 / 72

slide-32
SLIDE 32

Meshing the physical domain (III)

  • It is often crucial in applications that T has good quality, i.e. that its elements are close to equilateral.
  • The quality of a simplex T, with edges aj, can be evaluated e.g. by the function:

    Q(T) = α Vol(T) / ( Σ_{j=1}^{d(d+1)/2} |aj|2 )^{d/2},

where α ∈ R is such that Q(T) = 1 if T is equilateral and Q(T) = 0 if T is flat.

[Figure: a bad quality mesh, with nearly flat elements; a good quality mesh, with almost regular elements.]
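This quality measure is easy to evaluate in 2d, where d = 2 gives d(d+1)/2 = 3 edges and exponent d/2 = 1. The sketch below is our own illustration: the slides leave α abstract, and the calibration α = 4√3 (which yields Q = 1 for an equilateral triangle) as well as the function name are our assumptions.

```python
import math

def triangle_quality(p0, p1, p2):
    """Q(T) = alpha * Vol(T) / (sum of squared edge lengths)^(d/2), with d = 2.
    alpha = 4*sqrt(3) normalizes Q = 1 for an equilateral triangle; Q -> 0
    as the triangle degenerates to a flat one."""
    def d2(a, b):                      # squared edge length
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    # Triangle area from the cross product of two edge vectors
    area = 0.5 * abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                     - (p2[0] - p0[0]) * (p1[1] - p0[1]))
    s = d2(p0, p1) + d2(p1, p2) + d2(p2, p0)
    return 4.0 * math.sqrt(3.0) * area / s

# Equilateral triangle: Q close to 1; nearly flat triangle: Q close to 0
q_eq = triangle_quality((0, 0), (1, 0), (0.5, math.sqrt(3) / 2))
q_flat = triangle_quality((0, 0), (1, 0), (2, 0.001))
```

A mesh optimizer would, e.g., flip edges or relocate vertices so as to increase the minimum of Q over all elements.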

32 / 72

slide-33
SLIDE 33

Construction of the finite element space Vh (I)

  • In the finite element context, the mesh Th is labelled by the size h of its elements.
  • A basis {ϕ1, ..., ϕNh} of the finite element space Vh is based on Th.

Example: the P0 Finite element method

  • Nh is the number NT of simplices T1, ..., TNh in the mesh;
  • For i = 1, ..., Nh, ϕi is constant on each simplex T ∈ Th, and ϕi(x) = 1 on Ti, ϕi(x) = 0 for x ∉ Ti.

33 / 72

slide-34
SLIDE 34

Construction of the finite element space Vh (II)

Example: the P1 Finite element method

  • Nh is the number NV of vertices a1, ..., aNh of the mesh;
  • For i = 1, ..., Nh, ϕi is affine in restriction to each triangle T ∈ Th, and ϕi(ai) = 1, ϕi(aj) = 0 for j ≠ i.

34 / 72

slide-35
SLIDE 35

The finite element method in a nutshell (I)

Introducing the decomposition of the (sought) function uh on this basis:

    uh = Σ_{j=1}^{Nh} uj ϕj,

the variational problem becomes an Nh × Nh linear system:

    KU = F,

where

  • U = (u1, ..., uNh)^T is the vector of unknowns;
  • K is the stiffness matrix, defined by its entries: Kij = a(ϕj, ϕi), i, j = 1, ..., Nh;
  • F is the right-hand side vector: Fi = ℓ(ϕi).
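As a minimal sketch of this assembly process, consider −u″ = f on (0, 1) with u(0) = u(1) = 0, discretized with P1 elements on a uniform mesh (this 1D setting, the mesh size, and all names are our own illustrative choices, not from the slides). Each element contributes a 2 × 2 local stiffness matrix that is scattered into K, and F collects Fi = ℓ(ϕi):

```python
import numpy as np

# P1 finite elements for -u'' = f on (0,1), u(0) = u(1) = 0, uniform mesh.
# K_ij = a(phi_j, phi_i) = int phi_i' phi_j' dx, F_i = l(phi_i) = int f phi_i dx.
n = 100                        # number of elements
dx = 1.0 / n
K = np.zeros((n - 1, n - 1))   # unknowns: interior hats phi_1, ..., phi_{n-1}
ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / dx   # 2x2 local stiffness matrix
for e in range(n):             # element e spans nodes e and e+1
    for a, i in enumerate((e - 1, e)):           # interior unknown = node - 1
        for b, j in enumerate((e - 1, e)):
            if 0 <= i < n - 1 and 0 <= j < n - 1:
                K[i, j] += ke[a, b]              # scatter local into global
F = dx * np.ones(n - 1)        # for f = 1, int f phi_i dx = dx exactly
U = np.linalg.solve(K, F)      # the linear system KU = F

# In 1D, P1 solutions are nodally exact: -u'' = 1 gives u(x) = x(1-x)/2
x = np.linspace(dx, 1.0 - dx, n - 1)
err = np.max(np.abs(U - x * (1.0 - x) / 2.0))
```

In 2d or 3d the structure is identical: loop over triangles or tetrahedra, compute a small local matrix per element, and scatter it into the sparse global matrix K.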

35 / 72

slide-36
SLIDE 36

The finite element method in a nutshell (II)

Resolution of the Laplace equation with the finite element method on several domains D, using various meshes T .

36 / 72

slide-37
SLIDE 37

Some practical aspects about the finite element method

  • In practice, the discrete finite element system

KU = F is a large Nh × Nh linear system, which is sparse.

  • In realistic examples, its resolution can only be achieved thanks to iterative

methods, such as the Conjugate Gradient algorithm, GMRES, etc.

  • The numerical efficiency of such methods depends on the condition number of the matrix K, which is directly related to the quality of the computational mesh.

  • The resolution of this system can also take advantage of recent Domain

Decomposition methods.

  • In shape optimization algorithms, such systems have to be solved multiple times:

this is the main source of computational burden.

37 / 72

slide-38
SLIDE 38

Final remarks about the finite element method

  • The Finite Element paradigm extends (with some work!) to various frameworks:
  • Mixed variational formulations, like in the case of the Stokes equations;
  • Eigenvalue problems;
  • Nonlinear PDEs, such as the Navier-Stokes equations, or the nonlinear elasticity system.

  • To go further, see the introductory and reference monographs [All] and [ErnGue].

38 / 72

slide-39
SLIDE 39

Part II Optimal control and parametric optimization problems

1 Parametric optimization problems 2 Numerical algorithms

A refresher about the finite element method A refresher about basic optimization methods Numerical algorithms for parametric optimization

39 / 72

slide-40
SLIDE 40

Refresher: differential and gradient (I)

Definition 1.

Let (X, || · ||X) be a Banach space. A real-valued function F : X → R is differentiable at u ∈ X if there exists a linear, continuous mapping F′(u) : X → R such that:

    F(u + v) = F(u) + F′(u)(v) + o(||v||X), where o(||v||X)/||v||X → 0 as v → 0.

The linear mapping F′(u) ∈ X∗ is the differential, or Fréchet derivative, of F at u.

Definition 2.

If in addition X is a Hilbert space (H, ⟨·, ·⟩H), the Riesz representation theorem allows one to identify the derivative F′(u) with an element ∇F(u) ∈ H:

    ∀v ∈ H, F′(u)(v) = ⟨∇F(u), v⟩H;

∇F(u) is called the gradient of F at u.

40 / 72

slide-41
SLIDE 41

Refresher: differential and gradient (II)

Physical interpretation: If F is differentiable at u ∈ H, it holds, for “small” τ > 0 and any û ∈ H with ||û||H ≤ 1:

    F(u + τû) ≈ F(u) + τ⟨∇F(u), û⟩H ≤ F(u) + τ||∇F(u)||H,

where equality holds if and only if û = ∇F(u)/||∇F(u)||H (Cauchy-Schwarz inequality).

⇒ ∇F(u) (resp. −∇F(u)) is the best ascent (resp. descent) direction for F from u.

[Figure: some isolines of a function F : R2 → R and the gradient ∇F(u) ∈ R2 at some point u ∈ R2.]

41 / 72

slide-42
SLIDE 42

The gradient algorithm (I)

In a Hilbert space H, we consider the unconstrained minimization problem:

    min_{h ∈ H} J(h),

where J(h) is a differentiable function.

Initialization: Start from an initial design h0.
For n = 0, ... until convergence:
  ❶ Calculate the derivative J′(hn) of J at hn and the gradient ∇J(hn) ∈ H; infer a descent direction ĥn = −∇J(hn).
  ❷ Take a suitable small time step τn > 0 such that: J(hn + τn ĥn) < J(hn).
  ❸ The new iterate is hn+1 = hn + τn ĥn.
Return: hn.
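The gradient algorithm fits in a few lines of code. The sketch below is our own illustration on a toy quadratic (the halving rule used to select τn and all names are our assumptions; any step-size rule ensuring a decrease of J would do):

```python
import numpy as np

def gradient_descent(J, gradJ, h0, tau0=1.0, tol=1e-8, maxit=500):
    """Steepest descent: the step tau is halved until J decreases (step 2)."""
    h = h0.copy()
    for _ in range(maxit):
        g = gradJ(h)
        if np.linalg.norm(g) < tol:      # stationary point reached
            break
        tau = tau0
        while J(h - tau * g) >= J(h):    # backtracking on the time step
            tau *= 0.5
            if tau < 1e-15:              # no decrease found: give up
                return h
        h = h - tau * g                  # descent update h_{n+1} = h_n - tau*g
    return h

# Toy example: J(h) = ||h - target||^2 / 2, so that grad J(h) = h - target
target = np.array([1.0, -2.0, 3.0])
h_opt = gradient_descent(lambda h: 0.5 * np.sum((h - target) ** 2),
                         lambda h: h - target,
                         np.zeros(3))
```

In the parametric optimization setting, `gradJ` would be evaluated with one state solve and one adjoint solve per iteration, as in Theorem 2.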

42 / 72

slide-43
SLIDE 43

The gradient algorithm (II)

[Figure: iterates hn, hn+1 = hn − τn∇J(hn), hn+2 plotted on the isolines of J in H.]

The gradient algorithm proceeds by successive steps in the negative direction of the gradient of J(h).

43 / 72

SLIDE 44

The augmented Lagrangian algorithm (I) [NoWri]

  • Let us now consider the equality-constrained problem

min_{h ∈ H} J(h) s.t. C(h) = 0,

where J : H → R and C : H → R are differentiable.

  • One possibility is to replace this problem with the unconstrained one:

min_{h ∈ H} J(h) + ℓC(h),

where J(h) is penalized by the constraint C(h), using a fixed weight ℓ > 0.

  • In practice, the “suitable” value ℓ∗ for ℓ, i.e. that driving the optimization process to the desired level of constraint C(h) = 0, is estimated after a few trials and errors.

  • This value ℓ∗ can be interpreted as the Lagrange multiplier associated with the constraint C(h) = 0 at the obtained local minimum.

44 / 72

SLIDE 45

The augmented Lagrangian algorithm (II)

The augmented Lagrangian algorithm reduces the resolution of a constrained optimization problem to a series of unconstrained ones, with updated parameters.

Initialization: Start from an initial design h0 and initial parameters ℓ0, b0.
For n = 0, . . . until convergence:
❶ Solve the unconstrained optimization problem:

min_{h ∈ H} J(h) + ℓnC(h) + (bn/2) C(h)²,

starting from hn, to obtain hn+1.
❷ Update the optimization parameters via:

ℓn+1 = ℓn + bnC(hn+1), and bn+1 = αbn if bn < bmax, bn otherwise.
  • ℓn and bn are updated so that the constraint C(h) = 0 holds at convergence;
  • ℓn converges to the optimal Lagrange multiplier for the constraint C(h) = 0;
  • bn is a weight for the quadratic penalization of the constraint function C(h).
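The loop above can be exercised on a toy problem. The following sketch (all numerical values are illustrative choices, not prescribed by the slides) takes H = R², J(h) = ||h − a||² and a single constraint C(h) = h₁ + h₂ − 1, with plain gradient descent as the inner solver:

```python
import numpy as np

# Toy instance of the augmented Lagrangian loop: minimize J(h) = ||h - a||^2
# subject to C(h) = h[0] + h[1] - 1 = 0.  The values of a, b0, alpha, b_max
# and the iteration counts are illustrative.
a = np.array([1.0, 1.0])

def J(h):      return float(np.sum((h - a) ** 2))
def gradJ(h):  return 2.0 * (h - a)
def C(h):      return float(h[0] + h[1] - 1.0)
gradC = np.ones(2)

def solve_unconstrained(h, ell, b, iters=500):
    """Gradient descent on L(h) = J(h) + ell*C(h) + (b/2)*C(h)^2 (step 1)."""
    tau = 1.0 / (2.0 + 2.0 * b)   # safe step for this quadratic toy problem
    for _ in range(iters):
        h = h - tau * (gradJ(h) + (ell + b * C(h)) * gradC)
    return h

h, ell, b = np.zeros(2), 0.0, 1.0
alpha, b_max = 2.0, 100.0
for n in range(20):
    h = solve_unconstrained(h, ell, b)       # step 1: inner minimization
    ell = ell + b * C(h)                     # step 2: multiplier update
    b = alpha * b if b < b_max else b        #         penalty update

# The iterates approach h = (0.5, 0.5), and ell approaches the Lagrange
# multiplier of the constraint (here, 1).
print(h, ell)
```

At convergence the constraint is satisfied and ℓn has stabilized at the Lagrange multiplier, as announced on the slide.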

45 / 72

SLIDE 46

The augmented Lagrangian algorithm (III)

The following “pragmatic” version involves fewer (costly) evaluations of J(h), C(h) and of the derivatives J′(h), C′(h).

Initialization: Start from an initial design h0 and initial parameters ℓ0, b0.
For n = 0, . . . until convergence:
❶ Calculate a descent direction δhn for the functional

h ↦ L(h, ℓn, bn) := J(h) + ℓnC(h) + (bn/2) C(h)².

❷ Select a suitably small time step τn so that: L(hn + τn δhn, ℓn, bn) < L(hn, ℓn, bn).
❸ Update the design via: hn+1 = hn + τn δhn.
❹ Update the optimization parameters via:

ℓn+1 = ℓn + bnC(hn+1), and bn+1 = αbn if bn < bmax, bn otherwise.

46 / 72

SLIDE 47

Part II: Optimal control and parametric optimization problems

1 Parametric optimization problems
2 Numerical algorithms
  • A refresher about the finite element method
  • A refresher about basic optimization methods
  • Numerical algorithms for parametric optimization

47 / 72

SLIDE 48

Numerical algorithms (I)

We solve the optimization problem:

min_{h ∈ Uad} J(h), where J(h) = ∫_D j(uh) dx + ℓ ∫_D h dx; in there:

  • The set Uad is: Uad = {h ∈ L∞(D), α ≤ h(x) ≤ β a.e. x ∈ D};
  • A constraint on the high values of h is added by a fixed penalization.

A basic projected gradient algorithm then reads:
Initialization: Start from an initial design h0.
For n = 0, . . . until convergence:
❶ Calculate the state uhn and the adjoint phn at h = hn;
❷ Calculate the descent direction δhn = −∇uhn · ∇phn − ℓ;
❸ Select an appropriate time step τn > 0;
❹ Update the design as: hn+1 = min(β, max(α, hn + τn δhn)).
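A one-dimensional analogue of this projected gradient loop can be sketched with finite differences in place of finite elements. The interval (0,1), all parameter values, and the discretization choices below are illustrative assumptions; for this self-adjoint objective the adjoint reduces to p = −u:

```python
import numpy as np

# 1D analogue of the radiator-type problem: minimize J(h) = ∫u dx + ell*∫h dx,
# where -(h u')' = 1 on (0,1), u(0) = u(1) = 0, and alpha <= h <= beta.
# All numerical values are illustrative.
N, alpha, beta, ell, tau = 100, 0.1, 1.0, 0.5, 0.1
dx = 1.0 / N

def solve_state(h):
    """Solve -(h u')' = 1 with homogeneous Dirichlet conditions (tridiagonal)."""
    main = (h[:-1] + h[1:]) / dx**2          # N-1 interior nodes
    off = -h[1:-1] / dx**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, np.ones(N - 1))

def J(h, u):
    return np.sum(u) * dx + ell * np.sum(h) * dx

h = 0.5 * np.ones(N)                         # h lives on the N mesh cells
for n in range(200):
    u = solve_state(h)                       # step 1: state ...
    p = -u                                   # ... and adjoint (here p = -u)
    du = np.diff(np.concatenate(([0.0], u, [0.0]))) / dx   # u' on cells
    dp = -du                                 # p' on cells
    dh = -du * dp - ell                      # step 2: descent direction
    h = np.clip(h + tau * dh, alpha, beta)   # steps 3-4: step + projection

u = solve_state(h)
print(J(h, u))
```

The final objective is lower than at the uniform initial guess, and the conductivity concentrates where the temperature gradient is large, mimicking the behavior discussed on the following slides.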

48 / 72

SLIDE 49

Numerical algorithms (II)

In practice,

  • The domain D is equipped with a fixed mesh T , composed e.g. of triangles.
  • The optimized conductivity h is discretized on this mesh, e.g. as a P0 or P1 finite element function.
  • For a given value of h, the solutions uh and ph to the state and adjoint equations are calculated by the finite element method on the mesh T .

49 / 72

SLIDE 50

One first example: the optimal radiator (I)

We consider the problem:

min_{h ∈ Uad} J(h), where J(h) = ∫_D uh dx + ℓ ∫_D h dx,

where the temperature uh ∈ H^1_0(D) is the solution to:

−div(h∇uh) = 1 in D, uh = 0 on ∂D.

In other terms,

  • The mean temperature inside D is minimized;
  • A constraint on the high values of the conductivity is added by a fixed penalization of the objective function.

50 / 72

SLIDE 51

One first example: the optimal radiator (II)

Optimized density in the thermal radiator problem.

51 / 72

SLIDE 52

One first example: the optimal radiator (III)

  • This oscillatory behavior is actually not surprising: the algorithm tries to reproduce the “homogenized” behavior of solutions.
  • It is however highly undesirable in practice.
  • One remedy consists in acting on the selected descent direction by changing inner products, a general idea which serves many other purposes.
  • Other solutions are presented later in the course.

52 / 72

SLIDE 53

Changing inner products (I)

  • By definition of the Fréchet derivative, the following expansion holds:

J(h + τ δh) = J(h) + τ J′(h)(δh) + o(τ),

and a descent direction for J from h is any δh ∈ L∞(D) such that J′(h)(δh) < 0.

  • The formula for the derivative

J′(h)(δh) = ∫_D δh ∇uh · ∇ph dx

makes it very natural to take as a descent direction

δh = −∇uh · ∇ph,

i.e. minus the gradient associated with the differential J′(h) via the L2(D) duality pairing.

  • This choice is actually awkward: ∇uh and ∇ph are not very regular, and neither is δh. In the theoretical framework, δh does not even belong to L∞(D)!

  • Other, better adapted choices of a descent direction are possible, namely gradients of J′(h) obtained with inner products other than that of L2(D).

53 / 72

SLIDE 54

Changing inner products (II)

Let H be a Hilbert space with inner product ⟨·, ·⟩H. Solve the following identification problem:

Search for V ∈ H such that: ∀w ∈ H, ⟨V, w⟩H = J′(h)(w) = ∫_D w ∇uh · ∇ph dx.

Then −V is also a descent direction for J(h), since for τ > 0 small enough:

J(h − τV) = J(h) − τ J′(h)(V) + o(τ) = J(h) − τ ⟨V, V⟩H + o(τ) < J(h).

Example: A descent direction which is more regular than that supplied by the L2(D) inner product is obtained with the choice:

H = H1(D), and ⟨u, v⟩H = ∫_D (α²∇u · ∇v + uv) dx,

for α “small” (of the order of the mesh size).
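In one dimension, this identification problem amounts to solving −α²V″ + V = g, where g is the rough L2(D) gradient. A finite-difference sketch (illustrative values; homogeneous Neumann end conditions are an assumption):

```python
import numpy as np

# Riesz identification of a smoother descent direction: solve
# -alpha^2 V'' + V = g, the strong form of the H1 inner-product problem
# of the slide, for a rough right-hand side g.  Values are illustrative.
N, alpha = 200, 0.05
dx = 1.0 / N
x = (np.arange(N) + 0.5) * dx
rng = np.random.default_rng(0)
g = np.sign(np.sin(20 * np.pi * x)) + 0.2 * rng.standard_normal(N)  # rough "gradient"

# Assemble (I + alpha^2 K) V = g with Neumann ends.
main = np.full(N, 1.0 + 2.0 * alpha**2 / dx**2)
main[0] = main[-1] = 1.0 + alpha**2 / dx**2
off = np.full(N - 1, -alpha**2 / dx**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
V = np.linalg.solve(A, g)

# V keeps the large-scale content of g but with much less oscillation.
print(np.linalg.norm(np.diff(V)), np.linalg.norm(np.diff(g)))
```

The smoothed direction V has the same mean as g but a much smaller oscillation, which is precisely the regularizing effect illustrated on the next slide.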

54 / 72

SLIDE 55

The optimal radiator again

Optimized density for the thermal radiator problem using the “change of inner product” trick.

55 / 72

SLIDE 56

Another example: design of a “heat lens” (I)

As proposed in [Che], the problem

min J(h), where J(h) = ∫_ω |∂uh/∂x1|² dx + ℓ ∫_D h dx,

is considered:

  • The horizontal heat flux through a non-optimizable region ω is minimized;
  • A penalization on high values of the conductivity h is added.

[Figure: the domain D, with boundaries Γin, Γout and the non-optimizable region ω.]

56 / 72

SLIDE 57

Another example: design of a “heat lens” (II)

Optimized heat lens under a penalization of high values of the conductivity.

57 / 72

SLIDE 58

Remarks

  • The above strategy to impose a constraint on the amount of high conductivity material is very crude. Other constrained optimization algorithms may be used, such as the augmented Lagrangian algorithm.

  • This parametric optimization framework lends itself to the use of:
  • Quasi-Newton methods, such as the Gauss–Newton or the BFGS algorithms;
  • “True” second-order algorithms, based on the Hessian of the mapping h ↦ J(h).

  • Density-based methods for topology optimization problems often rely on an adaptation of this parametric framework.

58 / 72

SLIDE 59

Technical appendix

59 / 72

SLIDE 60

The Lax–Milgram theorem

In a Hilbert space H, let a : H × H → R be a bilinear form and ℓ : H → R be a linear form such that:

  • a is continuous, i.e. there exists M > 0 such that:

∀u, v ∈ H, |a(u, v)| ≤ M ||u||H ||v||H.

  • a is coercive, i.e. there exists α > 0 such that:

∀u ∈ H, α ||u||²H ≤ a(u, u).

  • ℓ is continuous (i.e. ℓ belongs to the dual space H∗):

||ℓ||H∗ := sup_{v ∈ H, v ≠ 0} |ℓ(v)| / ||v||H < ∞.

Theorem 3.

Under the above hypotheses, the variational problem

Search for u ∈ H s.t. for all v ∈ H, a(u, v) = ℓ(v)

has a unique solution u ∈ H, which depends continuously on ℓ: ||u||H ≤ (1/α) ||ℓ||H∗.
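A discrete sanity check of the statement: the sketch below (illustrative; P1 finite elements on (0,1)) takes a(u, v) = ∫ u′v′ dx on H^1_0, which is itself the chosen inner product, so M = α = 1 and the stability bound holds with equality for the Riesz representative.

```python
import numpy as np

# Galerkin solve of a(u,v) = ℓ(v) with a(u,v) = ∫ u'v' dx on H^1_0(0,1)
# and ℓ(v) = ∫ v dx, using P1 finite elements on a uniform mesh.
# Since a is the inner product itself, ||u||_H = ||ℓ||_{H*} here.
N = 50
dx = 1.0 / N
# Stiffness matrix (both the bilinear form and the inner-product Gram matrix)
K = (np.diag(np.full(N - 1, 2.0)) + np.diag(np.full(N - 2, -1.0), 1)
     + np.diag(np.full(N - 2, -1.0), -1)) / dx
f = np.full(N - 1, dx)           # ℓ(hat function) = ∫ φ_i dx = dx
u = np.linalg.solve(K, f)        # unique Galerkin solution of a(u,v) = ℓ(v)
norm_u = np.sqrt(u @ K @ u)      # ||u||_H = sqrt(a(u,u))
norm_ell = np.sqrt(f @ np.linalg.solve(K, f))   # dual norm via Riesz representative
print(norm_u, norm_ell)
```

Here u is itself the Riesz representative of ℓ, so the two norms coincide, consistent with the bound ||u||H ≤ (1/α)||ℓ||H∗ when α = 1.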

60 / 72

SLIDE 61

Fréchet and Gateaux derivatives

Several notions of derivative are available for a function F : U → V between two normed vector spaces (U, || · ||U) and (V , || · ||V ).

Definition 3 (Fréchet differentiability).

  • A function F : U → V is called Fréchet differentiable at some point x ∈ U if there exists a linear, continuous mapping Lx : U → V such that:

F(x + v) = F(x) + Lx(v) + o(||v||U), where ||o(||v||U)||V / ||v||U → 0 as v → 0.

  • The mapping v ↦ Lx(v) is denoted by v ↦ F′(x)(v), or dxF(v), and is called the differential or the Fréchet derivative of F at x.

  • The function F : U → V is called Gateaux differentiable at x ∈ U if for any direction v ∈ U, the following limit exists:

lim_{t→0, t>0} (F(x + tv) − F(x)) / t.

Remark: The notion of Fréchet differentiability is stronger than that of Gateaux differentiability, which is a generalization of directional differentiability.

61 / 72

SLIDE 62

Fréchet derivatives: the “chain rule”

The chain rule is a fundamental result, which supplies the Fréchet derivative of the composite G ◦ F of two functions F : U → V and G : V → W between three normed vector spaces (U, || · ||U), (V , || · ||V ) and (W , || · ||W ).

Theorem 4 (Chain rule).

Let x ∈ U be a point such that:

  • F is Fréchet differentiable at x;
  • G is Fréchet differentiable at F(x) ∈ V .

Then, the composite function G ◦ F : U → W is Fréchet differentiable at x, and its Fréchet derivative v ↦ (G ◦ F)′(x)(v) is the linear mapping defined by:

∀v ∈ U, (G ◦ F)′(x)(v) = G′(F(x))(F′(x)(v)).

62 / 72

SLIDE 63

The implicit function theorem

The implicit function theorem is a key result, ensuring the existence and smoothness of a solution u = uθ to a parametrized, nonlinear equation of the form:

F(θ, u) = 0,

where u is the unknown and θ is a “parameter”; see [La], Chap. I, Th. 5.9.

Theorem 5 (Implicit function theorem).

Let Θ, E, G be Banach spaces, let V ⊂ Θ and U ⊂ E be open sets, and let F : V × U → G be a function of class Cp for p ≥ 1. Let (θ0, u0) ∈ V × U be such that F(θ0, u0) = 0, and assume that:

The differential duF(θ0, u0) : E → G is a linear isomorphism.

Then there exist an open neighborhood V′ ⊂ V of θ0 in Θ and a mapping g : V′ → U of class Cp satisfying the properties:
❶ g(θ0) = u0;
❷ For all θ ∈ V′, the equation F(θ, u) = 0 has a unique solution u ∈ U, given by u = g(θ).

63 / 72

SLIDE 64

First-order necessary optimality conditions (I)

Let H be a Hilbert space, and let J : H → R be a differentiable function; we consider the unconstrained minimization problem:

min_{u ∈ H} J(u). (UC)

Definition 4.

A point u ∈ H is a local minimizer for (UC) if there exists an open neighborhood V ⊂ H containing u such that: ∀v ∈ V, J(u) ≤ J(v).

Theorem 6.

Let u be a local minimizer for (UC); then: ∇J(u) = 0.

64 / 72

SLIDE 65

First-order necessary optimality conditions (II)

Proof: Let h ∈ H be given; by the definition of u, it holds for t > 0 small enough:

J(u + th) ≥ J(u), and so (J(u + th) − J(u)) / t ≥ 0.

Letting t → 0, the differentiability of J yields: J′(u)(h) = ⟨∇J(u), h⟩ ≥ 0. Replacing h by −h in the previous argument yields the converse inequality ⟨∇J(u), h⟩ ≤ 0, which completes the proof.

Remark: The above proof uses in a crucial way that the point u in (UC) minimizes J(v) (locally) in any direction h ∈ H.

[Figure: a neighborhood V of the minimizer u, with perturbed points u + th and u − th along a direction h.]

65 / 72

SLIDE 66

First-order necessary optimality conditions (III)

Let H be a Hilbert space, and let J : H → R and C : H → Rp be differentiable functions; we consider the equality-constrained minimization problem:

min_{h ∈ H} J(h) s.t. C(h) = 0. (EC)

Definition 5.

A point u ∈ H is a local minimizer for (EC) if there exists an open neighborhood V ⊂ H containing u such that: ∀v ∈ V s.t. C(v) = 0, J(u) ≤ J(v).

Theorem 7 (First-order necessary optimality conditions).

Let u be a local minimizer for (EC), and assume that the gradients ∇C1(u), . . . , ∇Cp(u) are linearly independent. Then there exist Lagrange multipliers λ1, . . . , λp ∈ R such that:

∇J(u) + ∑_{i=1}^{p} λi ∇Ci(u) = 0.
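The optimality condition can be verified by hand on a toy instance (illustrative: H = R², p = 1, J(u) = ||u − a||², C(u) = u₁ + u₂ − 1, whose minimizer is the orthogonal projection of a onto the constraint line):

```python
import numpy as np

# Check gradJ(u) + lam * gradC(u) = 0 at the constrained minimizer of
# J(u) = ||u - a||^2 subject to C(u) = u[0] + u[1] - 1 = 0.
a = np.array([1.0, 1.0])
# Closed-form minimizer: orthogonal projection of a onto the line u0 + u1 = 1.
u = a - (np.sum(a) - 1.0) / 2.0 * np.ones(2)
gradJ = 2.0 * (u - a)
gradC = np.ones(2)
lam = -gradJ[0] / gradC[0]             # solve the first component for lambda
residual = gradJ + lam * gradC         # should vanish at the minimizer
print(u, lam, residual)
```

The residual vanishes identically: at u = (0.5, 0.5) the gradient of J is a multiple of the single constraint gradient, with λ = 1.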

66 / 72

SLIDE 67

First-order necessary optimality conditions (IV)

Hint of proof:

  • The local optimality of u no longer implies that, for arbitrary h ∈ H and t small enough, J(u + th) ≥ J(u).

  • Such an inequality can only be written with directions h in the admissible space:

K(u) := {h ∈ H : there exist ε > 0 and a curve γ : [−ε, ε] → H s.t. γ(0) = u, γ′(0) = h and C(γ(t)) = 0 for all t}.

  • K(u) is a vector space, which rewrites, using the implicit function theorem:

K(u) = ∩_{i=1}^{p} {∇Ci(u)}⊥.

[Figure: the constraint set {v, C(v) = 0}, a curve γ(t) through u with tangent vector h, and the normal vector ∇C(u).]

67 / 72

SLIDE 68

First-order necessary optimality conditions (V)

  • For any h ∈ K(u), introducing a curve γ(t) with the above properties:

J(γ(t)) ≥ J(u), and so (J(γ(t)) − J(u)) / t ≥ 0 for t > 0.

Taking limits, it follows that ⟨∇J(u), h⟩ ≥ 0. Since K(u) is a vector space, the same argument applies to −h, and so: ⟨∇J(u), h⟩ = 0.

  • Hence, we have proved that:

∀h ∈ K(u), ⟨∇J(u), h⟩ = 0, that is, ∇J(u) ∈ (∩_{i=1}^{p} {∇Ci(u)}⊥)⊥.

  • Finally, using the general fact that, for arbitrary subsets A1, . . . , Ap ⊂ H,

(span {Ai, i = 1, . . . , p})⊥ = ∩_{i=1}^{p} Ai⊥,

the desired result follows.

68 / 72

SLIDE 69

First-order necessary optimality conditions (VI)

Interpretation (when p = 1): The above optimality condition implies that:

  • Either ∇J(u) = 0, which is the necessary first-order optimality condition for u to be an unconstrained minimizer of J(v).

  • Or λ ≠ 0, and so:

∇C(u) = −(1/λ) ∇J(u).

  • “At first order”, a direction h ∈ H such that J(u + th) < J(u) for small t > 0 has a non-zero coordinate along ∇J(u):

h = α∇J(u) + v, where v ⊥ ∇J(u), α < 0.

  • Alternatively, h rewrites:

h = β∇C(u) + w, where w ⊥ ∇C(u), β ≠ 0.

  • Hence, C(u + th) ≠ 0, so that u + th is not an admissible point in (EC).


Illustration when H = R², p = 1, and J is an affine function whose isolines are depicted. At a local optimum u of (EC), ∇J(u) and ∇C(u) are aligned.

69 / 72

SLIDE 70

Bibliography

70 / 72

SLIDE 71

References I

[All] G. Allaire, Conception optimale de structures, Mathématiques & Applications, 58, Springer Verlag, Heidelberg (2006).
[All2] G. Allaire, Shape optimization by the homogenization method, Springer Verlag, (2012).
[All] G. Allaire, Analyse Numérique et Optimisation, Éditions de l'École Polytechnique, (2012).
[Bre] H. Brezis, Functional analysis, Sobolev spaces and partial differential equations, Springer Science & Business Media, (2010).
[Che] A. Cherkaev, Variational methods for structural optimization, vol. 140, Springer Science & Business Media, (2012).
[ErnGue] A. Ern and J.-L. Guermond, Theory and Practice of Finite Elements, Springer, (2004).
[FreyGeo] P.J. Frey and P.L. George, Mesh Generation: Application to Finite Elements, Wiley, 2nd Edition, (2008).

71 / 72

SLIDE 72

References II

[HenPi] A. Henrot and M. Pierre, Variation et optimisation de formes, une analyse géométrique, Mathématiques et Applications 48, Springer, Heidelberg (2005).
[La] S. Lang, Fundamentals of differential geometry, Springer, (1991).
[NoWri] J. Nocedal and S.J. Wright, Numerical Optimization, Springer Science, (1999).
[Mu] F. Murat, Contre-exemples pour divers problèmes où le contrôle intervient dans les coefficients, Annali di Matematica Pura ed Applicata, 112, 1, (1977), pp. 49–68.

72 / 72