

SLIDE 1

General philosophy References Randomly forced Navier–Stokes equations

Entropy production in viscous fluid flows

Armen Shirikyan
Coauthors: V. Jakšić, V. Nersesyan, C.-A. Pillet

Quantissima in the Serenissima III Venice, Italy August 19-23, 2019

SLIDE 2

Outline

General philosophy
  • Finite state Markov processes
  • Summary of the programme
References
Randomly forced Navier–Stokes equations
  • Large deviation principle
  • Lagrangian formulation, LDP and entropy production

SLIDE 3

Perron–Frobenius theorem

Let ℓ ≥ 2 be an integer and let A = [[1, ℓ]]. We consider a stochastic matrix P = (p_ij)_{1 ≤ i,j ≤ ℓ}, that is, a matrix whose entries p_ij are non-negative and Σ_j p_ij = 1 for any i ∈ A.

  • Recurrence. There are integers m ≥ 1 and j_0 ∈ A such that p^(m)_{i j_0} > 0 for any i ∈ A, where p^(m)_{ij} are the entries of P^m.

Theorem

If the recurrence hypothesis holds, then P* has exactly one eigenvector µ ∈ R^ℓ_+ with eigenvalue 1, and all other eigenvalues lie in the interior of the unit disc centred at zero. In other words, the Markov process in A with transition matrix P has a unique stationary distribution µ.

We shall denote by P ∈ P(A^Z) the canonical process and by P̄ ∈ P(A^Z) its time reversal. We wish to compare P and P̄.
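As a quick numerical companion to the theorem, the sketch below computes the stationary distribution µ of a small stochastic matrix by power iteration. The matrix P, the 0-based indexing (the slides use A = [[1, ℓ]]), and the iteration count are illustrative assumptions, not taken from the talk.

```python
import numpy as np

# Illustrative stochastic matrix (assumed example): rows sum to 1, and the
# recurrence hypothesis holds with m = 1, j0 = 0, since the first column
# of P is strictly positive.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.3, 0.4, 0.3],
    [0.2, 0.0, 0.8],
])

def stationary_distribution(P, n_iter=10_000):
    """Approximate the unique mu with mu @ P = mu by power iteration."""
    mu = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(n_iter):
        mu = mu @ P
    return mu

mu = stationary_distribution(P)
```

Because all other eigenvalues of P lie strictly inside the unit disc, the iteration converges geometrically to µ.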

SLIDE 4

Equivalence

  • Symmetry. If p_ij > 0 for a pair i, j ∈ A, then p_ji > 0.

Let us define the spaces

  Ω_t = A^{[[0,t]]} = {u^t = (u_0, . . . , u_t) : u_j ∈ A for 0 ≤ j ≤ t},

endowed with the (finite) algebra of all subsets, and denote by P_t and P̄_t the projections of P and P̄ to Ω_t. Under the symmetry hypothesis, the measures P_t and P̄_t are equivalent, and we write

  Δ_t := dP_t/dP̄_t,   Δ_t(u^t) = (µ_{u_0}/µ_{u_t}) exp( Σ_{l=1}^{t} σ(u_{l−1}, u_l) ),

where σ(u, v) stands for the entropy production:

  σ(u, v) = log(p_uv/p_vu) if p_uv > 0,   σ(u, v) = −∞ if p_uv = 0. (1)
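The density Δ_t can be checked directly on a finite chain: since P̄_t assigns to a path the P_t-probability of the reversed path, Δ_t must coincide with the ratio of the two path probabilities. A sketch with an assumed all-positive matrix (so the symmetry hypothesis holds trivially):

```python
import numpy as np

# Assumed illustrative matrix with all entries positive.
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.2, 0.4],
    [0.1, 0.6, 0.3],
])

mu = np.full(3, 1.0 / 3)
for _ in range(5_000):          # power iteration: mu @ P -> stationary mu
    mu = mu @ P

def sigma(i, j):
    """Entropy production sigma(i, j) = log(p_ij / p_ji)."""
    return np.log(P[i, j] / P[j, i])

def path_prob(path):
    """Probability of `path` under the stationary Markov measure."""
    prob = mu[path[0]]
    for a, b in zip(path, path[1:]):
        prob *= P[a, b]
    return prob

def delta(path):
    """Delta_t(u^t) = (mu_{u0} / mu_{ut}) * exp(sum of sigma along the path)."""
    s = sum(sigma(a, b) for a, b in zip(path, path[1:]))
    return mu[path[0]] / mu[path[-1]] * np.exp(s)

path = [0, 1, 2, 2, 0, 1]
```

Reversing the path with `path[::-1]` gives the P̄_t-probability, so `delta(path)` equals `path_prob(path) / path_prob(path[::-1])`.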

SLIDE 5

Reversibility

Theorem

Either the measures P and P̄ coincide, or they are mutually singular. The former holds if and only if one of the following equivalent properties is satisfied:

  • Zero mean entropy production: σ̄ := ∫_{A×A} σ(a_0, a_1) dP_1(a_0, a_1) = 0.
  • Detailed balance: µ_i p_ij = µ_j p_ji for all i, j ∈ A.

In this situation, the forward and backward evolutions are indistinguishable from the probabilistic point of view. Our aim is to study the case when this is not true.
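For a finite chain both criteria are elementary to verify. The sketch below uses an assumed birth-death matrix, for which detailed balance holds and the mean entropy production vanishes; terms with p_ij = 0 carry zero weight and are skipped.

```python
import numpy as np

# Assumed example: a birth-death chain, which is always reversible.
P = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.50, 0.25],
    [0.00, 0.50, 0.50],
])
mu = np.array([0.25, 0.50, 0.25])   # solves mu @ P = mu

# Detailed balance: mu_i p_ij = mu_j p_ji for all i, j.
flux = mu[:, None] * P
detailed_balance = np.allclose(flux, flux.T)

# Mean entropy production sigma_bar = sum_ij mu_i p_ij log(p_ij / p_ji),
# skipping the zero-probability transitions.
sigma_bar = sum(
    mu[i] * P[i, j] * np.log(P[i, j] / P[j, i])
    for i in range(3) for j in range(3) if P[i, j] > 0
)
```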

SLIDE 6

Large deviation principle

  • Irreducibility. There is m ≥ 1 such that p^(m)_{ij} > 0 for all i, j ∈ A.

Let us consider the empirical measures

  ν_t = t^{−1} Σ_{k=0}^{t−1} δ_{u^k},   t ≥ 1,

where u^k = (u_j)_{j≥k} ∈ 𝐀 := A^{Z+} is a (random) trajectory of a Markov process with transition matrix P.

Theorem

The sequence {ν_t} considered under the law P satisfies the large deviation principle (LDP). That is, there is a lower semicontinuous function I : P(𝐀) → [0, +∞] such that, for any Γ ∈ B(P(𝐀)),

  −I(Γ̇) ≤ lim inf_{t→∞} (1/t) log P{ν_t ∈ Γ} ≤ lim sup_{t→∞} (1/t) log P{ν_t ∈ Γ} ≤ −I(Γ̄),

where Γ̄ / Γ̇ denote the closure/interior of Γ ∈ B(P(𝐀)) and I(A) = inf_A I.
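Since the rate function vanishes only at the stationary Markov measure, ν_t concentrates on it; in particular the empirical state frequencies (the one-dimensional marginal of ν_t) converge to µ. A Monte Carlo sketch with an assumed positive matrix and a fixed seed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative matrix; all entries positive, hence irreducible.
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.2, 0.4],
    [0.1, 0.6, 0.3],
])
mu = np.full(3, 1.0 / 3)
for _ in range(5_000):          # power iteration for the stationary mu
    mu = mu @ P

# Simulate one long trajectory and record the empirical state frequencies,
# i.e. the marginal of nu_t on the zeroth coordinate.
t = 200_000
u = 0
counts = np.zeros(3)
for _ in range(t):
    counts[u] += 1
    u = rng.choice(3, p=P[u])
freq = counts / t
```

With t = 200 000 the frequencies agree with µ to within roughly 1/√t, up to the chain's mixing factor.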

SLIDE 7

Level-3 symmetry

The rate function I is infinite on measures that are not shift-invariant, and on stationary measures it has the form

  I(λ) = ∫_{A^{Z−}} Ent( λ(u; ·) | P(u_0, ·) ) λ(du),

where P(u, v) = p_uv and λ(u; ·) is the conditional measure given the past u = (u_j)_{j≤0} ∈ A^{Z−}.

Let us introduce the time reversal θ : P_s(𝐀) → P_s(𝐀) by the following rule: θ(λ) is the unique measure on 𝐀 whose projection to Ω_t coincides with λ̄_t for any t ≥ 1.

  • Positivity of transitions. All the entries of P are positive.

Theorem

The rate function I satisfies the level-3 fluctuation relation

  I(θ(λ)) = I(λ) + ep(λ) for any λ ∈ P_s(𝐀), (2)

where ep(λ) = ⟨σ, λ⟩.
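When λ is the stationary Markov measure itself, ⟨σ, λ⟩ reduces to the finite sum Σ_ij µ_i p_ij log(p_ij/p_ji). The sketch below evaluates it for an assumed "rotating" chain that strongly prefers the cycle 0 → 1 → 2 → 0, so the entropy production is strictly positive:

```python
import numpy as np

# Assumed non-reversible example: the chain prefers the cycle 0 -> 1 -> 2 -> 0.
# The matrix is doubly stochastic, so the stationary mu is uniform.
P = np.array([
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
    [0.8, 0.1, 0.1],
])
mu = np.full(3, 1.0 / 3)

# ep(lambda) = <sigma, lambda> = sum_ij mu_i p_ij log(p_ij / p_ji).
ep = sum(
    mu[i] * P[i, j] * np.log(P[i, j] / P[j, i])
    for i in range(3) for j in range(3)
)
```

By the cyclic symmetry of this particular matrix, ep = 0.7 log 8 ≈ 1.456 > 0, so the chain is far from detailed balance.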

SLIDE 8

LDP for the EP and the Gallavotti–Cohen symmetry

Let us define the entropy production over the interval [0, t] by

  σ_t(u) = Σ_{l=1}^{t} σ(u_{l−1}, u_l),   u = (u_l)_{l≥0} ∈ 𝐀. (3)

Theorem

Under the hypothesis of positivity of transitions, the time averages {t^{−1}σ_t} of the entropy production satisfy the LDP:

  t^{−1} log P{t^{−1}σ_t ∈ Γ} ∼ −inf_{r∈Γ} I(r) for any Γ ∈ B(R). (4)

Moreover, the corresponding rate function I is convex on R, finite on a symmetric interval around zero, and satisfies the Gallavotti–Cohen fluctuation relation

  I(−r) = I(r) + r for r ∈ R. (5)
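The symmetry (5) can be verified exactly on a finite chain via Legendre duality: it is equivalent to the relation λ(s) = λ(−1−s) for the scaled cumulant generating function of σ_t, which for a finite chain is the log spectral radius of the tilted matrix M(s)_ij = p_ij e^{s σ(i,j)}. Since M(−1−s) = M(s)^T, the two spectral radii coincide. A sketch with an assumed positive matrix:

```python
import numpy as np

# Assumed illustrative matrix with all entries positive.
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.2, 0.4],
    [0.1, 0.6, 0.3],
])

def scgf(s):
    """Log spectral radius of the tilted matrix
    M(s)_ij = p_ij * exp(s * sigma(i, j)) = p_ij * (p_ij / p_ji)**s."""
    M = P * (P / P.T) ** s
    return np.log(np.abs(np.linalg.eigvals(M)).max())

# Gallavotti-Cohen symmetry in Legendre-dual form: scgf(s) == scgf(-1 - s).
gc_gap = max(abs(scgf(s) - scgf(-1.0 - s)) for s in (-0.7, -0.3, 0.4, 1.2))
```

At s = 0 the tilted matrix is P itself, whose spectral radius is 1, so scgf(0) = 0, consistent with normalisation.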

SLIDE 9

Summary

Summarising the above discussion, we can single out the following questions arising in the study of entropic fluctuations:

(a) LDP for the occupation measures of trajectories.
(b) Level-3 fluctuation relation for the rate function.
(c) Well-posedness of the entropy production and its relation to the physical notion of transport.
(d) Law of large numbers for the time average of the entropy production.
(e) Strict positivity of the mean entropy production.
(f) Local and global LDP for the time average of the entropy production.

Each of these questions may be a difficult problem in itself, and most of them can be studied independently.

SLIDE 10

Some references

  • Evans–Searles (1994), Gallavotti–Cohen (1995): Fluctuation relation in numerical experiments and deterministic systems.
  • Kurchan (1998), Lebowitz–Spohn (1999): GC relation for some classes of stochastic dynamics.
  • Maes (1999): Interpretation of GC as a Gibbs property.
  • Eckmann–Pillet–Rey-Bellet (1999), Rey-Bellet–Thomas (2002): GC in chains of anharmonic oscillators.
  • Ruelle (1999), Gaspard (2005), Rondoni–Mejía-Monasterio (2007), Chetrite–Gawędzki (2008), Jakšić–Pillet–Rey-Bellet (2011): Review papers on various aspects of the fluctuation relation.
  • Baiesi–Maes (2005): Interpretation of the Navier–Stokes equations as a heat conduction network.
  • Bodineau–Lefevere (2008), Barato–Chetrite (2015), Cuneo–Jakšić–Pillet–AS (2017): Level-3 fluctuation relation.
  • Jakšić–Nersesyan–Pillet–AS (2015, 2018): Randomly forced PDEs.

SLIDE 11

Navier–Stokes equations with smooth forcing

Let us consider the Navier–Stokes system on T²:

  ∂_t u + ⟨u, ∇⟩u − ν∆u + ∇p = η(t, x),   div u = 0. (6)

The noise is assumed to be smooth in x, while its dependence on time should be such that the family of solutions of (6) forms a Markov process. Under some non-degeneracy hypotheses, the latter has a unique stationary measure, and our goal is to study entropic fluctuations for the corresponding stationary trajectory. The validity of the level-3 LDP was proved for various types of random perturbations, and in the discrete-time setting the rate function is given by the Donsker–Varadhan entropy formula:

  I(λ) = ∫_𝐇 Ent( λ(u; ·) | P(u_0, ·) ) λ(du), (7)

where 𝐇 = H^{Z−} and P(u, dv) is the time-1 transition function.

SLIDE 12

Lagrangian formulation

Let us consider the ODE

  ẏ = u(t, y),   y(t) ∈ T², (8)

where u(t, x) is a stationary (in the probabilistic sense) solution of the Navier–Stokes system with random forcing:

  ∂_t u + ⟨u, ∇⟩u − ν∆u + ∇p = η(t, x),   div u = 0. (9)

We assume that η is a bounded random process, smooth in both variables and piecewise independent:

  η(t, x) = Σ_{k=1}^{∞} I_{[k−1,k)}(t) η_k(t − k + 1, x), (10)

where {η_k} are i.i.d. random variables in L²([0, 1] × T²). We assume in addition that their mean value in x is zero.

SLIDE 13

Decomposability hypothesis

We assume that the law ℓ of η_k is decomposable:

(D) There is an orthonormal basis {e_j(t, x)} in L²([0, 1] × T²) consisting of smooth functions such that

  η_k(t, x) = Σ_{j=1}^{∞} b_j ξ_jk e_j(t, x), (11)

where {b_j} is a sequence of non-zero numbers going to zero sufficiently fast, and ξ_jk are independent random variables whose laws are smooth, supported in [−1, 1], and positive on some interval [−δ, δ].

This hypothesis ensures that the Navier–Stokes system (9) has a unique stationary distribution µ. In the following theorem, we fix an arbitrary stationary solution u(t, x) of (9).
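A minimal sampling sketch for one block η_k of the decomposition (11), on a grid and in one space dimension for brevity. Everything concrete here is an assumed placeholder: the modes e_j are merely smooth functions with zero spatial mean (not a genuine orthonormal basis), the coefficients decay as b_j = j^{−2}, and the ξ_jk are uniform on [−1, 1].

```python
import numpy as np

rng = np.random.default_rng(1)

t_grid = np.linspace(0.0, 1.0, 64, endpoint=False)        # time in [0, 1)
x_grid = np.linspace(0.0, 2 * np.pi, 64, endpoint=False)  # one space circle
J = 20                                        # truncation of the sum over j
b = np.array([j ** -2.0 for j in range(1, J + 1)])   # assumed decay of b_j

def sample_eta():
    """One block eta_k(t, x) = sum_j b_j xi_jk e_j(t, x) on the grid,
    with placeholder modes e_j(t, x) = cos(t) sin(j x)."""
    xi = rng.uniform(-1.0, 1.0, size=J)   # bounded coefficients with smooth law
    eta = np.zeros((t_grid.size, x_grid.size))
    for j in range(1, J + 1):
        e_j = np.cos(t_grid)[:, None] * np.sin(j * x_grid)[None, :]
        eta += b[j - 1] * xi[j - 1] * e_j
    return eta

eta = sample_eta()
```

Each sample is bounded by Σ_j b_j and has zero mean in x, matching the boundedness and zero-mean assumptions on η.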

SLIDE 14

Large deviation principle

Theorem

There is a T²-valued random process {z_t, t ≥ 0} whose almost every trajectory satisfies (8), and the following assertions hold.

  • Stationarity. The process {z_t} is stationary.
  • Convergence. Let p ∈ T² be an arbitrary initial point and let y_t(p) be the corresponding trajectory of (8). Then, for any s ≥ 1, the law of (y_t(p), . . . , y_{t+s}(p)) converges exponentially fast in the total variation norm, as t → ∞, to that of (z_0, . . . , z_s).
  • Large deviations. For any p ∈ T², the empirical measures

  ν^p_t = t^{−1} Σ_{k=0}^{t−1} δ_{y^k(p)},   y^k(p) = (y_l(p), l ≥ k),

satisfy the LDP with some good rate function I : P((T²)^{Z+}) → [0, +∞].

SLIDE 15

Entropy production

We now assume that

  η(t, x) = a Σ_{j=1}^{∞} b_j ξ_jk e_j(t, x),   |a| ≥ 1.

Theorem

For any t ≥ 1, the law of (z_1, . . . , z_t) has a smooth density ρ_t(x_1, . . . , x_t) with respect to the Lebesgue measure on T^{2t}. Moreover, there is a_0 > 0 such that, for |a| ≥ a_0, the density ρ_t is strictly positive, and the entropy production

  σ_t(x^t) = log[ ρ_t(x_1, . . . , x_t) / ρ_t(x_t, . . . , x_1) ],   x^t := (x_1, . . . , x_t), (12)

satisfies the following inequality for any t ≥ 1:

  −C ≤ t^{−1} σ_t(x^t) ≤ C for all x^t ∈ T^{2t}.
