Decompositions of log-correlated fields with applications Eero Saksman (University of Helsinki) Based on joint work with Janne Junnila (EPFL Lausanne) and Christian Webb (Aalto University) CONFERENCE IN HONOR OF JOHN AND DON AND JOHN Seattle



Decompositions of log-correlated fields with applications

Eero Saksman (University of Helsinki) Based on joint work with Janne Junnila (EPFL Lausanne) and Christian Webb (Aalto University) CONFERENCE IN HONOR OF JOHN AND DON AND JOHN Seattle 23/08/19

Saksman Decompositions of log-correlated fields Seattle 23/08/19 1 / 25


GFF

  • A centered Gaussian field on a domain Ω ⊂ Rd can be thought of as a random function X : Ω → R such that all the evaluation vectors (X(z1), . . . , X(zn)) are centered multivariate Gaussians. Their statistical properties (for us) are determined by knowledge of the covariance function CX(z, z′) := E X(z)X(z′).

  • Example: Consider the centered Gaussian field X on the torus T := {|z| = 1} with the covariance structure KX(z, z′) = ”E X(z)X(z′)” = log(1/|z − z′|) for z, z′ ∈ T. Such random functions ”T ∋ z ↦ X(z)” form the Gaussian Free Field (GFF), restricted to T.

  • X takes values in the generalized functions – some care is needed!

  • Existence: Set GFF|T(θ) := ∑_{n=1}^∞ (1/√n) (An cos(nθ) + Bn sin(nθ)), where An, Bn are i.i.d. standard Gaussians.
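The Fourier series above can be sampled directly. A minimal sketch (the truncation level and the evaluation grid are illustrative choices, not from the talk):

```python
import math
import random

def sample_gff_torus(n_modes, rng):
    """Sample a truncation of GFF|_T(theta) = sum_n n^{-1/2} (A_n cos(n theta) + B_n sin(n theta)),
    with A_n, B_n i.i.d. standard Gaussians; returns the truncated field as a function of theta."""
    coeffs = [(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) for _ in range(n_modes)]

    def field(theta):
        return sum((a * math.cos(n * theta) + b * math.sin(n * theta)) / math.sqrt(n)
                   for n, (a, b) in enumerate(coeffs, start=1))

    return field

def truncated_covariance(delta, n_modes):
    # E X_N(theta) X_N(theta') depends only on delta = theta - theta':
    #   sum_{n=1}^{N} cos(n delta)/n  ->  -log(2 sin(|delta|/2))  as N -> infinity,
    # which equals log 1/|z - z'| since |e^{i theta} - e^{i theta'}| = 2 sin(|delta|/2).
    return sum(math.cos(n * delta) / n for n in range(1, n_modes + 1))

if __name__ == "__main__":
    rng = random.Random(0)
    X = sample_gff_torus(200, rng)
    print(X(0.0), truncated_covariance(1.0, 200), -math.log(2 * math.sin(0.5)))
```

The classical identity ∑_{n≥1} cos(nδ)/n = −log(2 sin(δ/2)) for δ ∈ (0, 2π) is exactly what makes this series log-correlated.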


GFF

  • The actual GFF is defined in a domain Ω ⊂ R2, and its covariance is given by GΩ(z, z′), where GΩ is the Green’s function of Ω ⇒ again a logarithmic singularity in the covariance.

  • Usually one considers the Green’s function with respect to zero boundary values. Then GFF(z) = ∑_{n≥1} An λn^{−1/2} ϕn(z), where λn, ϕn are the Dirichlet eigenvalues and eigenfunctions in Ω.

  • The GFF appears in several connections, e.g. as the scaling limit of the fluctuations of several random models of statistical physics.
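On the square (0, π)² the Dirichlet eigenfunctions are explicit, so the eigenfunction expansion can be sampled; a minimal sketch (the choice of the square domain and the truncation are mine, for illustration only):

```python
import math
import random

def sample_gff_square(max_freq, rng):
    """Truncated eigenfunction expansion GFF(z) = sum_n A_n lambda_n^{-1/2} phi_n(z)
    on the square (0, pi)^2, where phi_{j,k}(x, y) = (2/pi) sin(jx) sin(ky) are the
    L^2-normalised Dirichlet eigenfunctions and lambda_{j,k} = j^2 + k^2."""
    coeffs = {(j, k): rng.gauss(0.0, 1.0)
              for j in range(1, max_freq + 1) for k in range(1, max_freq + 1)}

    def field(x, y):
        return sum(a * (2.0 / math.pi) * math.sin(j * x) * math.sin(k * y)
                   / math.sqrt(j * j + k * k)
                   for (j, k), a in coeffs.items())

    return field

if __name__ == "__main__":
    rng = random.Random(0)
    h = sample_gff_square(30, rng)
    print(h(1.0, 2.0))
```

Every eigenfunction vanishes on the boundary, so the truncated field inherits the zero boundary values of the Green's function.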


Log-correlated fields

  • More generally, one sometimes needs to consider a centered Gaussian field X, say on Ω ⊂ R2 (or on a subdomain of Rd), such that KX(x, y) := E X(x)X(y) = log(1/|x − y|) + g(x, y) for x, y ∈ Q0, where g is continuous (often smooth). Then X is called a log-correlated field. Realizations are (rather mild) generalised functions as before.

  • Given such a covariance KX, existence is not difficult to prove.


Multiplicative Gaussian chaos

  • The chaos obtained from a log-correlated field X on U is (formally) the random measure on U with the exponential density ”e^{βX(x)} dx”. Here β > 0 is a constant (inverse temperature).

  • Write X = ∑_{k=1}^∞ X′k, where the fields X′k are nice, centered and independent Gaussian fields, and denote Xn = ∑_{k=1}^n X′k. Then the n:th martingale approximation of the chaos is given by

    dμn := exp( βXn(x) − (1/2) E (βXn(x))² ) dx = exp( βXn(x) − (β²/2) KXn(x, x) ) dx.
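Taking the independent pieces X′k to be the Fourier modes of the torus field (one natural choice consistent with the earlier series; the talk does not fix a particular decomposition), the martingale approximation dμn can be tabulated on a grid. The Wick normalisation makes each density have pointwise expectation one, so E μn = Lebesgue measure for every n:

```python
import math
import random

def chaos_density(beta, n_modes, thetas, rng):
    """n-th martingale approximation of multiplicative chaos on the circle:
    exp(beta X_n(theta) - beta^2/2 * E X_n(theta)^2) on a grid of angles,
    where X_n keeps the first n Fourier modes of the log-correlated field."""
    coeffs = [(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) for _ in range(n_modes)]
    var = sum(1.0 / n for n in range(1, n_modes + 1))  # E X_n(theta)^2 = H_n (harmonic number)
    dens = []
    for t in thetas:
        x = sum((a * math.cos(n * t) + b * math.sin(n * t)) / math.sqrt(n)
                for n, (a, b) in enumerate(coeffs, start=1))
        dens.append(math.exp(beta * x - 0.5 * beta ** 2 * var))
    return dens

if __name__ == "__main__":
    rng = random.Random(0)
    grid = [2 * math.pi * i / 256 for i in range(256)]
    d = chaos_density(0.5, 50, grid, rng)
    # total mass of mu_n; its expectation is 2*pi by the martingale normalisation
    print(sum(d) * 2 * math.pi / 256)
```

Increasing n_modes at fixed β < √2 (here d = 1) shows the mass concentrating on smaller and smaller sets while the expectation stays constant, which is the picture behind Kahane's theorem on the next slide.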


Existence of continuous chaos

dμn := exp( βXn(x) − (1/2) E (βXn(x))² ) dx = exp( βXn(x) − (β²/2) KXn(x, x) ) dx

THEOREM (Kahane) For 0 < β < √(2d) the limit μ := w*-lim_{n→∞} μn exists (as an a.s. limit).

  • Kahane’s original proof is based on estimating moments in L² and on a rather complicated iteration argument. There are now many different approaches to existence, and the properties of chaos measures have been studied intensively.

  • A quite elegant proof of existence was given by Berestycki a couple of years ago.


Where do we meet multiplicative chaos?

Multiplicative chaos appears e.g.

  • as a scaling limit in random matrix theory
  • as a scaling limit of other models of statistical physics
  • in weldings of SLE
  • in probabilistic number theory
  • as a building block of the probabilistic approach to Liouville quantum gravity ...


Critical Chaos

If in our definition of chaos one sets β = √(2d) (the critical value), the limit measure is zero a.s.!

  • Existence of a renormalized limit for cascades was proven in the important paper by Aidekon and Shi in 2011 (partial results independently by Webb, 2011).

  • Finally, this was extended to continuous chaos by:

THEOREM (Duplantier, Rhodes, Sheffield and Vargas 2014) For nice martingale approximations on covariance level log n and so-called ∗-scale invariant log-correlated fields, the following limit (called a critical chaos) exists in probability: μ = w*-lim_{n→∞} √cn μn (here the ’covariance level’ Kn(x, x) ∼ cn).

  • More difficult to handle than subcritical chaos. But it is a fundamental object!

  • E.g., the support of μ can be thought, in some sense, to capture well the location of the ’maxima of the GFF’.


∗-scale invariant fields:

  • A ∗-scale invariant field X has a ’self-similar’ covariance structure: KX(x, y) = ∫₀^∞ k(e^t(x − y)) dt, where k is a ’nice’ rotation invariant covariance with supp(k) ⊂ B(0, 1).

  • A more visual definition is in terms of hyperbolic white noise....


QUESTION 1:

Q1: How to construct critical chaos for more general log-correlated fields?



Complex Gaussian chaos

  • One may also look at the dependence β ↦ ∫_U ”e^{βX(x)}” φ(x) dx for complex values of β. Again, for ∗-scale invariant log-correlated fields X it is known that one obtains an analytic function in a suitable domain. This is called ’complex Gaussian chaos’.

  • This is useful e.g. in Liouville quantum gravity and in weldings of SLE.

  • Another form of ’complex Gaussian chaos’ is studied in [Lacoin, Rhodes, Vargas 2014] (for cascades earlier by Barral and Mandelbrot). They consider ∫₀¹ ”e^{aX1(x)+ibX2(x)}” dx, where a, b ∈ R and X1 ⊥ X2. It is much easier to study thanks to the independence, but still interesting. Still another form (’Hardy chaos’) appears in statistical scaling limits of the Riemann zeta function.


QUESTION 2:

Q2: How to construct complex chaos µβ with analytic dependence β → µβ for more general log-correlated fields?



Special case: imaginary chaos and its moments

  • Imaginary chaos ”e^{iβX}”, where β > 0, exists for β ∈ (0, √d). It is a random generalised function whose regularity is rather well understood.

  • The growth of the moments E |⟨”e^{iβX}”, ϕ⟩|^m has been studied before in the case d = 2, CX(x, y) = log(1/|x − y|). Very sharp results recently in [Leblé & Serfaty & Zeitouni 17]! There was a need to estimate the moments also for more general log-correlated fields, especially in order to obtain uniqueness via moments.


Estimation of moments of the imaginary chaos

  • Note: the standard chaos (β real) has only some finite moments: E |μ(ϕ)|^p = ∞ if p ≥ 2d/β².

  • In contrast, for imaginary chaos all moments are finite!

  • As a side remark: ’Hardy chaos’ has the property that the finiteness of its moments depends on the smoothness of the test function ϕ.


Computing the n:th moment:

  • A straightforward computation yields (here ϕ is bounded and has compact support K)

    M_{2N} = E |μ_{iβ}(ϕ)|^{2N}
    = ∫_{U^{2N}} ( ∏_{1≤i<j≤N} e^{−β²CX(xi,xj)} e^{−β²CX(yi,yj)} ) ( ∏_{1≤i,j≤N} e^{β²CX(xi,yj)} ) ∏_{i=1}^N ϕ(xi)ϕ(yi) dxi dyi
    ≤ c^N ∫_{K^{2N}} ( ∏_{1≤i<j≤N} e^{−β²CX(xi,xj)} e^{−β²CX(yi,yj)} ) ( ∏_{1≤i,j≤N} e^{β²CX(xi,yj)} ) dxi dyi.

  • From this it is not even clear that MN < ∞, but this can be verified by easy combinatorics (Gale–Shapley matching).

  • We chose to generalise the Gunson & Panta argument, which works as such only in the case g ≡ 0 and d = 2. A main ingredient is a 2-dimensional version of Onsager’s inequality.
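The 2N-point identity can be checked numerically for N = 1. Since the Gaussian computation behind it holds for any Gaussian field, a smooth toy field X(x) = A cos x + B sin x with C(x, y) = cos(x − y) (my substitute here, chosen because it is exactly simulable; it is of course not log-correlated) already exhibits E|μ_{iβ}(ϕ)|² = ∫∫ e^{β²C(x,y)} ϕ(x)ϕ(y) dx dy:

```python
import cmath
import math
import random

def second_moment_mc(beta, n_samples, n_grid, rng):
    """Monte Carlo estimate of E|mu_{i beta}(phi)|^2 for the Wick-normalised
    imaginary exponential of the toy field X(x) = A cos x + B sin x on [0, 2pi],
    with phi = 1.  Here C(x, y) = cos(x - y) and Var X(x) = 1."""
    h = 2 * math.pi / n_grid
    xs = [i * h for i in range(n_grid)]
    total = 0.0
    for _ in range(n_samples):
        a, b = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        # mu(phi) = int exp(i beta X(x) + beta^2/2 * Var X(x)) dx  (Wick normalisation)
        mu = sum(cmath.exp(1j * beta * (a * math.cos(x) + b * math.sin(x))
                           + beta ** 2 / 2) for x in xs) * h
        total += abs(mu) ** 2
    return total / n_samples

def second_moment_exact(beta, n_grid):
    """Direct evaluation of the 2-point formula: int int e^{beta^2 cos(x - y)} dx dy."""
    h = 2 * math.pi / n_grid
    return sum(math.exp(beta ** 2 * math.cos((i - j) * h))
               for i in range(n_grid) for j in range(n_grid)) * h * h

if __name__ == "__main__":
    rng = random.Random(0)
    print(second_moment_mc(0.6, 2000, 100, rng), second_moment_exact(0.6, 200))
```

The cross terms e^{+β²C(x,y)} in the 2-point function come out with the opposite sign to the same-type pairs, exactly as in the displayed M_{2N} formula.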


Using Onsager in the moment estimate

LEMMA (2-dimensional Onsager à la Gunson & Panta) Let U ⊂ R² be a bounded domain. Then for any z1, . . . , zN ∈ U and q1, . . . , qN ∈ {1, −1} it holds that

    −∑_{1≤j<k≤N} qj qk log(1/|zj − zk|) ≤ (1/2) ∑_{j=1}^N log(1/((1/2)δ(zj))) + cN,

where δ(zj) := min_{k≠j} |zk − zj| and c depends only on diam(U).

  • By choosing above N = 2n and z1 = y1, . . . , zn = yn, z_{n+1} = x1, . . . , z_{2n} = xn (with qi = 1 if i ≤ n, qi = −1 otherwise) one obtains, in the case g ≡ 0, d = 2, the bound

    M_{2n} ≤ e^{cN} ∫_{B(0,1)^N} exp( (β²/2) ∑_{j=1}^{2n} log(1/((1/2) min_{k≠j} |zj − zk|)) ) dz1 . . . dz_{2n}.

The rest is (somewhat non-trivial) combinatorics, and finally it follows that Mn ≤ c′^n n^{(β²/d)n}.
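The inequality is easy to probe numerically: for random ± charge configurations in the unit disk the right-hand side dominates comfortably. A sketch (the constant c = 5 is a generous illustrative choice of mine, not the optimal constant of the lemma):

```python
import math
import random

def onsager_gap(points, charges, c=5.0):
    """Right-hand side minus left-hand side of the 2D Onsager-type bound
    - sum_{j<k} q_j q_k log 1/|z_j - z_k|  <=  1/2 sum_j log 1/(delta_j/2) + c N,
    where delta_j is the distance from z_j to the nearest other point."""
    n = len(points)
    lhs = 0.0
    for j in range(n):
        for k in range(j + 1, n):
            d = math.dist(points[j], points[k])
            lhs -= charges[j] * charges[k] * math.log(1.0 / d)
    rhs = 0.0
    for j in range(n):
        delta = min(math.dist(points[j], points[k]) for k in range(n) if k != j)
        rhs += 0.5 * math.log(1.0 / (0.5 * delta))
    return rhs + c * n - lhs

def random_config(n, rng):
    # points sampled uniformly in the unit disk, with +-1 charges
    pts = []
    while len(pts) < n:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1:
            pts.append((x, y))
    return pts, [rng.choice([1, -1]) for _ in range(n)]
```

A nonnegative gap for every configuration is what the moment bound above needs: the signed energy is controlled by nearest-neighbour distances alone.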


Original form of Onsager Lemma

Lemma. Let ek ∈ {1, −1} for k = 1, . . . , N, assume that x1, . . . , xN are distinct points in R³, and denote δ(xj) = min_{k≠j} |xk − xj|. Then

    −∑_{1≤j<k≤N} ej ek / |xj − xk| ≤ ∑_{1≤k≤N} 1/δ(xk).

Proof (Onsager 1935). ”To obtain a lower bound for the electric energy of an assembly of such particles, let us imagine that we have on hand a large quantity of a continuous conducting fluid. According to electrostatics, the energy released by the immersion of a charged body in such a fluid is finite; we shall call it the proper energy of the particle i (species). Without risk of confusion, this proper energy may be denoted simply by ui, for we shall have no other occasion to assign energies to individual particles. The energy of interaction between any two particles we shall denote by uik.”


Original form of Onsager Lemma

”Consider all the particles of our assembly immersed one at a time in our conducting fluid. When all the particles are immersed, the energy of the whole system equals −∑ ui regardless of the arrangement of the particles. Now the removal of the fluid cannot release any more energy; on the contrary, some energy must in general be expended to accomplish it. Thus, however the particles may be arranged, we know a lower bound for the energy: U = ∑ uik ≥ −∑ ui.”


Original form of Onsager Lemma

  • This is applied e.g. in the theory of stability of matter (Dyson et al.), and in that context Fefferman and de la Llave gave another approach to this kind of inequalities.


QUESTION 3:

Q3: Can one prove Onsager type inequalities for general log-correlated fields? Does it work in all dimensions?

  • I.e., given a log-correlated field X, one would like to prove, in all dimensions (at least locally), inequalities like

    −∑_{1≤j<k≤N} qj qk CX(xj, xk) ≤ (1/2) ∑_{j=1}^N log(1/((1/2)δ(xj))) + cN.   (1)

  • This question was the starting point of our work!


2 simple decomposition theorems

Let X and X′ be centred log-correlated fields on a subdomain U ⊂ Rd.

THEOREM (A) (JSW18) Assume that KX − KX′ ∈ H^{d+ε}(U × U). Then, for any bounded subdomain U′ with closure U′ ⊂ U, there are copies of X and X′ and a Gaussian process G which is almost surely Hölder continuous on U′ so that

    X′ = X + G on U′.

THEOREM (B) (JSW18) Assume that KX = log(1/|x − y|) + g(x, y), where g ∈ H^{d+ε}(U × U). Then any given x0 ∈ U has a neighbourhood V ⊂ U so that

    X = Y + G on V,

where Y is an almost ∗-scale invariant field, G is almost surely Hölder continuous and, moreover, Y ⊥ G.


Consequences

Assume that X is a log-correlated field on a domain Ω ⊂ R^d with

K_X(x, y) = log 1/|x − y| + g(x, y),

where g ∈ H^{d+ε}_loc(U × U). Then

  • Question 1 has a positive answer, i.e. the complex chaos "exp(βX)" is defined for the full expected range of β ∈ C, namely β ∈ conv°(B(0, √d) ∪ {±√d}). (Consequence of Thm B)

  • Question 2 has a positive answer, i.e. the critical chaos "exp(√(2d) X)" is well-defined. (Consequence of Thm A)

  • Question 3 has a positive answer, i.e. Onsager-type inequalities are valid for X locally. (Consequence of Thm B)
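The normalization behind the chaos "exp(βX)" can be illustrated in one dimension: for a single centered Gaussian X ~ N(0, σ²), the Wick-normalized exponential exp(βX − β²σ²/2) has expectation exactly 1 for every complex β. The sketch below checks this by deterministic quadrature; it is a toy one-dimensional analogue, not the field-level construction, and the function name `wick_exp_mean` is mine.

```python
import numpy as np

def wick_exp_mean(beta, sigma=1.0, half_width=12.0, n=200_001):
    """Quadrature for E[exp(beta*X - beta^2*sigma^2/2)] with X ~ N(0, sigma^2).

    The Gaussian density decays so fast that a plain Riemann sum on a
    truncated grid is essentially exact here.
    """
    x = np.linspace(-half_width * sigma, half_width * sigma, n)
    pdf = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    dx = x[1] - x[0]
    return np.sum(np.exp(beta * x - beta**2 * sigma**2 / 2) * pdf) * dx

val = wick_exp_mean(0.5 + 0.3j)  # a genuinely complex beta
```

The same renormalization, applied to regularized approximations of the field and pushed to the limit, is what "exp(βX)" means on the slides.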


slide-72
SLIDE 72

Sketch of proof of Thm A

  • Denote C = K_X and C′ = K_{X′} (thought of as operators, after localisation if needed). We have the identity C + (C′ − C)+ = C′ + (C′ − C)−, where (C′ − C)+ and (C′ − C)− are the positive and negative parts of the operator C′ − C.

  • The above identity means that one may realise the equality of fields X + Y+ = X′ + Y−, i.e. X′ = X + (Y+ − Y−), where K_{Y±} = (C′ − C)±.

  • It thus remains to show that Y+ and Y− are Hölder continuous. This follows from the standard regularity theory of Gaussian processes once we verify (after localisation) that K_{Y±} ∈ H^{d+ε}_loc(R^{2d}). Namely, by the Sobolev embedding the covariance kernels themselves are then Hölder continuous.


slide-74
SLIDE 74

Sketch of proof of Thm A

  • Finally, we obtain K_{Y±} ∈ H^{d+ε}_loc(R^{2d}) by proving the following result:

LEMMA. Let s > 0. If T ∈ H^s(R^{2d}) is a real and symmetric kernel, then the kernel of the absolute value of the corresponding operator satisfies |T| ∈ H^s(R^{2d}).

  • The proof of Theorem B is a bit more complicated, but still not very difficult.
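The operator absolute value in the lemma has a simple matrix analogue: for a real symmetric T, |T| is obtained by taking absolute values of the eigenvalues in an eigendecomposition, and the positive part is then (T + |T|)/2. The sketch below shows only this construction (with a random toy matrix), not the Sobolev-regularity claim, which is the actual content of the lemma.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
T = (M + M.T) / 2               # real symmetric "kernel"

w, V = np.linalg.eigh(T)
absT = (V * np.abs(w)) @ V.T    # |T|: same eigenvectors, |eigenvalues|

# Sanity checks: |T|^2 = T^2, |T| is positive semi-definite,
# and (T + |T|)/2 recovers the positive part used in the proof of Thm A.
Tplus = (T + absT) / 2
assert np.allclose(absT @ absT, T @ T)
assert np.min(np.linalg.eigvalsh(absT)) >= -1e-9
assert np.allclose(Tplus, (V * np.maximum(w, 0)) @ V.T)
```

The lemma asserts that this nonlinear operation on kernels preserves H^s smoothness, which is exactly what the regularity argument on the previous slide needs.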


slide-75
SLIDE 75

Congrats once more!
