
Mixing in Product Spaces

Elchanan Mossel

Elchanan Mossel Mixing in Product Spaces


Poincaré Recurrence Theorem

Theorem (Poincaré, 1890). Let f : X → X be a measure preserving transformation and let E ⊂ X be measurable. Then

P[x ∈ E : fⁿ(x) ∉ E for all n > N(x)] = 0.

One of the first results in Ergodic Theory. It describes long term mixing; this talk is about short term mixing.


Finite Markov Chains

As a first example consider a finite Markov chain. Let M be a k × k doubly stochastic symmetric matrix. Pick X₀ uniformly at random from {1, …, k}. Given Xᵢ = a, let Xᵢ₊₁ = b with probability M(a, b).

Theorem (Long Term Mixing for Markov Chains). Suppose that, other than 1, all eigenvalues λᵢ of M satisfy |λᵢ| ≤ λ < 1. Then for any two sets A, B ⊂ [k] it holds that

|P[X₀ ∈ A, Xₜ ∈ B] − P[A]P[B]| ≤ λᵗ.
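As a sanity check, the long-term bound can be verified numerically. A minimal sketch (the matrix M and the sets A, B below are our own examples, not from the talk):

```python
import numpy as np

# Check |P[X0 in A, Xt in B] - P[A]P[B]| <= lambda^t
# for a small symmetric doubly stochastic M (example matrix is ours).
k = 4
M = np.array([[0.4, 0.3, 0.2, 0.1],
              [0.3, 0.4, 0.1, 0.2],
              [0.2, 0.1, 0.4, 0.3],
              [0.1, 0.2, 0.3, 0.4]])  # symmetric, rows and columns sum to 1

eigs = np.sort(np.abs(np.linalg.eigvalsh(M)))[::-1]
lam = eigs[1]                        # largest eigenvalue modulus other than 1

A = np.array([1.0, 1.0, 0.0, 0.0])   # indicator of A = {1, 2}
B = np.array([1.0, 0.0, 1.0, 0.0])   # indicator of B = {1, 3}
pA, pB = A.sum() / k, B.sum() / k

for t in range(1, 6):
    Mt = np.linalg.matrix_power(M, t)
    joint = A @ Mt @ B / k           # P[X0 in A, Xt in B], X0 uniform
    assert abs(joint - pA * pB) <= lam ** t + 1e-12
```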


Short Term Mixing for Markov Chains

Theorem.

|P[X₀ ∈ A, X₁ ∈ B] − P[A]P[B]| ≤ λ √(P[A](1 − P[A]) P[B](1 − P[B]))

This shows mixing in one step for large sets.

Proof: Write 1_A = P[A]·1 + f and 1_B = P[B]·1 + g, where f, g ⊥ 1. Then

P[X₀ ∈ A, X₁ ∈ B] = (1/k)(P[A]·1 + f)ᵀ M (P[B]·1 + g) = P[A]P[B] + (1/k) fᵀMg,

and (1/k)|fᵀMg| ≤ λ ‖f‖₂ ‖g‖₂ = λ √(P[A](1 − P[A]) P[B](1 − P[B])), where ‖·‖₂ is taken with respect to the uniform measure on [k].

This is also called the Expander Mixing Lemma. It is used a lot in computer science, e.g. in (de)randomization.
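The one-step bound, including the square-root term, can likewise be checked numerically; the sketch below mirrors the projection proof (the random doubly stochastic construction and the sets are ours):

```python
import numpy as np

# Check the one-step (expander-mixing) bound
# |P[X0 in A, X1 in B] - P[A]P[B]| <= lam * sqrt(P[A](1-P[A])P[B](1-P[B])).
rng = np.random.default_rng(0)
k = 6
# symmetric doubly stochastic matrix via averaging permutations (our construction)
P1 = np.eye(k)[rng.permutation(k)]
M = (P1 + P1.T + np.ones((k, k)) / k) / 3.0
lam = np.sort(np.abs(np.linalg.eigvalsh(M)))[-2]

onesA = (np.arange(k) < 3).astype(float)       # A = {0, 1, 2}
onesB = (np.arange(k) % 2 == 0).astype(float)  # B = {0, 2, 4}
pA, pB = onesA.mean(), onesB.mean()

joint = onesA @ M @ onesB / k                  # P[X0 in A, X1 in B]
bound = lam * np.sqrt(pA * (1 - pA) * pB * (1 - pB))
assert abs(joint - pA * pB) <= bound + 1e-12
```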


The tensor property

Consider (Y₁, Z₁), …, (Yₙ, Zₙ), drawn independently from the distribution of (X₀, X₁). Equivalently, the transition matrix from Y = (Y₁, …, Yₙ) to Z = (Z₁, …, Zₙ) is M^{⊗n}.

The theorem implies that for any sets A, B ⊂ [k]ⁿ:

|P[Y ∈ A, Z ∈ B] − P[A]P[B]| ≤ λ √(P[A](1 − P[A]) P[B](1 − P[B]))

This follows immediately from tensorization of the spectrum: the eigenvalues of M^{⊗n} are products of eigenvalues of M, so its second largest eigenvalue modulus is still λ.
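A quick numerical illustration of the tensorization (the matrix and the sets are our own examples): the second eigenvalue modulus of M ⊗ M equals that of M, so the same λ controls mixing on [k]².

```python
import numpy as np

# Eigenvalues of a Kronecker product are products of eigenvalues,
# so the second-largest modulus of kron(M, M) equals that of M.
k = 3
M = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])          # symmetric, doubly stochastic
lam = np.sort(np.abs(np.linalg.eigvalsh(M)))[-2]

M2 = np.kron(M, M)                        # transition matrix on pairs
lam2 = np.sort(np.abs(np.linalg.eigvalsh(M2)))[-2]
assert abs(lam2 - lam) < 1e-12

# check the mixing bound for arbitrary sets A, B in [k]^2
rng = np.random.default_rng(1)
A = rng.random(k * k) < 0.5
B = rng.random(k * k) < 0.5
pA, pB = A.mean(), B.mean()
joint = A.astype(float) @ M2 @ B.astype(float) / (k * k)
bound = lam * np.sqrt(pA * (1 - pA) * pB * (1 - pB))
assert abs(joint - pA * pB) <= bound + 1e-12
```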


Log Sobolev inequalities

Entropy, Log Sobolev and hyper-contraction. A similar story can be told using more sophisticated analytic tools. It is easier to work with Markov semi-groups Tₜ = e^{−tL}.

Entropy and Dirichlet form:

Ent(f) = E[f log f] − E[f] · log E[f],
E(f, g) = E[f L g] = E[g L f] = E(g, f) = −(d/dt) E[f Tₜ g] |_{t=0}.

Definition of Log-Sob:

p-logSob(C) ⟺ ∀f, Ent(f^p) ≤ (C p² / (4(p − 1))) · E(f^{p−1}, f)   (p ≠ 0, 1),
1-logSob(C) ⟺ ∀f, Ent(f) ≤ (C/4) · E(f, log f),
0-logSob(C) ⟺ ∀f, Var(log f) ≤ −(C/2) · E(f, 1/f).
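The Dirichlet-form identities above can be checked numerically for a small reversible chain; this sketch (our own toy example, uniform stationary measure) verifies E(f, g) = E[f L g] = E(g, f) and E(f, g) = −(d/dt) E[f Tₜ g] |_{t=0} by a finite difference.

```python
import numpy as np

# Small reversible chain w.r.t. the uniform measure pi on {0,...,k-1}:
# T_t = e^{-tL} with L = I - M, M symmetric doubly stochastic.
k = 3
M = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
L = np.eye(k) - M
pi = np.full(k, 1.0 / k)

rng = np.random.default_rng(2)
f = rng.random(k)
g = rng.random(k)

def E(u, v):                # expectation under pi of u*v
    return float(np.sum(pi * u * v))

dirichlet = E(f, L @ g)
assert abs(dirichlet - E(g, L @ f)) < 1e-12   # symmetry E(f, g) = E(g, f)

w, V = np.linalg.eigh(L)
def T(t, v):                # T_t v = e^{-tL} v via the spectral decomposition
    return V @ (np.exp(-t * w) * (V.T @ v))

h = 1e-6                    # -d/dt E[f T_t g] at t = 0, forward difference
deriv = (E(f, T(h, g)) - E(f, T(0.0, g))) / h
assert abs(-deriv - dirichlet) < 1e-4
```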


Log Sob. Inequalities and Hyper-Contraction

Hyper-Contraction (Gross, Nelson 1960, …). r-logSob with constant C implies

‖Tₜ f‖_p ≤ ‖f‖_q for t ≥ (C/4) log((p − 1)/(q − 1)), 1 < q < p,

with the exponents restricted to the range allowed by r and r′. Hence

|E[g(X₀) f(Xₜ)]| = |E[g Tₜ f]| ≤ ‖g‖_{p′} ‖Tₜ f‖_p ≤ ‖g‖_{p′} ‖f‖_q.

If f = 1_A and g = 1_B, we get:

P[X₀ ∈ A, Xₜ ∈ B] ≤ ‖1_A‖_q ‖1_B‖_{p′} = P[A]^{1/q} P[B]^{1/p′}.

Now optimize over the norms to get a better bound than Cauchy-Schwarz.


Reverse-Hyper-Contraction

Log-Sobolev and Reverse Hyper-Contraction (M-Oleszkiewicz-Sen-13). Let Tₜ = e^{−tL} be a general Markov semi-group satisfying the 2-logSob or the 1-logSob inequality with constant C. Then for all q < p < 1, all positive f, g and all t ≥ (C/4) log((1 − q)/(1 − p)) it holds that

‖Tₜ f‖_q ≥ ‖f‖_p, and hence E[g(X₀) f(Xₜ)] = E[g Tₜ f] ≥ ‖g‖_{q′} ‖f‖_p.


Short-Time Implications

Theorem (M-Oleszkiewicz-Sen-13; Short-Time Implications). Let Tₜ = e^{−tL}, where L satisfies the 1- or 2-logSob inequality with constant C. Let A, B ⊂ Ωⁿ with P[A] ≥ ε and P[B] ≥ ε. Then:

P[X(0) ∈ A, X(t) ∈ B] ≥ ε^{2/(1 − e^{−2t/C})}

Comments

1. Works for small sets too.
2. Tensorizes.
3. There are examples where it is (almost) tight.
4. Used in social choice analysis and queuing theory.


Comment: typical application MCMC

Long Time Behavior. Log Sobolev inequalities play a major role in analyzing long term mixing of Markov chains, in particular in the analysis of mixing times (Diaconis, Saloff-Coste etc.). For a continuous time Markov chain with spectral gap λ and 2-logSob constant C, the ε-total variation mixing time is bounded by:

(1/λ) (log(1/π∗) + log(1/ε))   and   C (log log(1/π∗) + log(1/ε)).


What are these lectures about?

High Dimensional Phenomena. High dimensional mixing: mixing of product processes on product spaces Ωⁿ with n large.

Tight bounds. For which processes, given measures a and b, can we find precise upper/lower bounds for

sup { P[X₀ ∈ A, Xₜ ∈ B] : P[A] = a, P[B] = b }?

We are interested in product spaces/processes of dimension n and in answers as n → ∞. The most important examples / techniques come from probability / analysis.


What are these lectures about?

Multi-step processes. How to bound P[X₀ ∈ A₀, X₁ ∈ A₁, …, Xₖ ∈ Aₖ] for processes X₀, …, Xₖ? We are interested in product spaces/processes of dimension n and in answers as n → ∞. The most important examples / techniques come from additive combinatorics.


What are these lectures about?

And more. A theory that does both? Applications?


Today: tight bounds

Borell’s result. Open problem: the Boolean cube. The state of affairs: partitions into 3 parts or more.


Two Examples: Gaussian, Boolean

Correlated pairs (M-O’Donnell-Regev-Steif-Sudakov-05): Let x, y ∈ {−1, 1}ⁿ be e^{−t}-correlated: x is chosen uniformly and y is a Tₜ-correlated version of x, i.e. E[xᵢyᵢ] = e^{−t} independently for all i. Let A, B ⊂ {−1, 1}ⁿ with P[A] ≥ ε and P[B] ≥ ε. Then:

P[x ∈ A, y ∈ B] ≥ ε^{2/(1 − e^{−t})}

Easy to prove when A = B …

Gaussian Version. Let x, y ∈ Rⁿ be two Gaussian vectors: x ∼ N(0, I), y ∼ N(0, I), E[xᵢyⱼ] = e^{−t} δᵢ,ⱼ. Let A, B ⊂ Rⁿ with P[A] ≥ ε and P[B] ≥ ε. Then:

P[x ∈ A, y ∈ B] ≥ ε^{2/(1 − e^{−t})}
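For small n the Boolean bound can be verified exactly by enumerating the cube; the sets below (a majority half-cube and a dictator) are our own examples.

```python
import numpy as np
from functools import reduce

# Exact check of P[x in A, y in B] >= eps^(2/(1 - e^{-t})) on {-1,1}^n.
n, t = 8, 0.7
rho = np.exp(-t)
T1 = np.array([[(1 + rho) / 2, (1 - rho) / 2],
               [(1 - rho) / 2, (1 + rho) / 2]])  # per-coordinate noise kernel
T = reduce(np.kron, [T1] * n)                    # 2^n x 2^n transition matrix

# encode points by bits; coordinate i is +1 if bit i is set
pts = ((np.arange(2**n)[:, None] >> np.arange(n)) & 1) * 2 - 1
A = (pts.sum(axis=1) >= 0).astype(float)         # a "majority" half-cube
B = (pts[:, 0] == 1).astype(float)               # the dictator set {x_0 = +1}

pA, pB = A.mean(), B.mean()
joint = A @ T @ B / 2**n                         # exact P[x in A, y in B]
eps = min(pA, pB)
assert joint >= eps ** (2 / (1 - rho)) - 1e-12
```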


Borell’s Result and Open Problems

Borell (85): In the Gaussian case, the maximum and minimum of P[x ∈ A, y ∈ B] as a function of P[A] and P[B] are obtained for parallel half-spaces.

We do not know what the optimum is in {−1, 1}ⁿ. In particular:

Open Problem: Determine

lim_{n→∞} min { P[X ∈ A, Y ∈ B] : A, B ⊂ {−1, 1}ⁿ, P[A] = P[B] = 1/4 }

and similarly for the max. Also open: partitions into 3 or more parts, even in Gaussian space.


If there is time before the break ...

A cute proof of a special case of Borell’s result. Connections to social choice theory.


Simple Example 1

Cosmic coin problem (M-O’Donnell-05): x ∈ {−1, 1}ⁿ is uniform, and (yᵢ)ᵢ₌₁ᵐ are conditionally independent given x. Each pair (x, yᵢ) is ρ-correlated.

Problem: How large can P[y₁ ∈ A, …, yₘ ∈ A] be?


Simple Example 2

(y_{i,j})_{1≤i<j≤m} is an exchangeable collection of vectors in {−1, 1}ⁿ. If |I ∩ J| = 1 then y_I and y_J are −1/3-correlated; otherwise they are independent. Why? If n voters rank alternatives uniformly at random, the pairwise preferences between alternatives are given by the collection y.


Full support finite Ω using hyper-contraction

Thm: More General Reverse Hypercontractivity (M-Oleszkiewicz-Sen-13). Let the measure Ψ over a finite Ωᵏ satisfy min_{x∈Ω} Pr[X₁ = x, …, Xₖ = x] = α > 0 and have equal marginals. Consider the distribution Ψⁿ and let A₁, …, Aₖ ⊆ Ωⁿ with µ(Aᵢ) ≥ µ. Then:

Pr[X₁ ∈ A₁, …, Xₖ ∈ Aₖ] ≥ µ^{O(1/α)},

where the tuples (X₁(i), …, Xₖ(i)) are i.i.d. according to Ψ.

Note: This is a key tool for analyzing the examples above as well as many others.


Notation

The data forms an ℓ × n array X = (X₁, X₂, …, Xᵢ, …, Xₙ) with rows

X^{(1)} = (X^{(1)}_1, X^{(1)}_2, …, X^{(1)}_i, …, X^{(1)}_n)
X^{(2)} = (X^{(2)}_1, X^{(2)}_2, …, X^{(2)}_i, …, X^{(2)}_n)
⋮
X^{(j)} = (X^{(j)}_1, X^{(j)}_2, …, X^{(j)}_i, …, X^{(j)}_n)
⋮
X^{(ℓ)} = (X^{(ℓ)}_1, X^{(ℓ)}_2, …, X^{(ℓ)}_i, …, X^{(ℓ)}_n)

The tuples (columns) Xᵢ = (X^{(1)}_i, …, X^{(ℓ)}_i) are i.i.d. according to P. The marginals of P are πⱼ. Each row X^{(j)} is distributed according to πⱼ := πⱼ^{⊗n}, and the whole array X is distributed according to P := P^{⊗n}.


Lower Bounds

We are mostly interested in two types of lower bounds.

Set hitting: lower bounds on P[X^{(1)} ∈ A₁, …, X^{(k)} ∈ Aₖ] in terms of P[A₁], …, P[Aₖ].

Same set hitting: lower bounds on P[X^{(1)} ∈ A, …, X^{(k)} ∈ A] in terms of P[A].

Set hitting will require some assumption: it fails, e.g., when X^{(1)} = X^{(2)} = ⋯ = X^{(k)} and the sets Aᵢ are disjoint.


Gaussian Bounds

Borell (85): for k = 2, parallel half-spaces are optimal (also Isaksson-Mossel; Neeman). By a reverse Brascamp-Lieb inequality (Ledoux; Chen-Dafnis-Paouris 14-15), for A, …, C ⊂ Rⁿ:

P[U ∈ A, …, Z ∈ C] ≥ (P[A] ⋯ P[C])^{1/(1−ρ²)},

where ρ is the second eigenvalue of Σ. This does not require independence of the coordinates.


Full Support Case

Thm: More General Reverse Hypercontractivity (M-Oleszkiewicz-Sen-13). Let the measure Ψ over a finite Ωᵏ satisfy min_{x∈Ω} Pr[X₁ = x, …, Xₖ = x] = α > 0 and have equal marginals. Then:

Pr[X₁ ∈ A₁, …, Xₖ ∈ Aₖ] ≥ µ^{O(1/α)},

where the tuples (X₁(i), …, Xₖ(i)) are i.i.d. according to Ψ.


Non full support?

What if the support of Ψ is not full? Do we care? Maybe: this is what additive combinatorics is all about, in particular finite combinatorics in finite field models (Green-04, …). There are many other applications in combinatorics and computer science.


Additive combinatorics perspective

Example: Theorem (Finite Field Roth Theorem). Let Y, R be chosen uniformly at random from F₃ⁿ. For every µ > 0 there exist c(µ) > 0 and N(µ) such that if n ≥ N(µ) and A ⊂ F₃ⁿ satisfies P[A] ≥ µ, then:

P[Y ∈ A, Y + R ∈ A, Y + 2R ∈ A] ≥ c(µ).

Why is this true?
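For intuition, the 3-AP probability can be computed exactly for small n. In the sketch below, A is a subgroup (coordinate sum ≡ 0 mod 3, our own example), for which Y ∈ A and Y + R ∈ A force R ∈ A and hence Y + 2R ∈ A, so the probability is exactly P[A]².

```python
from itertools import product

# Exact computation of P[Y in A, Y+R in A, Y+2R in A] over F_3^n.
n = 3
points = list(product(range(3), repeat=n))     # all of F_3^n
A = {p for p in points if sum(p) % 3 == 0}     # subgroup: coordinate sum = 0 mod 3

count = 0
for Y in points:
    for R in points:
        YR = tuple((y + r) % 3 for y, r in zip(Y, R))
        Y2R = tuple((y + 2 * r) % 3 for y, r in zip(Y, R))
        if Y in A and YR in A and Y2R in A:
            count += 1

prob = count / 9**n        # Y and R are independent and uniform
density = len(A) / 3**n    # = 1/3 here
assert abs(prob - density**2) < 1e-12
```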


Fourier Obstructions

Theorem (Finite Field Roth Theorem, Analysis). Let Y, R be chosen uniformly at random from F₃ⁿ and let A, B, C ⊂ F₃ⁿ. Then

|P[Y ∈ A, Y + R ∈ B, Y + 2R ∈ C] − P[A]P[B]P[C]| ≤ ‖Â‖∞,

where ‖Â‖∞ is the largest nontrivial Fourier coefficient of 1_A. So the only obstruction to uniformity is linear structure. If A = B = C, a high Fourier coefficient implies we can restrict to a subspace with higher density. Density increase arguments …


Higher Order Arithmetic Obstructions

Furstenberg-Weiss (80s): for longer arithmetic progressions there are obstructions other than Fourier.

Gowers: the obstructions can be identified using the Gowers norms. Again, use the obstruction to your benefit.

Thm (Gowers 08; Rödl and Skokan 04, 06): If q is prime and ℓ ≤ q, then for every µ > 0 there exist c(µ) > 0 and N(µ) such that if n ≥ N(µ) and A ⊂ F_qⁿ satisfies P[A] ≥ µ, then:

P[Y ∈ A, Y + R ∈ A, …, Y + (ℓ − 1)R ∈ A] ≥ c(µ),

where Y, R ∈ F_qⁿ are chosen uniformly at random.

Question: Is the additive structure necessary?


Obstruction to Chaos

Consider the support of P as a graph G whose vertices V are the atoms with non-zero weight, with edges between any two atoms that differ in exactly one coordinate. We say that ρ < 1 if the graph G is connected. More formally:

Definition.

ρ(P, S, T) := sup { Cov[f(X(S)), g(X(T))] : f : Ω(S) → R, g : Ω(T) → R, Var[f(X(S))] = Var[g(X(T))] = 1 }.

The correlation of P is ρ(P) := max_{j∈[ℓ]} ρ(P, {j}, [ℓ] \ {j}).
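For a pair of variables, ρ(P, {1}, {2}) is the classical maximal correlation, computable as the second singular value of D_x^{−1/2} P D_y^{−1/2} (the Hirschfeld-Gebelein-Rényi characterization); the joint distributions below are our own examples.

```python
import numpy as np

# Maximal correlation of (X, Y) from the joint probability matrix P.
def maximal_correlation(P):
    px = P.sum(axis=1)          # marginal of X
    py = P.sum(axis=0)          # marginal of Y
    Q = P / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(Q, compute_uv=False)
    return s[1]                 # s[0] = 1 corresponds to constant functions

# connected support: y = x or x+1 mod 3, uniform  ->  rho < 1
P_conn = np.zeros((3, 3))
for x in range(3):
    P_conn[x, x] = P_conn[x, (x + 1) % 3] = 1 / 6
assert maximal_correlation(P_conn) < 1 - 1e-9

# disconnected support: X = Y uniform  ->  rho = 1
P_disc = np.eye(3) / 3
assert abs(maximal_correlation(P_disc) - 1) < 1e-9
```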


The quest for a unifying theory

Is there one theory that explains both the noisy examples and the additive theory?


Example

Let X be uniform in F₃ⁿ. Let Yᵢ = Xᵢ or Xᵢ + 1, each with probability 1/2, independently for each coordinate. The theorem implies P[X ∈ A, Y ∈ A] ≥ c(P[A]).

Motivation comes from understanding “parallel repetition”. This does not follow from hyper-contraction, nor does it follow from additive techniques …
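This channel is small enough to compute exactly. In the sketch below, A (strings avoiding the symbol 2, our own example) gives P[X ∈ A, Y ∈ A] = (1/2)ⁿ, which is positive as the theorem requires.

```python
from itertools import product

# Exact P[X in A, Y in A] for: X uniform on F_3^n,
# Y_i = X_i or X_i + 1 (mod 3) with probability 1/2 each, independently.
n = 4
points = list(product(range(3), repeat=n))
A = {p for p in points if 2 not in p}         # example set: no coordinate equals 2

joint = 0.0
for x in A:                                   # only x in A contribute
    for delta in product((0, 1), repeat=n):   # 2^n equally likely noise patterns
        y = tuple((xi + di) % 3 for xi, di in zip(x, delta))
        if y in A:
            joint += (1 / 3**n) * (1 / 2**n)

pA = len(A) / 3**n                            # density (2/3)^n
assert joint > 0
assert abs(joint - 0.5**n) < 1e-12            # closed form for this particular A
```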


A General Result

Theorem (+ Hazla, Holenstein). Suppose (X, Y) is distributed in a finite Ω² such that α = min_a P[X = Y = a] > 0 and P[X = a] = P[Y = a] for all a. Then for any set A ⊂ Ωⁿ with P_{X^{⊗n}}[A] = P_{Y^{⊗n}}[A] ≥ µ it holds that

P[X ∈ A, Y ∈ A] ≥ c(α, µ) > 0.

Our c is pretty bad: c = 1/exp(exp(exp(1/µ^D))), D = D(α). This is related to the fact that the proof is interesting:

1. We lose in “Regularity Lemma” type arguments.
2. We lose in “Invariance”, transforming the problem to a Gaussian problem.

A Markov Chain Theorem and a general process theorem

Theorem [+Hazla, Holenstein]. Let (Xᵢ, Yᵢ, Zᵢ, …, Wᵢ) be a Markov chain over Ω with min_{x∈Ω} Pr[Xᵢ = Yᵢ = Zᵢ = ⋯ = Wᵢ = x] = β > 0 and uniform marginals. Let A ⊆ Ωⁿ with µ(A) = µ > 0. Then

Pr[X ∈ A ∧ Y ∈ A ∧ Z ∈ A ∧ ⋯ ∧ W ∈ A] ≥ f(µ, β) > 0.

Theorem [+Hazla, Holenstein]. Let (Xᵢ, Yᵢ, Zᵢ, …, Wᵢ) be distributed over Ωᵏ with min_{x∈Ω} Pr[Xᵢ = Yᵢ = Zᵢ = ⋯ = Wᵢ = x] = β > 0 and uniform marginals. Suppose further that ρ(Xᵢ, Yᵢ, …, Wᵢ) < 1. Let A ⊆ Ωⁿ with µ(A) = µ > 0. Then

Pr[X ∈ A ∧ Y ∈ A ∧ Z ∈ A ∧ ⋯ ∧ W ∈ A] ≥ f(µ, β) > 0.


The condition ρ < 1

Weaker than full support. Does not hold in arithmetic setups. ρ < 1 iff the support of Ψ is connected with respect to changing one coordinate at a time. Example: (x, y) ∈ F₃² with y = x or x + 1 has ρ < 1 but not full support.


Open Problems

We are still searching for a unified theory. Concrete example: suppose Ψ is uniform over {(0,0,0), (1,1,1), (2,2,2), (0,1,2), (1,2,0), (2,0,1)}. Here ρ = 1 but the example is not arithmetic. We do not understand this case.
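The graph criterion for ρ < 1 is easy to check mechanically; this sketch (our own helper) confirms that the support of the Ψ above is disconnected under single-coordinate moves, while the earlier y = x or x + 1 example is connected.

```python
from itertools import combinations

# Atoms are adjacent iff they differ in exactly one coordinate
# (the graph criterion for rho < 1 from the "Obstruction to Chaos" slide).
def support_connected(atoms):
    atoms = [tuple(a) for a in atoms]
    adj = {a: [] for a in atoms}
    for a, b in combinations(atoms, 2):
        if sum(x != y for x, y in zip(a, b)) == 1:
            adj[a].append(b)
            adj[b].append(a)
    seen, stack = {atoms[0]}, [atoms[0]]
    while stack:                       # depth-first search
        for nb in adj[stack.pop()]:
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return len(seen) == len(atoms)

# the slide's example: uniform over these six triples -> disconnected, rho = 1
psi = [(0,0,0), (1,1,1), (2,2,2), (0,1,2), (1,2,0), (2,0,1)]
assert not support_connected(psi)

# the pairwise example (x, y) with y = x or x+1 mod 3 -> connected, rho < 1
pairs = [(x, y) for x in range(3) for y in (x, (x + 1) % 3)]
assert support_connected(pairs)
```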


Questions??

Thank you!
