Signal description: Process or Gibbs? I. General introduction



SLIDE 1

Introduction g-measures Gibbs measures

Signal description: Process or Gibbs?

  • I. General introduction

Contributors:

  • S. Berghout (Leiden)
  • A. van Enter (Groningen)
  • S. Gallo (São Carlos),

  • G. Maillard (Aix-Marseille),
  • E. Verbitskiy (Leiden)

Florence in May, 2017

SLIDE 2

Introduction g-measures Gibbs measures Issues

The issue

A signal with a stochastic component is detected:

· · · ω_{−n−1} ω_{−n} · · · ω_{−1} ω_0 ω_1 · · · ω_n ω_{n+1} · · ·

Each ω_i belongs to some finite “alphabet” A. E.g. biological signals:

◮ Spike sequence of a neuron, A = {0, 1}
◮ DNA string, A = {A, C, G, T}

Basic tenets:

◮ Stochastic description due to signal variability
◮ Full description = probability measure µ on A^Z
◮ Key issue: efficient characterization of µ


SLIDE 6

Introduction g-measures Gibbs measures Process approach

First approach: Transition probabilities

Machine-learning approach:

◮ Use the first part of the train to develop “rules” to predict the rest
◮ By recurrence: enough to predict the next bit given the “history”

That is, estimate the conditional probabilities w.r.t. the past, P(X_n | X_{n−1}, X_{n−2}, . . .), through their law, defined by a function g such that

  P(X_0 = ω_0 | X_{−∞}^{−1} = ω_{−∞}^{−1}) = g(ω_0 | ω_{−∞}^{−1})

Look for µ determined by (consistent with) this g:

  µ(X_0 = ω_0 | X_{−∞}^{−1} = ω_{−∞}^{−1}) = g(ω_0 | ω_{−∞}^{−1})
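The consistency equation above can be made concrete with a toy example. The sketch below assumes an illustrative g-function on A = {0, 1} whose dependence on the past decays geometrically — the function `g`, the parameters `beta` and `decay`, and the truncation `hist_len` are all hypothetical choices, not objects from the talk. It samples a chain bit by bit, conditioning each new symbol on the history generated so far.

```python
import math
import random

# Toy g-function on the alphabet A = {0, 1}: an illustrative assumption,
# not a construction from the talk.  The next symbol depends on the whole
# past through geometrically decaying weights.
def g(x0, past, beta=0.3, decay=0.5):
    """P(X_0 = x0 | past), where past[k] is the symbol at time -(k+1)."""
    h = sum(beta * decay**k * (1 if s == 1 else -1) for k, s in enumerate(past))
    p1 = 1.0 / (1.0 + math.exp(-2.0 * h))   # logistic link, strictly in (0, 1)
    return p1 if x0 == 1 else 1.0 - p1

def sample_chain(n, seed=0, hist_len=50):
    """Sample n symbols, conditioning each bit on the history so far
    (truncated at hist_len symbols for practicality)."""
    rng = random.Random(seed)
    hist, out = [], []
    for _ in range(n):
        x = 1 if rng.random() < g(1, hist) else 0
        out.append(x)
        hist = [x] + hist[: hist_len - 1]   # most recent symbol first
    return out
```

By construction g(0 | ·) + g(1 | ·) = 1 for every past, so g is a genuine family of transition probabilities, and the logistic link never reaches 0 or 1, so non-nullness holds.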


SLIDE 9

Introduction g-measures Gibbs measures Process approach

Regular g-measures

Relevant transitions are expected to be insensitive to the farther past: g is a regular g-function if ∀ε > 0 ∃ n ≥ 0 such that

  sup_{ω,σ} | g(ω_0 | σ_{−n}^{−1} ω_{−∞}^{−n−1}) − g(ω_0 | σ_{−∞}^{−1}) | < ε   (1)

◮ (1) is equivalent to g(ω_0 | · ) continuous in the product topology
◮ Additional, not very relevant, non-nullness condition

A probability measure µ is a regular g-measure if it is consistent with some regular g-function.
The signal µ is thought of as a process: the past determines the future (causality)


SLIDE 14

Introduction g-measures Gibbs measures Gibbs approach

Fields point of view

If the full train is available, why use only the past? Learn to predict a bit using past and future! X_n is determined by finite-window probabilities P(X_n | X_{n−1}, X_{n−2}, . . . ; X_{n+1}, X_{n+2}, . . .) through conditional laws determined by a function γ s.t.

  P(X_0 = ω_0 | X_{{0}^c} = ω_{{0}^c}) = γ(ω_0 | ω_{{0}^c})

◮ Specification: γ satisfying a certain compatibility condition

Look for µ determined by (consistent with) this γ:

  µ(X_0 = ω_0 | X_{{0}^c} = ω_{{0}^c}) = γ(ω_0 | ω_{{0}^c})
SLIDE 17

Introduction g-measures Gibbs measures Gibbs approach

Quasilocal measures

A specification γ is quasilocal if ∀ε > 0 ∃ n, m ≥ 0 such that

  | γ(ω_0 | ω_{−n}^{m} σ_{[−n,m]^c}) − γ(ω_0 | ω_{{0}^c}) | < ε   (2)

for every σ, ω

◮ (2) is equivalent to γ(ω_0 | · ) continuous in the product topology
◮ Gibbs specifications are, in addition, strongly non-null

A probability measure µ is a quasilocal (Gibbs) measure if it is consistent with some quasilocal (Gibbs) specification.
The signal µ is thought of as non-causal, or with anticipation
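A minimal sketch of the single-site kernel of a specification, under a nearest-neighbour (Ising-type) assumption on A = {−1, +1}: the kernel γ(ω_0 | ·) depends on the configuration outside the origin only through the two neighbours, so it is local, hence trivially quasilocal, and strongly non-null for every finite β. The form of `gamma` and the parameter `beta` are illustrative assumptions, not objects from the talk.

```python
import math

# Single-site kernel of a nearest-neighbour specification on A = {-1, +1}.
# gamma(w0 | rest of the configuration) depends on the outside only through
# the neighbours w_left = w_{-1} and w_right = w_{+1}; beta is an assumed
# inverse-temperature parameter (illustrative Ising-type example).
def gamma(w0, w_left, w_right, beta=0.5):
    weights = {s: math.exp(beta * s * (w_left + w_right)) for s in (-1, 1)}
    z = weights[-1] + weights[1]            # single-site partition function
    return weights[w0] / z
```

Changing the configuration outside the window {−1, 0, 1} leaves this γ unchanged, so condition (2) holds with ε = 0; genuinely infinite-range quasilocal specifications only require the dependence to vanish in the limit.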


SLIDE 21

Introduction g-measures Gibbs measures Comparison

Questions, questions

Are signals best described as processes or as Gibbs measures? Both setups give complementary information:

◮ Processes: ergodicity, coupling, renewal, perfect simulation
◮ Fields: Gibbs theory

Are these setups mathematically equivalent? Is every regular g-measure Gibbs, and vice versa? Which is more efficient: one- or two-sided conditioning? Efficiency vs. interpretation?


SLIDE 26

Introduction g-measures Gibbs measures History

Prehistory

◮ Onicescu-Mihoc (1935): chains with complete connections
  ◮ Existence of limit measures in non-null cases
  ◮ → random systems with complete connections (book by Iosifescu and Grigorescu, Cambridge 1990)

◮ Doeblin-Fortet (1937):
  ◮ Taxonomy: A or B, depending on continuity and non-nullness
  ◮ Existence of invariant measures
  ◮ Suggested: uniqueness of invariant measures (coupling!). Completed by Iosifescu (1992)

◮ Harris (1955): chains of infinite order
  ◮ Framework of D-ary expansions
  ◮ Weaker uniqueness condition
  ◮ Cut-and-paste coupling


SLIDE 29

Introduction g-measures Gibbs measures History

More recent history

◮ Keane (1972): g-measures (g-functions), existence and uniqueness
◮ Ledrappier (1974): variational principle
◮ Walters (1975): relation with transfer operator theory
◮ Lalley (1986): list processes, regeneration, uniqueness
◮ Berbee (1987): uniqueness
◮ Kalikow (1990):
  ◮ random Markov processes
  ◮ uniform martingales
◮ Berger, Bramson, Bressaud, Comets, Dooley, F, Ferrari, Galves, Grigorescu, Hoffman, Hulse, Iosifescu, Johansson, Lacroix, Maillard, Öberg, Pollicott, Quas, Stenflo, Sidoravicius, Theodorescu, . . .


SLIDE 36

Introduction g-measures Gibbs measures Differences with Markov

Differences with Markov: Invariance

◮ Invariant measures live on the space of trajectories (not just on A):

  µ(x_0) = Σ_y g(x_0 | y) µ(y)   −→   µ(x_0) = ∫ g(x_0 | x_{−∞}^{−1}) µ(dx_{−∞}^{−1})

◮ Conditioning is on measure-zero events: {X_{−∞}^{−1} = x_{−∞}^{−1}}
◮ Importance of “µ-almost surely”
◮ Properties must be essential = survive measure-zero changes
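In the Markov (memory-one) special case the trajectory-space consistency equation collapses back to stationarity of a probability vector, which can be checked directly. A minimal sketch on A = {0, 1}; the kernel entries in `g` are an arbitrary illustrative choice, not numbers from the talk.

```python
# Memory-one special case: mu(x0) = sum_y g(x0 | y) mu(y) is plain
# stationarity of a probability vector.  The kernel entries below are
# an arbitrary illustrative choice; each column g(. | y) sums to 1.
g = {(0, 0): 0.9, (1, 0): 0.1,
     (0, 1): 0.4, (1, 1): 0.6}

def stationary(g, iters=200):
    """Find the invariant vector by power iteration of the kernel."""
    mu = {0: 0.5, 1: 0.5}
    for _ in range(iters):
        mu = {x0: sum(g[(x0, y)] * mu[y] for y in (0, 1)) for x0 in (0, 1)}
    return mu
```

For this kernel the fixed point is µ = (0.8, 0.2). In the infinite-order case the sum over y becomes an integral over whole pasts, and conditioning on any single past is a measure-zero event — which is exactly the shift in viewpoint the slide describes.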


SLIDE 38

Introduction g-measures Gibbs measures Differences with Markov

Differences with Markov: Phase diagrams

There may be several invariant measures:

◮ Not due to lack of ergodicity (non-null transitions)
◮ Different histories can lead to different invariant measures
◮ Analogous to statistical mechanics: many invariant measures = 1st-order phase transitions

The issues are then similar to those of stat mech:

◮ How many invariant measures? (= phase diagrams)
◮ Properties of the measures? (mixing, extremality, ergodicity)
◮ Uniqueness criteria
◮ Simulation?


SLIDE 46

Introduction g-measures Gibbs measures Formal definitions

Transition probabilities

Basic structure:

◮ Space A^Z with product σ-algebra F (and product topology)
◮ For Λ ⊂ Z, F_Λ = {events depending on ω_Λ} ⊂ F

Definition
(i) A family of transition probabilities is a measurable function g(· | ·) : A × A_{−∞}^{−1} → [0, 1] such that Σ_{x_0 ∈ A} g(x_0 | x_{−∞}^{−1}) = 1
(ii) µ is a process consistent with g(· | ·) if µ({x_0}) = ∫ g(x_0 | y_{−∞}^{−1}) µ(dy)
SLIDE 49

Introduction g-measures Gibbs measures General results

General results (no hypotheses on g)

Let

◮ G(g) = {µ consistent with g}
◮ F_{−∞} := ∩_{k∈Z} F_{(−∞,k]} (tail σ-algebra)

Theorem
(a) G(g) is a convex set
(b) µ is extreme in G(g) iff µ is trivial on F_{−∞} (µ(A) = 0 or 1 for A ∈ F_{−∞})
(c) µ is extreme in G(g) iff lim_{Λ↑Z} sup_{B ∈ F_{Λ^c}} | µ(A ∩ B) − µ(A)µ(B) | = 0 for all A ∈ F
(d) Each µ ∈ G(g) is determined by its restriction to F_{−∞}
(e) µ ≠ ν extreme in G(g) ⟹ µ and ν are mutually singular on F_{−∞}


SLIDE 55

Introduction g-measures Gibbs measures General results

Construction through limits

Let g^{[m,n]} be the “window transition probabilities”

  g^{[m,n]}(x_m^n | x_{−∞}^{m−1}) := g(x_n | x_{−∞}^{n−1}) g(x_{n−1} | x_{−∞}^{n−2}) · · · g(x_m | x_{−∞}^{m−1})

Theorem
If µ is extreme in G(g), then for µ-almost all y ∈ A^Z,

  g^{[−ℓ,ℓ]}(x_m^n | y_{−∞}^{−ℓ−1}) −→ µ({x_m^n}) as ℓ → ∞, for all x_m^n ∈ A^{[m,n]}

(no hypotheses on g)

SLIDE 57

Introduction g-measures Gibbs measures General results

Regular g-measures

Definition
A measure µ on A^Z is regular (continuous) if it is consistent with regular transition probabilities.

Theorem (Palmer, Parry and Walters (1977))
µ is a regular g-measure if and only if the sequence µ(ω_0 | ω_{−n}^{−1}) converges uniformly in ω as n → ∞.

Theorem
If g is regular (continuous), then every limit lim_j g^{[−ℓ_j, ℓ_j]}(· | y_{−∞}^{−ℓ_j−1}) defines a g-measure.
slide-60
SLIDE 60

Introduction g-measures Gibbs measures Uniqueness

Continuity rates

Uniqueness conditions: continuity and non-nullness hypotheses

◮ The continuity rate of g:

var_k(g) := sup_{x,y} | g(x_0 | x_{−∞}^{−1}) − g(x_0 | x_{−k}^{−1} y_{−∞}^{−k−1}) |

◮ The log-continuity rate of g:

var_k(log g) := sup_{x,y} | log [ g(x_0 | x_{−∞}^{−1}) / g(x_0 | x_{−k}^{−1} y_{−∞}^{−k−1}) ] |

◮ The ∆-rate of g:

∆_k(g) := inf_{x,y} Σ_{x_0} g(x_0 | x_{−∞}^{−1}) ∧ g(x_0 | x_{−k}^{−1} y_{−∞}^{−k−1})
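These rates become finite maxima once the memory is truncated, so they can be computed by brute force. A sketch, assuming a hypothetical g with geometrically decaying dependence on the past (both the form of `g1` and the truncation depth L are illustrative):

```python
import itertools

L = 8  # truncation depth: this g1 depends on the last L symbols only

def g1(past):
    # Hypothetical g: g(1 | omega) = 1/2 + (1/4) * sum_j 2^-(j+1) * (+/-1),
    # with past[0] playing the role of omega_{-1}
    return 0.5 + 0.25 * sum(2.0 ** -(j + 1) * (1 if past[j] else -1)
                            for j in range(L))

def var_k(k):
    """Sup over pasts that agree on the k most recent symbols."""
    sup = 0.0
    for head in itertools.product([0, 1], repeat=k):            # agreed recent past
        for ta in itertools.product([0, 1], repeat=L - k):      # one distant past
            for tb in itertools.product([0, 1], repeat=L - k):  # another one
                sup = max(sup, abs(g1(head + ta) - g1(head + tb)))
    return sup

for k in range(4):
    print(k, var_k(k))  # decays geometrically, mirroring the 2^-(j+1) weights
```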

slide-64
SLIDE 64

Introduction g-measures Gibbs measures Uniqueness

Non-nullness hypotheses

◮ g is weakly non-null if

Σ_{x_0} inf_y g(x_0 | y_{−∞}^{−1}) > 0

◮ g is (strongly) non-null if

inf_{x_0, y} g(x_0 | y_{−∞}^{−1}) > 0

[Doeblin–Fortet:
◮ Chain of type A: for g continuous and weakly non-null
◮ Chain of type B: for g log-continuous and non-null]

slide-67
SLIDE 67

Introduction g-measures Gibbs measures Criteria

Uniqueness criteria (selected)

◮ Doeblin–Fortet (1937 + Iosifescu, 1992): g non-null and

Σ_k var_k(g) < ∞

◮ Harris (1955): g weakly non-null and

Σ_{n≥1} ∏_{k=1}^{n} [ 1 − (|E|/2) var_k(g) ] = +∞

◮ Berbee (1987): g non-null and

Σ_{n≥1} exp( − Σ_{k=1}^{n} var_k(log g) ) = +∞
slide-70
SLIDE 70

Introduction g-measures Gibbs measures Criteria

Uniqueness criteria (cont.)

◮ Stenflo (2003): g non-null and

Σ_{n≥1} ∏_{k=1}^{n} ∆_k(g) = +∞

◮ Johansson and Öberg (2002): g non-null and

Σ_{k≥1} var_k^2(log g) < +∞

slide-73
SLIDE 73

Introduction g-measures Gibbs measures Criteria

Comments

Leaving non-nullness aside, the criteria are not fully comparable. Rough comparison of the borderline continuity rates:

◮ Doeblin–Fortet: var_k ∼ 1/k^{1+δ}
◮ Harris–Stenflo: var_k ∼ 1/k
◮ Johansson–Öberg: var_k ∼ 1/k^{1/2+δ}
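This rough comparison can be illustrated with partial sums for var_k = k^(−α). A small sketch (only suggestive, since convergence or divergence is an asymptotic statement and the cutoff N is arbitrary):

```python
# Doeblin-Fortet needs sum var_k < infinity; Johansson-Oberg needs
# square summability, i.e. sum var_k^2 < infinity.
def partial_sums(alpha, N=200000):
    df = sum(k ** -alpha for k in range(1, N + 1))        # Doeblin-Fortet sum
    jo = sum(k ** (-2 * alpha) for k in range(1, N + 1))  # Johansson-Oberg sum
    return df, jo

df_slow, jo_slow = partial_sums(0.75)  # alpha = 3/4: DF sum grows without bound
df_fast, jo_fast = partial_sums(1.25)  # alpha = 5/4: both sums converge
print(round(df_slow, 1), round(jo_slow, 3), round(df_fast, 3), round(jo_fast, 3))
```

For 1/2 < α ≤ 1 only the square-summability side of the comparison remains available, which is what makes the Johansson–Öberg criterion strictly wider here.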

slide-75
SLIDE 75

Introduction g-measures Gibbs measures Criteria

Criterion of a different species

Let

δ_j(g) := sup_{x = y off j} | g(x_0 | x_{−∞}^{−1}) − g(x_0 | y_{−∞}^{−1}) |

(the sup runs over pasts x, y that agree everywhere except at site j). Then (F–Maillard, 2005) there is a unique consistent chain if

Σ_{j<0} δ_j(g) < 1

◮ One-sided version of the Dobrushin condition in stat. mech.
◮ This criterion is not comparable with the preceding ones
◮ In particular, no non-nullness requirement!
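The oscillations δ_j are easy to evaluate for a toy chain. A brute-force sketch, assuming a hypothetical g whose single-site influences decay geometrically (truncated to L past symbols; the example is purely illustrative):

```python
import itertools

L = 8  # truncation depth

def g1(past):
    # Hypothetical g: g(1 | omega) = 1/2 + (1/4) * sum_j 2^-(j+1) * (+/-1),
    # with past[0] playing the role of omega_{-1}
    return 0.5 + 0.25 * sum(2.0 ** -(j + 1) * (1 if past[j] else -1)
                            for j in range(L))

def delta(j):
    """Sup of |g1(x) - g1(y)| over pasts x, y equal everywhere except position j."""
    sup = 0.0
    for rest in itertools.product([0, 1], repeat=L - 1):
        a = rest[:j] + (0,) + rest[j:]
        b = rest[:j] + (1,) + rest[j:]
        sup = max(sup, abs(g1(a) - g1(b)))
    return sup

total = sum(delta(j) for j in range(L))
print(round(total, 6))  # < 1: this toy g satisfies the one-sided Dobrushin condition
```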

slide-80
SLIDE 80

Introduction g-measures Gibbs measures Non-uniqueness

Examples of non-uniqueness

◮ First example: Bramson and Kalikow (1993):

var_k(g) ≥ C / log |k|

◮ Berger, Hoffman and Sidoravicius (2003): the Johansson–Öberg criterion is sharp: for all ε > 0 there exists g with

Σ_{k<0} var_k^{2+ε}(g) < ∞ and |G(g)| > 1

◮ Hulse (2006): the one-sided Dobrushin criterion is sharp: for all ε > 0 there exists g with

Σ_{k<0} δ_k(g) = 1 + ε and |G(g)| > 1

slide-83
SLIDE 83

Introduction g-measures Gibbs measures History

Gibbs measures: Historic highlights

Prehistory:

◮ Boltzmann, Maxwell (kinetic theory): Probability weights
◮ Gibbs: Geometry of phase diagrams

History:

◮ Dobrushin (1968), Lanford and Ruelle (1969): Conditional expectations
◮ Preston (1973): Specifications
◮ Kozlov (1974), Sullivan (1973): Quasilocality and Gibbsianness

slide-85
SLIDE 85

Introduction g-measures Gibbs measures Statistical mechanics motivation

Equilibrium

Issue: Given microscopic behavior in finite regions, determine the macroscopic behavior

Basic tenets:
(i) Equilibrium = probability measure
(ii) Finite regions = finite parts of an infinite system
(iii) Exterior of a finite region = frozen external condition
(iv) Macroscopic behavior = limit of infinite regions

slide-88
SLIDE 88

Introduction g-measures Gibbs measures Statistical mechanics motivation

Equilibrium = Probability kernels

Set-up: Product space Ω = A^L. A system in Λ ⋐ L is described by a probability kernel γ_Λ( · | · ):

γ_Λ(f | ω) = equilibrium value of f when the configuration outside Λ is ω

Equilibrium in Λ = equilibrium in every Λ′ ⊂ Λ: the equilibrium value of f in Λ equals the expectation of the Λ′-equilibrium value, with Λ \ Λ′ distributed according to the Λ-equilibrium:

γ_Λ(f | ω) = γ_Λ( γ_{Λ′}(f | · ) | ω )   (Λ′ ⊂ Λ ⋐ L)
slide-91
SLIDE 91

Introduction g-measures Gibbs measures Statistical mechanics motivation

Specifications

Definition
A specification is a family γ = {γ_Λ : Λ ⋐ L} of probability kernels γ_Λ : F × Ω −→ [0, 1] such that
(i) External dependence: γ_Λ(f | · ) is F_{Λc}-measurable
(ii) Frozen external conditions: each γ_Λ is proper, i.e. γ_Λ(h f | ω) = h(ω) γ_Λ(f | ω) if h depends only on ω_{Λc}
(iii) Equilibrium in finite regions: the family γ is consistent, γ_∆ γ_Λ = γ_∆ if ∆ ⊃ Λ

slide-94
SLIDE 94

Introduction g-measures Gibbs measures Statistical mechanics motivation

Consistency

Definition
A probability measure µ on Ω is consistent with γ if µ γ_Λ = µ for each Λ ⋐ L (DLR equations = equilibrium in infinite regions)

Remarks

◮ Several consistent measures = first-order phase transition
◮ Specification ∼ system of regular conditional probabilities
◮ Difference: no a priori measure, hence the conditions are required for all ω rather than almost surely
◮ Stat. mech.: conditional probabilities −→ measures

slide-98
SLIDE 98

Introduction g-measures Gibbs measures General results

General results (no hypotheses on γ)

Let

◮ G(γ) := { µ consistent with γ }
◮ F_∞ := ∩_{Λ⋐L} F_{Λc} (σ-algebra at infinity)

Theorem
(a) G(γ) is a convex set
(b) µ is extreme in G(γ) iff µ is trivial on F_∞ (µ(A) ∈ {0, 1} for A ∈ F_∞)
(c) µ is extreme in G(γ) iff

lim_{Λ↑L} sup_{B∈F_{Λc}} | µ(A ∩ B) − µ(A) µ(B) | = 0 , ∀A ∈ F

(d) Each µ ∈ G(γ) is determined by its restriction to F_∞
(e) µ ≠ ν extreme in G(γ) =⇒ µ and ν are mutually singular on F_∞

slide-104
SLIDE 104

Introduction g-measures Gibbs measures General results

Construction through limits

Theorem
If µ is extreme in G(γ), then for µ-almost all σ ∈ Ω,

γ_∆(ω_Λ | σ_{∆c}) −→ µ({ω_Λ}) as ∆ → L,

for all ω ∈ Ω (no hypotheses on γ)
slide-105
SLIDE 105

Introduction g-measures Gibbs measures General results

Quasilocality

Definition
A measure µ on A^L is quasilocal (continuous) if it is consistent with a quasilocal specification

Theorem
µ is quasilocal if and only if the sequence µ(ω_0 | ω_{−n}^{−1} ω_1^m) converges uniformly in ω as n, m → ∞

Theorem
If γ is quasilocal, then every lim_j γ_{Λ_j}( · | σ_{Λ_j^c} ), with Λ_j → L, defines a consistent measure.

slide-108
SLIDE 108

Introduction g-measures Gibbs measures Stat Mech

Link with statistical mechanics

Definition
A specification γ is

◮ non-null if inf_σ γ_Λ(ω_Λ | σ_{Λc}) > 0 for ω ∈ Ω, Λ ⋐ L
◮ Gibbs if it is quasilocal and non-null

Theorem (Kozlov)
A specification is Gibbsian iff it has the Boltzmann form

γ_Λ(ω_Λ | ω_{Λc}) = exp( Σ_{A∩Λ≠∅} φ_A(ω_A) ) / Norm. ,

where the interaction {φ_A} satisfies Σ_{A∋0} ‖φ_A‖_∞ < ∞ .
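The Boltzmann form becomes concrete for the simplest interaction. A minimal sketch, assuming a hypothetical nearest-neighbour pair interaction φ_{i,i+1}(ω) = J ω_i ω_{i+1} on spins {−1, +1} (the coupling J, the window, and the boundary are illustrative choices):

```python
import itertools
import math

J = 0.7  # hypothetical coupling

def gamma(window, omega_in, boundary):
    """Boltzmann kernel: exp(sum of pair terms touching the window) / Norm."""
    def weight(cfg):
        full = dict(boundary)
        full.update(dict(zip(window, cfg)))
        # bonds {i, i+1} with at least one endpoint inside the window
        bonds = {(i, i + 1) for i in window} | {(i - 1, i) for i in window}
        return math.exp(sum(J * full[a] * full[b] for a, b in bonds
                            if a in full and b in full))
    norm = sum(weight(c) for c in itertools.product([-1, 1], repeat=len(window)))
    return weight(omega_in) / norm

window, boundary = (1, 2, 3), {0: 1, 4: -1}
total = sum(gamma(window, c, boundary)
            for c in itertools.product([-1, 1], repeat=3))
print(round(total, 10))  # normalization: the kernel is a probability measure
```

The frozen boundary spins enter only through the bonds that cross the window's edge, which is exactly the properness condition of the specification.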

slide-110
SLIDE 110

Introduction g-measures Gibbs measures Phase transitions

Uniqueness and non-uniqueness

Uniqueness results

◮ Berbee: Σ_{n≥1} exp( − Σ_{k=1}^{n} var_k(log γ) ) = +∞
◮ Dobrushin: Σ_j δ_j(γ) < 1

Non-uniqueness results

◮ Fifty years of rigorous stat mech
◮ Markov models: Non-uniqueness in two or more dimensions

slide-112
SLIDE 112

Issues Positive Negative Other

Signal description: Process or Gibbs?

  • II. Relation between approaches

Contributors:

  • S. Berghout (Leiden)
  • A. van Enter (Groningen)
  • S. Gallo (São Carlos)
  • G. Maillard (Aix-Marseille)
  • E. Verbitskiy (Leiden)

Florence in May, 2017

slide-113
SLIDE 113

Issues Positive Negative Other The questions

The issues

(I) Given a measure µ on A^Z:

◮ Is it always both a g-measure and a Gibbs measure?
◮ If yes, what are the pros and cons of each point of view?

(II) Are g-functions and specifications in correspondence?

◮ Same uniqueness regions?
◮ Same phase diagrams?

(III) Can theoretical aspects be "imported"?

◮ Variational approach
◮ Large deviations

slide-116
SLIDE 116

Issues Positive Negative Other The maps

Mathematical formalization

Mathematically, there are three natural questions:

(Q1) Is there a map b : g −→ γ^g such that G(g) = G(γ^g)?
(Q2) Is there a map c : γ −→ g^γ such that G(γ) = G(g^γ)?
(Q3) If so, are these maps mutual inverses: b ∘ c = id = c ∘ b (γ^{g^γ} = γ , g^{γ^g} = g)?

True for Markov chains (A finite) [Georgii, Chapter 3, uses eigenvalues]

slide-120
SLIDE 120

Issues Positive Negative Other Positive answer to (Q1)

Construction of the map b

How would you construct a map b : g −→ γ^g? Natural answer:

γ^g_{[k,ℓ]}(ω_k^ℓ | σ) := lim_{n→∞} g_{[k,n]}(ω_k^ℓ σ_{ℓ+1}^n | σ_{−∞}^{k−1}) / g_{[k,n]}(σ_{ℓ+1}^n | σ_{−∞}^{k−1})

Need to guarantee that the limit exists for all σ

Definition
A g-function has good future (GF) if

◮ g is non-null and
◮ Σ_j δ_j(g) < ∞

slide-124
SLIDE 124

Issues Positive Negative Other Positive answer to (Q1)

Denote

◮ Θ_GF := { g : g has GF }
◮ Π := { γ quasilocal }
◮ Π_1 := { γ : |G(γ)| = 1 }

Theorem (g −→ specification)
The previous prescription defines a map b : Θ_GF → Π, g ↦ γ^g, which satisfies
(a) G(g) ⊂ G(γ^g)
(b) b restricted to b^{−1}(Π_1) is one-to-one.
Thus, if g ∈ b^{−1}(Π_1), then G(g) = G(γ^g) = {µ_g}

slide-127
SLIDE 127

Issues Positive Negative Other Positive answer to (Q2)

Construction of the map c

The natural prescription is

g^γ(ω_0 | σ_{−∞}^{−1}) := lim_{n→∞} γ_{[0,n]}(ω_0 | σ_{−∞}^{−1} ξ_{n+1}^{∞})

provided that, for each σ,

◮ the limit exists and
◮ the limit is independent of ξ

Denote

◮ Θ_HUC := { g : Σ_j δ_j(g) < 1 }
◮ Π_HUC := { γ : Σ_j δ_j(γ) < 1 }

The Dobrushin condition provides hereditary uniqueness: uniqueness on each (infinite) Λ for any σ_{Λc}
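For a concrete specification the limit defining g^γ can be watched stabilising. A brute-force sketch, assuming a hypothetical nearest-neighbour Ising specification with coupling J (illustrative; the point is only that the dependence on the future boundary ξ dies out as n grows):

```python
import itertools
import math

J = 0.7  # hypothetical coupling

def gamma_block(omega0, left, right, n):
    """Marginal gamma_[0,n](omega_0 | left spin at site -1, right spin at n+1)."""
    def weight(cfg):
        spins = (left,) + cfg + (right,)
        return math.exp(J * sum(spins[i] * spins[i + 1]
                                for i in range(len(spins) - 1)))
    num = sum(weight((omega0,) + rest)
              for rest in itertools.product([-1, 1], repeat=n))
    den = sum(weight((o,) + rest) for o in (-1, 1)
              for rest in itertools.product([-1, 1], repeat=n))
    return num / den

for xi in (-1, 1):  # the future boundary condition xi becomes irrelevant
    print(xi, [round(gamma_block(1, -1, xi, n), 6) for n in (2, 6, 10)])
```

The two rows approach a common value, which is the g^γ(1 | σ) the prescription extracts.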

slide-131
SLIDE 131

Issues Positive Negative Other Positive answer to (Q2)

Theorem (specification −→ g)
The previous prescription defines a map c : Π_HUC → Θ_HUC, γ ↦ g^γ, which satisfies
(a) G(g^γ) = G(γ) = {µ_γ}
(b) The map c is one-to-one.

slide-132
SLIDE 132

Issues Positive Negative Other Positive answer to (Q3)

Invertibility of the maps

The proofs of the previous theorems yield bounds on δ_j(γ^g) and δ_j(g^γ). Denote

◮ Θ_EXP := { g : ∃ a > 1 s.t. lim_{j→−∞} a^{|j|} δ_j(g) = 0 }
◮ Π_EXP := { γ : ∃ a > 1 s.t. lim_{j→∞} a^{j} δ_j(γ) = 0 }

Theorem (g ←→ specification)
(a) b ∘ c = Id over c^{−1}(Θ_GF), and G(g^γ) = G(γ) = {µ_γ}
(b) c ∘ b = Id over b^{−1}(Π_HUC), and G(γ^g) = G(g) = {µ_g}
(c) b and c establish a one-to-one correspondence between Θ_EXP and Π_EXP that preserves the consistent measure.

slide-135
SLIDE 135

Issues Positive Negative Other Negative answer to (Q1)

A regular g that is not Gibbs

A = {0, 1}; denote ω = ω_{−∞}^{−1}

Consider g-functions of the form g(1 | ω) = p_{ℓ(ω)} where

◮ ℓ(ω) = number of 0's before the first 1, looking backwards:

  ℓ(ω) = min{ j ≥ 0 : ω_{−j−1} = 1 }

◮ {p_i}_{i≥0} ⊂ (0, 1) satisfy

  inf_{i≥0} p_i = ǫ > 0,   p_∞ = lim_{i→∞} p_i exists.
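For concreteness, ℓ and g are straightforward to implement; a minimal Python sketch, where the particular sequence p_i is an illustrative assumption (any sequence in (0, 1) with a positive infimum and a limit at infinity fits the definition above):

```python
def ell(omega):
    # ell(omega) = min{ j >= 0 : omega_{-j-1} = 1 }:
    # the number of 0's seen before the first 1, looking backwards.
    # omega is a (finite, for illustration) past; omega[-1] is the most recent symbol.
    for j in range(len(omega)):
        if omega[-j - 1] == 1:
            return j
    raise ValueError("no 1 in the finite past")

def g(symbol, omega, p):
    # g(1 | omega) = p_{ell(omega)}; g(0 | omega) is the complement.
    prob_one = p(ell(omega))
    return prob_one if symbol == 1 else 1 - prob_one

# illustrative choice: inf p_i = 0.3 > 0 and p_i -> p_inf = 1/2
p = lambda i: 0.5 + (-1) ** i / (i + 4)

past = [1, 0, 0, 0]        # three 0's since the last 1
# g(1 | past) = p_3
```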

slide-137
SLIDE 137

Issues Positive Negative Other Negative answer to (Q1)

Regularity

Non-nullness: g( · | · ) ≥ ǫ ∧ (1 − ǫ)

Continuity:

  sup_{ω_{−k}^{−1} = σ_{−k}^{−1}} | g( 1 | ω ) − g( 1 | σ ) |
    = sup | g( 1 | 0_{−k}^{−1} ω_{−∞}^{−k−1} ) − g( 1 | 0_{−k}^{−1} σ_{−∞}^{−k−1} ) |
    = sup_{l,m ≥ k} | p_l − p_m |  −→ 0  as k → ∞


slide-139
SLIDE 139

Issues Positive Negative Other Negative answer to (Q1)

Properties of the process

For all choices of sequences {p_i}_{i≥0} as above:

◮ There exists a unique stationary chain µ compatible with g
◮ µ is supported on configurations with infinitely many 1's, separated by intervals of 0's
◮ µ is a renewal chain with visible renewals
◮ µ can be perfectly simulated

For all practical purposes, these chains are as regular as they can be. Nevertheless, for some choices of p_i the chains are not Gibbsian. Cause: a problem when conditioning on the "all 0" configuration
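Since every 1 is a visible renewal, the chain can be sampled forward one symbol at a time from a planted renewal; a sketch (plain forward sampling, not the perfect-simulation algorithm alluded to above, and the sequence p_i is again an illustrative choice):

```python
import random

def simulate(p, n, seed=0):
    # Sample n symbols given a planted renewal (a 1) at time -1.
    # At each step, P(emit 1) = p(number of 0's since the last 1),
    # which is exactly g(1 | past) for this model.
    rng = random.Random(seed)
    zeros_since_one = 0
    out = []
    for _ in range(n):
        if rng.random() < p(zeros_since_one):
            out.append(1)
            zeros_since_one = 0
        else:
            out.append(0)
            zeros_since_one += 1
    return out

p = lambda i: 0.5 + (-1) ** i / (i + 4)      # illustrative, inf p_i = 0.3
sample = simulate(p, 10_000)
# inf p_i > 0 dominates the gaps between 1's geometrically, so 1's keep
# recurring: the renewal structure is plainly visible in the sample
```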

slide-142
SLIDE 142

Issues Positive Negative Other Negative answer to (Q1)

Main result

Theorem. There exist choices of {p_i}_{i≥0} as above for which the sequences

  µ( X_0 = ω_0 | X_{−i−1} = 1, X_{−i}^{−1} = 0_{−i}^{−1}, X_1^{j} = 0_1^{j}, X_{j+1} = 1 ),   i, j ≥ 1

do not converge as i, j → ∞. In particular, µ( 0 | · ) is essentially discontinuous at ω = 0_{−∞}^{+∞}

slide-144
SLIDE 144

Issues Positive Negative Other Proof of no Q1

Proof of main result

It is based on the following

Claim. µ( X_0 = ω_0 | X_{−i−1} = 1, X_{−i}^{−1} = 0_{−i}^{−1}, X_1^{j} = 0_1^{j}, X_{j+1} = 1 ) is determined by the ratio

  ∏_{k=0}^{j−1} (1 − p_k)/(1 − p_{k+i})

Thus, discontinuity at 0_{−∞}^{+∞} ⟺ choice of p_k s.t. this ratio oscillates

slide-146
SLIDE 146

Issues Positive Negative Other Proof of no Q1

Proof (cont.)

Economical way: define p_k = 1 − (1 − p_∞) ξ^{v_k}, so that

  ∏_{k=0}^{j−1} (1 − p_k)/(1 − p_{k+i}) = ξ^{ Σ_{k=0}^{j−1} (v_k − v_{k+i}) }

Choose v_k → 0, but such that Σ_{k=0}^{j} v_k oscillates in j.

Example: ξ ∈ (1, (1 − p_∞)^{−2}) and v_k = (−1)^{r_k}/r_k with

  r_k = inf{ i ≥ 1 : Σ_{j=1}^{i} j ≥ k + 1 }

First terms: −1, 1/2, 1/2, −1/3, −1/3, −1/3, 1/4, 1/4, 1/4, 1/4, . . .
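The oscillation of the partial sums Σ v_k can be verified with exact arithmetic; a quick check of the example above:

```python
from fractions import Fraction

def r(k):
    # r_k = inf{ i >= 1 : 1 + 2 + ... + i >= k + 1 }
    i, total = 1, 1
    while total < k + 1:
        i += 1
        total += i
    return i

def v(k):
    # v_k = (-1)^{r_k} / r_k; block r repeats the value (-1)^r / r exactly r times
    return Fraction((-1) ** r(k), r(k))

first = [v(k) for k in range(6)]
# -1, 1/2, 1/2, -1/3, -1/3, -1/3, ... as on the slide

# partial sums taken at the end of each block: block r contributes
# r * (-1)^r / r = (-1)^r, so the sums alternate between -1 and 0
sums, S, k = [], Fraction(0), 0
for block in range(1, 21):
    while k < block * (block + 1) // 2:
        S += v(k)
        k += 1
    sums.append(S)
# v_k -> 0, yet sum v_k oscillates; hence xi ** (sum ...) oscillates as well
```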

slide-148
SLIDE 148

Issues Positive Negative Other Proof of no Q1

Proof of the claim

µ( X_{−i−1} = 1, X_{−i}^{j} = 0_{−i}^{j}, X_{j+1} = 1 )
  = µ( X_{−i−1} = 1 ) µ( X_{−i}^{j} = 0_{−i}^{j}, X_{j+1} = 1 | X_{−i−1} = 1 )
  = µ( X_{−i−1} = 1 ) ∏_{k=0}^{i+j} (1 − p_k) · p_{i+j+1}

Analogously,

µ( X_{−i−1} = 1, X_{−i}^{−1} = 0_{−i}^{−1}, X_0 = 1, X_1^{j} = 0_1^{j}, X_{j+1} = 1 )
  = µ( X_{−i−1} = 1 ) ∏_{k=0}^{i−1} (1 − p_k) p_i · ∏_{k=0}^{j−1} (1 − p_k) p_j

slide-149
SLIDE 149

Issues Positive Negative Other Proof of no Q1

Proof of the claim (cont.)

Hence

µ( X_0 = 0 | X_{−i−1} = 1, X_{−i}^{−1} = 0_{−i}^{−1}, X_1^{j} = 0_1^{j}, X_{j+1} = 1 )

  = ∏_{k=0}^{i+j} (1 − p_k) p_{i+j+1} / [ ∏_{k=0}^{i−1} (1 − p_k) p_i ∏_{k=0}^{j−1} (1 − p_k) p_j + ∏_{k=0}^{i+j} (1 − p_k) p_{i+j+1} ]

  = [ 1 + ( p_i p_j / ((1 − p_{i+j}) p_{i+j+1}) ) ∏_{k=0}^{j−1} (1 − p_k)/(1 − p_{k+i}) ]⁻¹

  ∼ [ 1 + ( p_∞ / (1 − p_∞) ) ∏_{k=0}^{j−1} (1 − p_k)/(1 − p_{k+i}) ]⁻¹
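The closed form above, combined with the parametrization p_k = 1 − (1 − p_∞)ξ^{v_k} introduced earlier, can be checked numerically; a sketch in which p_∞, ξ and the indices i, j are illustrative choices:

```python
import math

p_inf, xi = 0.5, 1.5            # xi in (1, (1 - p_inf)**-2) = (1, 4)

def r(k):
    i, total = 1, 1
    while total < k + 1:
        i += 1
        total += i
    return i

def v(k):
    return (-1) ** r(k) / r(k)

def p(k):
    # p_k = 1 - (1 - p_inf) * xi**v_k, the "economical" parametrization
    return 1 - (1 - p_inf) * xi ** v(k)

def ratio(i, j):
    # prod_{k=0}^{j-1} (1 - p_k) / (1 - p_{k+i})
    out = 1.0
    for k in range(j):
        out *= (1 - p(k)) / (1 - p(k + i))
    return out

def mu0(i, j):
    # mu(X_0 = 0 | 1 at -i-1, 0's in between, 1 at j+1), via the closed form
    pref = p(i) * p(j) / ((1 - p(i + j)) * p(i + j + 1))
    return 1 / (1 + pref * ratio(i, j))

# the product collapses to xi ** sum_{k<j} (v_k - v_{k+i}), as claimed
i, j = 7, 25
lhs = ratio(i, j)
rhs = xi ** sum(v(k) - v(k + i) for k in range(j))
```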

slide-150
SLIDE 150

Issues Positive Negative Other Negative answer to (Q2)

A Gibbs that is not regular g

[Bissacot, Endo, van Enter and Le Ny (2017)]

Consider Dyson models:

◮ A = {−1, 1}, L = Z
◮ Specification defined by

  γ_{{0}}( σ_0 | σ_{{0}ᶜ} ) = exp( β Σ_{j∈Z, j≠0} σ_0 σ_j / |j|^α ) / Norm.

for 1 < α < 2. At low temperature there is a phase transition: G(γ) = {µ⁺, µ⁻} with

  µ± = lim_{n→∞} γ_{[−n,n]}( · | ± )

Theorem. Let α∗ = 3 − (log 3 / log 2) ∈ (1, 2). Then, for each α ∈ (α∗, 2), the measures µ± are not regular g-measures at low enough temperatures.
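The single-site Dyson kernel can be evaluated by truncating the interaction sum; a sketch where β, α and the truncation window are illustrative choices (the truncation error on the field is of order window^{1−α}):

```python
import math

def gamma0(sigma0, omega, beta=2.0, alpha=1.5, window=1000):
    # Dyson single-site kernel gamma_{0}(sigma0 | omega), with the sum over
    # j != 0 truncated to 0 < |j| <= window; omega(j) is the conditioning spin.
    field = sum((omega(j) + omega(-j)) / j ** alpha for j in range(1, window + 1))
    w_plus, w_minus = math.exp(beta * field), math.exp(-beta * field)
    w = w_plus if sigma0 == 1 else w_minus
    return w / (w_plus + w_minus)      # normalized over sigma0 in {-1, +1}

plus = lambda j: +1                    # all-plus conditioning
# under all-plus conditioning the kernel strongly favours sigma0 = +1
```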

slide-153
SLIDE 153

Issues Positive Negative Other Sketch of the argument

First ingredient of the argument: Interfaces

Crucial! [Cassandro, Merola, Picco and Rozikov (2014)]

Argument for µ⁺: let α∗ < α < 2 and T low enough. Under Dobrushin boundary conditions

  σ_i = −1 for i ≤ −1,   σ_i = +1 for i ≥ L + 1

an interface develops at I∗ ∼ L/2, such that

◮ Mostly "−1" in [0, I∗) and "+1" on (I∗, L]
◮ The probability of displacing the interface is ∼ e^{−c L^{2−α}}:

  γ_{[0,L]}( |I∗ − (L/2)| > ǫL | −+ ) ≤ f(ǫ) L e^{−c L^{2−α}}   (1)

slide-156
SLIDE 156

Issues Positive Negative Other Sketch of the argument

Second ingredient: Wetting

Flipping the left "−" spins beyond −N has an energy cost of at most

  Σ_{i∈[0,L]} Σ_{j≤−N} 1/|i − j|^α ∼ L/N^{α−1},

negligible w.r.t. the RHS of (1) if N grows superlinearly with L:

  L/N^{α−1} = o(1)   (2)

Consequence: ∃ δ > 0 s.t. for each ǫ,

  µ⁺( ω_i | (−1)_{−N}^{−1} ) ≤ −δ,   i ∈ [0, (1 − ǫ)L/2]   (3)

for L large enough and N as in (2)

slide-158
SLIDE 158

Issues Positive Negative Other Sketch of the argument

Third ingredient: Energy cost of alternating

Alternating spins in [−L1, 0] have an L1-independent energy cost:

  max_ω Σ_{i∈[−L1,−1]} Σ_{j∈[−L1,−1]} (−1)^i ω_j / |i − j|^α ≤ c   (4)

with c independent of L1. From (1), (3) and (4):

  µ⁺( ω_0 | (ω_alt)_{−L1}^{−1} (−1)_{−N−L1}^{−L1−1} ) ≤ −δ   (5)

for L large enough, as long as L/N^{α−1} = o(1) and L1 = o(L).

slide-160
SLIDE 160

Issues Positive Negative Other Sketch of the argument

Conclusion

Analogously, conditioning on "+" in [−N − L1, −L1 − 1]:

  µ⁺( ω_0 | (ω_alt)_{−L1}^{−1} (+1)_{−N−L1}^{−L1−1} ) ≥ δ   (6)

Hence, for L large enough,

  µ⁺( ω_0 | (ω_alt)_{−L1}^{−1} (+1)_{−N−L1}^{−L1−1} ) − µ⁺( ω_0 | (ω_alt)_{−L1}^{−1} (−1)_{−N−L1}^{−L1−1} ) ≥ 2δ

Left-conditioning is not quasilocal (it is discontinuous w.r.t. the past)

slide-162
SLIDE 162

Issues Positive Negative Other Necessary and sufficient conditions

Review of additional issues and results

  • I. When a regular g is Gibbs

Theorem. A regular g-measure is Gibbs iff the sequence

  ∏_{i=1}^{n} g( ω_i | ω_1^{i−1} σ_0 ω_{−∞}^{−1} ) / g( ω_i | ω_1^{i−1} η_0 ω_{−∞}^{−1} )

converges, ∀ σ_0, η_0, uniformly in ω, as n → ∞
slide-163
SLIDE 163

Issues Positive Negative Other Reversible measures

  • II. Reversibility

Relation between left- and right-conditioning?

Definition. A regular g-measure is reversible if it is continuous w.r.t. the future:

  sup_{ω,σ} | µ( ω_0 | σ_1^{n} ω_{n+1}^{∞} ) − µ( ω_0 | σ_1^{∞} ) |  −→ 0  as n → ∞

Theorem. A regular g-measure µ is reversible iff the sequence

  ∏_{i=1}^{n} g( ω_i | ω_{−∞}^{i−1} ) / g( ω_i | ω_1^{i−1} )

converges uniformly in ω, as n → ∞, to a function free of zeros
slide-166
SLIDE 166

Issues Positive Negative Other Known examples

Overview of examples

◮ ∃ non-reversible g-measures (the example is also non-Gibbs)
◮ ∃ reversible g-measures with different left and right continuity rates
◮ The above regular-g but non-Gibbs measure is reversible

slide-167
SLIDE 167

Issues Positive Negative Other Known examples

Overview of examples

◮ ∃ non-reversible measures (example is also non-Gibbs) ◮ ∃ reversible g-measures with different left and right

continuity rates

◮ The above g- but non-Gibbs measure is reversible

slide-168
SLIDE 168

Issues Positive Negative Other Known examples

Overview of examples

◮ ∃ non-reversible measures (example is also non-Gibbs) ◮ ∃ reversible g-measures with different left and right

continuity rates

◮ The above g- but non-Gibbs measure is reversible

slide-169
SLIDE 169

Issues Positive Negative Other

  • III. Singletons vs interval kernels

Transitions vs kernels

Asymmetry in conditional kernels:

◮ g-measures are determined by single-time transitions g( · | ω_{−∞}^{−1} )
◮ Gibbs measures are determined by full specifications { γ_Λ( · | ω_{Λᶜ} ) : Λ ⋐ Z }

To put the approaches on common ground:

◮ g −→ left-interval specifications (LIS)
◮ specifications −→ γ_{{0}} plus order-consistency

slide-171
SLIDE 171

Issues Positive Negative Other III.1 LIS

Left-interval specifications

g-functions admit a specification-like framework. Denote:

◮ J = set of bounded intervals in Z
◮ If Λ = [a, b] ∈ J : m_Λ := b
◮ F_{≤Λ} := F_{(−∞,b]}
◮ F_{Λ−} := F_{(−∞,a−1]}

The iterated-conditioning formula

  g_{[m,n]}( ω_m^n | ω_{−∞}^{m−1} ) = g( ω_m | ω_{−∞}^{m−1} ) g( ω_{m+1} | ω_{−∞}^{m} ) · · · g( ω_n | ω_{−∞}^{n−1} )

defines a family of probability kernels G = { g_Λ : Λ ∈ J } s.t.
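In code, the iterated-conditioning formula is a single left-to-right product of one-symbol transitions; a sketch with a hypothetical one-step-memory g (a Markov kernel), for which the kernel property is easy to sanity-check:

```python
def g_interval(g, word, past):
    # g_{[m,n]}(omega_m^n | omega_{-infty}^{m-1}): multiply the one-symbol
    # transitions, growing the past by one symbol at each step.
    prob, hist = 1.0, list(past)
    for symbol in word:
        prob *= g(symbol, hist)
        hist.append(symbol)
    return prob

def g_markov(symbol, past):
    # hypothetical 1-step-memory g on A = {0, 1}
    p1 = 0.8 if past[-1] == 1 else 0.3     # P(1 | previous symbol)
    return p1 if symbol == 1 else 1 - p1

past = [0, 1]
# probability-kernel property: the values sum to 1 over all words of length 2
total = sum(g_interval(g_markov, [a, b], past) for a in (0, 1) for b in (0, 1))
```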
slide-174
SLIDE 174

Issues Positive Negative Other III.1 LIS

Definition of LIS

(i) Increasing measurability: g_Λ : F_{≤m_Λ} × Ω −→ [0, 1]
(ii) Dependence on the past: g_Λ( f | · ) is F_{Λ−}-measurable
(iii) Properness: for Λ ∈ J and f F_{≤Λ}-measurable, g_Λ( h f | ω ) = h(ω) g_Λ( f | ω ) if h depends only on ω_{Λ−}
(iv) Consistency: for ∆, Λ ∈ J with ∆ ⊃ Λ, g_∆ g_Λ = g_∆ over F_{≤m_Λ}

Properties (i)–(iv): left-interval specification (LIS)

slide-178
SLIDE 178

Issues Positive Negative Other III.1 LIS

Comments

Knowledge of the LIS G is equivalent to knowledge of g; in particular G(G) = G(g): µ g_Λ = µ ∀ Λ ∈ J ⇔ µ g = µ. Observations:

◮ Unlike specifications, the kernels act on different σ-algebras
◮ Kernels are defined only for intervals
◮ Nevertheless the theory for specifications can be adapted
◮ Generalization: L partially ordered (POS)

slide-184
SLIDE 184

Issues Positive Negative Other III.2 Specifications from singletons

From singletons to specifications (general L)

We would like to generate all kernels from the singletons γ_{i}. However, not every family of singletons is admissible: different choices of internal regions lead to compatibility conditions. Let us start with two sites:

◮ The consistency γ_{i,j} = γ_{i,j} γ_{i} implies

  γ_{i,j}( σ_i σ_j | ω ) = γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) γ_{i,j}( σ_j | ω )   (7)

◮ On the other hand, γ_{i,j} = γ_{i,j} γ_{j} implies

  γ_{i,j}( σ_i σ_j | ω ) = γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) γ_{i,j}( σ_i | ω )   (8)
slide-186
SLIDE 186

Issues Positive Negative Other III.2 Specifications from singletons

From (7)–(8):

  γ_{i,j}( σ_i | ω ) = [ γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) / γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) ] γ_{i,j}( σ_j | ω )

Summing over σ_i:

  γ_{i,j}( σ_j | ω ) = [ Σ_{σ_i} γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) / γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) ]⁻¹

Inserting this in (7):

  γ_{i,j}( σ_i σ_j | ω ) = γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) / Σ_{σ_i} [ γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) / γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) ]   (9)
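Formula (9) can be exercised numerically: when the singletons come from a Gibbs specification, (9) must reproduce the Gibbs pair kernel. A sketch with a hypothetical two-site nearest-neighbour Ising interaction, where β and the boundary spins are illustrative choices:

```python
import math

beta = 0.7                                  # illustrative inverse temperature

def single(s, left, right):
    # Ising singleton kernel gamma_{i}(s | neighbour spins), s in {-1, +1}
    w = lambda t: math.exp(beta * t * (left + right))
    return w(s) / (w(+1) + w(-1))

def pair_from_singletons(si, sj, wl, wr):
    # formula (9): gamma_{i,j}(si sj | omega) built from the singletons,
    # with boundary spins wl (left of i) and wr (right of j)
    num = single(si, wl, sj)
    den = sum(single(s, wl, sj) / single(sj, s, wr) for s in (+1, -1))
    return num / den

def pair_gibbs(si, sj, wl, wr):
    # direct two-site Gibbs kernel with the same interaction, for comparison
    w = lambda a, b: math.exp(beta * (wl * a + a * b + b * wr))
    Z = sum(w(a, b) for a in (+1, -1) for b in (+1, -1))
    return w(si, sj) / Z

wl, wr = +1, -1
total = sum(pair_from_singletons(a, b, wl, wr) for a in (+1, -1) for b in (+1, -1))
# (9) returns a probability kernel, and it coincides with the direct Gibbs one
```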

slide-189
SLIDE 189

Issues Positive Negative Other III.2 Specifications from singletons

Order-consistency condition

Using, instead, (8) we similarly arrive at the i ↔ j expression:

  γ_{i,j}( σ_i σ_j | ω ) = γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) / Σ_{σ_j} [ γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) / γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) ]   (10)

RHS of (9) = RHS of (10) ⟹ order-consistency condition:

  γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) / Σ_{σ_i} [ γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) / γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) ]
    = γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) / Σ_{σ_j} [ γ_{j}( σ_j | σ_i ω_{{i}ᶜ} ) / γ_{i}( σ_i | σ_j ω_{{j}ᶜ} ) ]   (11)

slide-191
SLIDE 191

Issues Positive Negative Other III.2 Specifications from singletons

The reconstruction theorem

Further compatibility conditions from other Λ ⋐ L? Miracle! (11) is enough.

Theorem. If (11) holds for all i, j ∈ L, ω ∈ Ω (denominators > 0!), then

◮ ∃ exactly one specification γ with the given single-site kernels, defined by

  γ_{Λ∪Γ}( σ_Λ σ_Γ | ω ) = γ_{Γ}( σ_Γ | σ_Λ ω_{Λᶜ} ) / Σ_{σ_Γ} [ γ_{Γ}( σ_Γ | σ_Λ ω_{Λᶜ} ) / γ_{Λ}( σ_Λ | σ_Γ ω_{Γᶜ} ) ]

◮ Furthermore, such a γ satisfies:
  ◮ G(γ) = { µ : µ γ_{i} = µ ∀ i ∈ L }
  ◮ γ is quasilocal (resp. non-null) iff so are the γ_{i}
slide-194
SLIDE 194

Issues Positive Negative Other III.2 Specifications from singletons

Comments

◮ Consistency condition (11) is automatically satisfied if:

  ◮ the singletons come from a specification; hence the theorem shows that a specification is uniquely determined by its singletons [Georgii's Theorem 1.33]
  ◮ the singletons come from a pre-existing measure µ:

    γ_{i}( ω_i | ω ) = lim_{n→∞} µ(ω_{V_n}) / µ(ω_{V_n \ {i}})

    for an exhausting sequence of volumes {V_n}

◮ Dachian and Nahapetian (2001) provided an alternative construction (weaker non-nullness, stronger order-consistency)
◮ Reconstruction is also possible with very weak non-nullness

slide-197
SLIDE 197

Issues Positive Negative Other Conclusion

Final comments

The general mathematical framework is clear enough:

◮ Gibbs and g have comparable but not identical theories
◮ General theory: partially ordered specifications

What about practical considerations?

◮ In some cases one theory is applicable but not the other
◮ "Numerical" criteria to detect these cases?
◮ If both theories are applicable: "numerical efficiency"?
