Chapter 5: Concentration. The Probabilistic Method, Summer 2020.



slide-1
SLIDE 1

Chapter 5: Concentration

The Probabilistic Method, Summer 2020, Freie Universität Berlin

slide-2
SLIDE 2

Chapter Overview

  • Prove some strong concentration inequalities
  • Improve bounds on Ramsey numbers
  • Study Hamiltonicity and chromatic number of G(n,p)
slide-3
SLIDE 3

§1 Chernoff Bounds

Chapter 5: Concentration The Probabilistic Method

slide-4
SLIDE 4

Domination vs Minimum Degree

Homework exercise

  • Show the bound is tight by considering G(n,p)

Degrees in G(n,p)

  • Degree of a vertex ∼ Bin(n−1, p)
  • ⇒ expected degree is (n−1)p
  • Minimum degree: need to show it is not far from the mean
  • Suppose ℙ(|deg(v) − (n−1)p| ≥ a) < 1/(2n)
  • Union bound ⇒ ℙ(δ(G(n,p)) ≥ (n−1)p − a) > 1/2

Corollary 2.2.5 Let G be an n-vertex graph with δ(G) ≥ δ. Then G has a dominating set S ⊆ V(G) with |S| ≤ ((ln(δ+1)+1)/(δ+1))·n.

slide-5
SLIDE 5

Comparing Bounds

Concentration inequalities

  • Let X ∼ Bin(n−1, p), p ≤ 1/2
  • 𝔼[X] = (n−1)p, Var(X) = (n−1)p(1−p) = Θ(np)
  • Markov: ℙ(X ≥ a) ≤ 𝔼[X]/a
  • ⇒ error probability < 1/(2n) for a = Ω(n²p)
  • Chebyshev: ℙ(|X − 𝔼[X]| ≥ a) ≤ Var(X)/a²
  • ⇒ error probability < 1/(2n) for a = Ω(n√p)
  • Central Limit Theorem: ℙ(|X − 𝔼[X]| ≥ a) ≲ exp(−c·a²/Var(X))
  • ⇒ error probability < 1/(2n) for a = Ω(√(np log n))

slide-6
SLIDE 6

Definition 5.1.1 Let π‘‡π‘œ = σ𝑗=1

π‘œ

π‘Œπ‘—, where the π‘Œπ‘— are independently and uniformly distributed on βˆ’1,1 .

The Problem With CLT

Asymptotics

  • Central Limit Theorem is asymptotic, valid as π‘œ β†’ ∞
  • We would like a quantitative bound for some given π‘œ

Binomial connection

  • We have π‘‡π‘œ ∼ 2 Bin π‘œ,

1 2 βˆ’ π‘œ 2

  • Convenient to translate so mean is zero

Goal

  • Show π‘‡π‘œ is exponentially unlikely to be far from zero
slide-7
SLIDE 7

Theorem 5.1.2 (Symmetric Chernoff Bound) For every a > 0, we have ℙ(S_n ≥ a) ≤ exp(−a²/2n).

Chernoff Bounds

Remarks

  • Concrete bounds for all n, a
  • Symmetry: same bound for ℙ(S_n ≤ −a)
  • Bin(n, 1/2) = (S_n + n)/2
  • ⇒ concentration for binomial random variables

Corollary 5.1.3 For every a > 0, we have ℙ(|Bin(n, 1/2) − n/2| ≥ a) ≤ 2·exp(−2a²/n).

slide-8
SLIDE 8

Proving Chernoff

Proof

  • Exponential conversion
  • For λ > 0: {S_n ≥ a} = {e^{S_n} ≥ e^a} = {e^{λS_n} ≥ e^{λa}}
  • Concentration
  • e^{λS_n} is a non-negative random variable
  • Markov: ℙ(e^{λS_n} ≥ e^{λa}) ≤ 𝔼[e^{λS_n}]·e^{−λa}
  • Expectation
  • Recall S_n = Σ_{i=1}^n X_i
  • ⇒ e^{λS_n} = Π_{i=1}^n e^{λX_i}
  • Independence ⇒ 𝔼[e^{λS_n}] = Π_{i=1}^n 𝔼[e^{λX_i}] = ((e^λ + e^{−λ})/2)^n = cosh^n(λ)

Theorem 5.1.2 (Symmetric Chernoff Bound) For every a > 0, we have ℙ(S_n ≥ a) ≤ exp(−a²/2n).

slide-9
SLIDE 9

Over the Cosh

Recall

  • ℙ(S_n ≥ a) ≤ 𝔼[e^{λS_n}]·e^{−λa}
  • 𝔼[e^{λS_n}] = cosh^n(λ)

A little calculus

  • cosh(x) = (e^x + e^{−x})/2
  • Taylor series: e^x = 1 + x + x²/2 + x³/6 + x⁴/24 + ⋯
  • ⇒ cosh(x) = 1 + x²/2 + x⁴/24 + x⁶/720 + ⋯ ≤ 1 + x²/2 + x⁴/8 + x⁶/48 + ⋯ = e^{x²/2}

Finishing the proof

  • ∴ ℙ(S_n ≥ a) ≤ exp(nλ²/2 − λa)
  • Minimise: λ = a/n ⇒ ℙ(S_n ≥ a) ≤ exp(−a²/2n)

∎
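The finished bound can be sanity-checked by simulation. This sketch (sample sizes are my own choices) estimates ℙ(S_n ≥ a) empirically and compares it with exp(−a²/2n):

```python
import math, random

rng = random.Random(1)
n, a, trials = 100, 20, 20000

# Empirical tail P(S_n >= a) for S_n a sum of n uniform ±1 variables.
hits = sum(
    sum(rng.choice((-1, 1)) for _ in range(n)) >= a
    for _ in range(trials)
)
empirical = hits / trials
bound = math.exp(-a * a / (2 * n))  # Theorem 5.1.2 with n = 100, a = 20
print(empirical, bound)
```

The empirical tail sits comfortably below the bound, which is not tight but has the right exponential shape.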

slide-10
SLIDE 10

The General Setting

Shortcomings

  • Required each X_i to be uniform on {−1, 1}

Wider framework

  • p_1, p_2, …, p_n ∈ [0,1], and p = n^{−1}·Σ_{i=1}^n p_i
  • X_i independent with ℙ(X_i = 1 − p_i) = p_i and ℙ(X_i = −p_i) = 1 − p_i
  • X = Σ_{i=1}^n X_i

Theorem 5.1.4 (Asymmetric Chernoff Bound) Let a > 0 and let X and p be as above. Then ℙ(X ≤ −a) ≤ exp(−a²/2pn) and ℙ(X ≥ a) ≤ exp(−a²/2pn + a³/2(pn)²).

slide-11
SLIDE 11

Theorem 5.1.4 (Asymmetric Chernoff Bound) Let a > 0 and let X and p be as above. Then ℙ(X ≤ −a) ≤ exp(−a²/2pn) and ℙ(X ≥ a) ≤ exp(−a²/2pn + a³/2(pn)²).

An Asymmetric Chernoff Bound

Special case

  • p_i = p for all i ⇒ X + np ∼ Bin(n, p)
  • ⇒ ℙ(|Bin(n,p) − np| ≥ a) ≤ 2·exp(−a²/2pn + a³/2(pn)²)

Corollary 5.1.5 For every ε > 0 there is some c_ε > 0 such that, if Y is the sum of mutually independent indicator random variables and μ = 𝔼[Y], then ℙ(|Y − μ| ≥ εμ) ≤ 2·exp(−c_ε·μ).

slide-12
SLIDE 12

Any questions?

slide-13
SLIDE 13

§2 Returning to Ramsey

Chapter 5: Concentration The Probabilistic Method

slide-14
SLIDE 14

The Story So Far

Goal

  • Determine the order of magnitude of R(3,k)

Upper bound

  • Erdős–Szekeres (1935): R(3,k) ≤ C(k+1,2) = O(k²)

Lower bounds

  • First moment, Mantel: R(3,k) = Ω(k)
  • Alterations: R(3,k) = Ω((k/log k)^{3/2})
  • Lovász Local Lemma: R(3,k) = Ω((k/log k)²)
slide-15
SLIDE 15

Alterations Revisited

Proof sketch

  • Take G ∼ G(n,p)
  • Remove one vertex from each triangle and each independent set of size k
  • The resulting graph G′ is Ramsey
  • First moment ⇒ with positive probability G′ has many vertices

Optimisation

  • The largest the right-hand side can be is O((k/log k)^{3/2})

Theorem 2.1.2 (ℓ = 3) For every n, k ∈ ℕ and p ∈ [0,1], we have R(3,k) > n − C(n,3)p³ − C(n,k)(1−p)^{C(k,2)}.

slide-16
SLIDE 16

Alternative Alterations

Vertex removal

  • Wasteful operation
  • To fix a single, small triangle, we make Ω(n) changes to the graph
  • Shrinks our resulting Ramsey graph too much

Edge removal

  • More efficient fix
  • To fix a triangle, need only remove a single edge
  • Problematic
  • Being triangle-free and having small independence numbers are in conflict
  • Need to ensure we can destroy all triangles without creating large independent sets
  • A new hope
  • Can our more advanced probabilistic tools help?
slide-17
SLIDE 17

Plan of Attack

Detriangulation

  • Need to remove at least one edge from each triangle
  • Let 𝒯 be a maximal set of edge-disjoint triangles in G
  • If T is a triangle in G, maximality ⇒ T must share an edge with some T′ ∈ 𝒯
  • Remove all edges of all triangles in 𝒯
  • Removes 3|𝒯| edges
  • Need to remove at least |𝒯| edges

Independent sets

  • Cannot let a set S of k vertices become independent
  • Would help if G[S] had many edges to begin with
  • Expect to see C(k,2)p edges
  • Chernoff ⇒ very unlikely to see many fewer
  • Can afford a union bound over all such sets S
slide-18
SLIDE 18

Theorem 5.1.4 (Asymmetric Chernoff Bound) Let a > 0 and let X and p be as before. Then ℙ(X ≤ −a) ≤ exp(−a²/2pn).

Local Edge Distribution

Local edge counts

  • Fix a set S of k vertices
  • e(G[S]) ∼ Bin(C(k,2), p)
  • Expect C(k,2)p edges; how likely are we to see at most half of that?

Applying Chernoff

  • Set X = Bin(C(k,2), p) − C(k,2)p, and let a = ½·C(k,2)p
  • ⇒ ℙ(e(G[S]) ≤ ½·C(k,2)p) ≤ exp(−⅛·C(k,2)p)

slide-19
SLIDE 19

Local Properties Globally

Recall

  • ℙ(e(G[S]) ≤ ½·C(k,2)p) ≤ exp(−⅛·C(k,2)p)

Union bound

  • We need every k-set to have many edges
  • Apply a union bound over the choice of S
  • ℙ(∃S: e(G[S]) ≤ ½·C(k,2)p) ≤ C(n,k)·exp(−⅛·C(k,2)p) ≤ exp(k ln n − ⅛·C(k,2)p)

Setting parameters

  • Small if k ln n ≤ (1/10)·C(k,2)p, say
  • ⇔ p ≥ 20 ln n/(k−1)
  • To avoid too many triangles, take equality above
  • Then with high probability each k-set spans at least ½·C(k,2)p = 5k ln n edges

slide-20
SLIDE 20

A Tangle of Triangles

Recall

  • Setting p = 20 ln n/(k−1) ⇒ almost surely, every k-set has at least 5k ln n edges

New independent sets

  • Remove all edges from a maximal set 𝒯 of edge-disjoint triangles
  • Need to avoid creating an independent set of k vertices
  • Fix a set S of k vertices

How many edges do we lose?

  • Only triangles with an edge in S are relevant
  • Number of potential such triangles: C(k,3) + C(k,2)(n−k)
  • Expected number of relevant triangles:
  • (C(k,3) + C(k,2)(n−k))·p³ ≈ (4000/3)·ln³n + 4000·((n−k)/k)·ln³n ≈ 4000·(n/k)·ln³n

slide-21
SLIDE 21

Accounting for Triangles

Recall

  • With high probability, each k-set S spans at least 5k ln n edges
  • Expect there to be at most 4000·(n/k)·ln³n triangles with an edge in S

Setting more parameters

  • In order to ensure S does not become independent, need (n/k)·ln³n ≤ c·k ln n
  • c > 0 some small constant
  • Solving: n ≤ c·(k/ln n)² = c′·(k/log k)²

Large deviations

  • Need to ensure that no set S sees too many triangles
  • Union bound over C(n,k) many sets
  • ⇒ need the probability that we get more triangles than expected to be small
slide-22
SLIDE 22

Lemma 5.2.1 (Erdős–Tetali, 1990) Let E_1, E_2, …, E_m be a collection of events and set μ = Σ_{i=1}^m ℙ(E_i). For any s, ℙ(E_{i_1} ∩ E_{i_2} ∩ ⋯ ∩ E_{i_s} for some independent E_{i_1}, E_{i_2}, …, E_{i_s}) ≤ μ^s/s!.

Too Many Triangles

Concentration inequalities

  • Chernoff: probability of seeing too many triangles is exponentially small
  • Problem: indicator variables for triangles are not independent
  • Chebyshev: error probabilities only polynomially small
  • Not enough to make up for C(n,k) summands in the union bound

Saving grace

  • We only remove edges of triangles in 𝒯, an edge-disjoint set of triangles
slide-23
SLIDE 23

The Erdős–Tetali Lemma

Proof

  • Take a union bound over all such s-sets of events
  • ℙ(E_{i_1} ∩ ⋯ ∩ E_{i_s} for some independent events) ≤ Σ_{{i_1,…,i_s} ind.} ℙ(E_{i_1} ∩ ⋯ ∩ E_{i_s}) = (1/s!)·Σ_{(i_1,…,i_s) ind.} ℙ(E_{i_1} ∩ ⋯ ∩ E_{i_s})
  • = (1/s!)·Σ_{(i_1,…,i_s) ind.} Π_{j=1}^s ℙ(E_{i_j}) ≤ (1/s!)·Σ_{(i_1,…,i_s)∈[m]^s} Π_{j=1}^s ℙ(E_{i_j}) = (1/s!)·(Σ_{i∈[m]} ℙ(E_i))^s = μ^s/s!

∎

Lemma 5.2.1 (Erdős–Tetali, 1990) Let E_1, E_2, …, E_m be a collection of events and set μ = Σ_{i=1}^m ℙ(E_i). For any s, ℙ(E_{i_1} ∩ E_{i_2} ∩ ⋯ ∩ E_{i_s} for some independent E_{i_1}, E_{i_2}, …, E_{i_s}) ≤ μ^s/s!.

slide-24
SLIDE 24

Handling Triangle Errors

Recall

  • With high probability, each k-set S has at least 5k ln n edges
  • Expected number of triangles with an edge in S at most c·k ln n for small c

Erdős–Tetali

  • Events E_i: the i-th triangle meeting S is present in G(n,p)
  • μ ≤ c·k ln n
  • Let s = k ln n
  • Lemma 5.2.1 ⇒ ℙ(S sees edges of s disjoint triangles) ≤ μ^s/s!

Calculation

  • Stirling: s! ≥ (s/e)^s
  • ⇒ μ^s/s! ≤ (μe/s)^s ≤ (ce)^{k ln n} < n^{−k} if c < e^{−2}

slide-25
SLIDE 25

Theorem 5.2.2 (Erdős, 1961; Krivelevich, 1995) As k → ∞, R(3,k) = Ω((k/log k)²).

Completing the Proof

Union bound

  • Union bound over all C(n,k) < n^k sets ⇒ with high probability, every k-set:
  • Spans at least 5k ln n edges and meets at most k ln n edge-disjoint triangles

Alteration

  • Given G ∼ G(n,p), where n = c′·(k/log k)² and p = 20 ln n/(k−1)
  • Let 𝒯 be a maximal set of edge-disjoint triangles, and remove all edges in 𝒯
  • Each k-set loses at most 3·k ln n edges ⇒ doesn't become independent
  • The resulting graph is therefore Ramsey.

∎

slide-26
SLIDE 26

Theorem 1.5.5 (Erdős–Szekeres, 1935) For all ℓ, k ∈ ℕ, R(ℓ,k) ≤ C(ℓ+k−2, ℓ−1) = O(k^{ℓ−1}). In particular, R(3,k) = O(k²).

Closing In

Lower bounds

  • Edge-alteration gave the same bound as the Lovász Local Lemma
  • R(3,k) = Ω((k/log k)²)
  • Could this be the truth? What can we say in the other direction?

Narrowing the gap

  • Left with a log²k gap to close
slide-27
SLIDE 27

Independent Sets in Triangle-Free Graphs

Proof

  • Key observation: G triangle-free ⇒ every neighbourhood is independent
  • ∴ if G has a vertex of degree ≥ √(n−1), we are done
  • Otherwise Δ(G) < √(n−1)
  • Greedy algorithm:
  • α(G) ≥ n/(Δ(G)+1) ≥ √(n−1) ∎

Ramsey numbers

  • Implies R(3,k) = O(k²)

Proposition 5.2.3 If G is an n-vertex triangle-free graph, then α(G) ≥ √(n−1).

slide-28
SLIDE 28

Theorem 5.2.4 (Ajtai, Komlós, Szemerédi, 1980; Shearer, 1995) If G is an n-vertex triangle-free graph with maximum degree Δ, then α(G) ≥ n log Δ/(8Δ).

Room for Improvement

Greedy algorithm

  • Order vertices arbitrarily
  • Add the first vertex v to the independent set
  • Remove all its ≤ Δ neighbours, and repeat
  • Bound is sharp only if v never has any neighbours previously removed
  • Only true for a disjoint union of cliques
  • ⇒ cannot be sharp for triangle-free graphs
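The greedy algorithm described above is a few lines of code. A minimal sketch (function name and the example graph are my own), demonstrating the α(G) ≥ n/(Δ+1) guarantee on a 5-cycle:

```python
def greedy_independent_set(n, edges):
    """Scan vertices in order; keep v unless a previously kept vertex removed it."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    kept, removed = [], set()
    for v in range(n):
        if v not in removed:
            kept.append(v)          # add v to the independent set
            removed |= adj[v]       # remove its <= Delta neighbours
    return kept

# C_5: Delta = 2, so the bound guarantees at least 5/(2+1), i.e. 2 vertices.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
ind = greedy_independent_set(5, edges)
print(ind)  # -> [0, 2], an independent set of size 2
```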
slide-29
SLIDE 29

An Improved Upper Bound

Proof

  • Let n = 8k²/log k and let G be an n-vertex triangle-free graph
  • If Δ(G) ≥ k
  • Let v be a vertex of maximum degree
  • N(v) is an independent set of size ≥ k
  • If Δ(G) < k
  • Theorem 5.2.4 ⇒ α(G) ≥ n log Δ/(8Δ) ≥ n log k/(8k) = k ∎

Corollary 5.2.5 As k → ∞, R(3,k) ≤ 8k²/log k.

slide-30
SLIDE 30

The Big Picture

Randomness

  • We show that a random independent set I of G has this size
  • If we let Y_v = 1_{v∈I}, then |I| = Σ_v Y_v
  • Would suffice to compute 𝔼|I| = Σ_v 𝔼[Y_v] = Σ_v ℙ(v ∈ I)
  • Computing ℙ(v ∈ I) not straightforward – depends on the neighbourhood

Neighbourhoods

  • How does I meet the neighbourhood N(v)?
  • If v ∈ I:
  • Must have I ∩ N(v) = ∅
  • If v ∉ I:
  • Can have I ∩ N(v) ≠ ∅
  • Since N(v) is independent, the intersection can be arbitrary
  • ⇒ might expect the intersection to be large
slide-31
SLIDE 31

Lemma 5.2.6 If Δ ≥ 16, we have 𝔼[X_v] ≥ (log Δ)/4 for every v.

New Random Variables

Local variables

  • Define new variables to account for local information
  • Let X_v = Δ·1_{v∈I} + |I ∩ N(v)|
  • Heuristic justification
  • Regularise the contribution of v
  • When v ∈ I, have X_v = Δ
  • When v ∉ I, can still have X_v = Θ(Δ)
  • Easier to get useful bounds on X_v
slide-32
SLIDE 32

Deducing the Theorem

Proof

  • If Δ ≤ 15, done by α(G) ≥ n/(Δ+1)
  • Otherwise, let I be a uniformly random independent set of G
  • For each vertex v, let X_v = Δ·1_{v∈I} + |I ∩ N(v)|
  • Let X = Σ_v X_v
  • Observe: X ≤ 2Δ|I|
  • Each v ∈ I contributes at most 2Δ: Δ via X_v, and 1 via X_u for each neighbour u
  • Lemma 5.2.6 ⇒ 𝔼[X] ≥ n log Δ/4
  • ⇒ 𝔼|I| ≥ 𝔼[X]/(2Δ) ≥ n log Δ/(8Δ), so some independent set is this large ∎

Theorem 5.2.4 (Ajtai, Komlós, Szemerédi, 1980; Shearer, 1995) If G is an n-vertex triangle-free graph with maximum degree Δ, then α(G) ≥ n log Δ/(8Δ).

slide-33
SLIDE 33

Proving the Lemma

Proof

  • X_v = Δ·1_{v∈I} + |I ∩ N(v)|
  • Which u ∈ N(v) could be in I?
  • Need to know I ∩ N(N(v))
  • Idea: condition on how I meets the rest of the graph
  • Let H = G ∖ ({v} ∪ N(v))
  • 𝔼[X_v] = 𝔼[𝔼[X_v | I ∩ V(H) = J]]
  • Suffices to show 𝔼[X_v | I ∩ V(H) = J] ≥ (log Δ)/4 for every independent set J in H

Lemma 5.2.6 If Δ ≥ 16, we have 𝔼[X_v] ≥ (log Δ)/4 for every v.

slide-34
SLIDE 34

Extending Independent Sets

Goal

  • 𝔼[X_v | I ∩ V(H) = J] ≥ (log Δ)/4

Available neighbours

  • Let A = N(v) ∖ N(J)
  • Those neighbours of v that could be added to J
  • Let a = |A|

Independent extensions

  • Two types of extensions of J to I:
  • I = J ∪ {v}
  • I = J ∪ S, some S ⊆ A
  • I is chosen uniformly at random from these 2^a + 1 options
slide-35
SLIDE 35

Computing Conditional Expectations

Recall

  • X_v = Δ·1_{v∈I} + |I ∩ N(v)|
  • Want to show 𝔼[X_v | I ∩ V(H) = J] ≥ (log Δ)/4

Conditional expectation

  • Case: v ∈ I
  • Probability: 1/(2^a+1)
  • X_v = Δ
  • Case: v ∉ I
  • Probability: 2^a/(2^a+1)
  • 𝔼[X_v | v ∉ I, I ∩ V(H) = J] = 𝔼|S| = a/2
  • ⇒ 𝔼[X_v | I ∩ V(H) = J] = Δ/(2^a+1) + a·2^{a−1}/(2^a+1)
slide-36
SLIDE 36

Concluding Calculations

Recall

  • 𝔼[X_v | I ∩ V(H) = J] = Δ/(2^a+1) + a·2^{a−1}/(2^a+1)
  • Want to show 𝔼[X_v | I ∩ V(H) = J] ≥ (log Δ)/4

Contradiction

  • If not, (log Δ)/4 > Δ/(2^a+1) + a·2^{a−1}/(2^a+1)
  • ⇒ (2^a+1)·log Δ > 4Δ + 2a·2^a
  • ⇒ (log Δ − 2a)·2^a > 4Δ − log Δ
  • Also ⇒ a ≥ 1
  • Must have 2a < log Δ
  • ⇒ 2^a < √Δ
  • ⇒ (log Δ)·√Δ > 4Δ − log Δ
  • False for Δ ≥ 16

∎

slide-37
SLIDE 37

Theorem 5.2.7 (Kim, 1995) As k → ∞, R(3,k) = Ω(k²/log k).

Epilogue

What we know

  • Ω(k²/log²k) = R(3,k) = O(k²/log k)

Remarks

  • Kim's proof a "tour de force"
  • Lower bound recently sharpened via analysis of the triangle-free process
  • Asymptotics of R(s,k), s ≥ 4 fixed and k → ∞, unknown
slide-38
SLIDE 38

Any questions?

slide-39
SLIDE 39

§3 Hamiltonicity

Chapter 5: Concentration The Probabilistic Method

slide-40
SLIDE 40

Definition 5.3.1 A Hamiltonian cycle in a graph G is a cycle passing through every vertex of G. A graph is called Hamiltonian if it contains a Hamiltonian cycle.

Setting the Scene

Questions

  • Are there easy ways to recognise Hamiltonian graphs?
  • What happens for the average graph?

Theorem 5.3.2 (Karp, 1972) Deciding whether a graph is Hamiltonian is NP-complete.

slide-41
SLIDE 41

Theorem 5.3.3 (Dirac, 1952) Every n-vertex graph G with minimum degree δ(G) ≥ n/2 is Hamiltonian.

A Sufficient Condition

Optimal bound

  • n even: two disjoint cliques
  • n odd: two cliques sharing one vertex

Corollary 5.3.4 For every ε > 0 and p ≥ 1/2 + ε, G(n,p) is Hamiltonian w.h.p.

slide-42
SLIDE 42

Proposition 5.3.5 For every ε > 0 and p ≤ (1−ε)·log n/n, G(n,p) is w.h.p. not Hamiltonian.

Threshold Lower Bound

First moment

  • There are (n−1)!/2 = (n/e)^{(1+o(1))n} possible Hamiltonian cycles
  • Each appears in G(n,p) with probability p^n
  • ⇒ expected number of cycles is (np/e)^{(1+o(1))n}
  • ⇒ if p ≤ (e−ε)/n, then G(n,p) has no Hamiltonian cycles w.h.p.

Connectivity

  • G(n,p) Hamiltonian ⇒ G(n,p) connected
  • For p ≤ (1−ε)·log n/n, G(n,p) is w.h.p. disconnected, hence not Hamiltonian
slide-43
SLIDE 43

Dirac’s Theorem

Proof

  • G is connected
  • If not, a smaller component would not support the minimum degree
  • Let P = v_0v_1v_2…v_ℓ be a longest path
  • N(v_0), N(v_ℓ) ⊆ P, as otherwise the path could be extended
  • Pigeonhole: ∃i such that {v_i, v_ℓ}, {v_0, v_{i+1}} ∈ E(G)
  • We have a cycle C = v_0v_1v_2…v_i v_ℓ v_{ℓ−1}v_{ℓ−2}…v_{i+1}v_0
  • If C contains all n vertices, this is a Hamiltonian cycle
  • If not, connectivity ⇒ there is an edge from C to G ∖ C
  • Gives a longer path, contradiction.

∎

Theorem 5.3.3 (Dirac, 1952) Every n-vertex graph G with minimum degree δ(G) ≥ n/2 is Hamiltonian.

slide-44
SLIDE 44

Dirac’s Algorithm

More than existential

  • Proof shows us how to find a Hamiltonian cycle
  • Start with any path
  • If there are edges out from the endpoints, extend path
  • Otherwise by pigeonhole turn path into cycle
  • Use external edge to extend path
  • Repeat until cycle is Hamiltonian

Random setting

  • Extremal problem:
  • Need to assume worst-case graph
  • Used large degree, pigeonhole to rotate path into cycle
  • Can we use properties of G(n,p) to do this more efficiently?
slide-45
SLIDE 45

Definition 5.3.6 (Booster) Given a graph G, a booster is a potential edge e such that G ∪ {e} contains a longer path or a Hamiltonian cycle.

Pósa Rotations

Goal

  • Given a path P = v_0v_1…v_ℓ in a graph G
  • Want to find a longer path or a Hamiltonian cycle

Rotations

  • If G is connected, the pair {v_0, v_ℓ} is a booster
  • Suppose {v_i, v_ℓ} ∈ E(G), 1 ≤ i ≤ ℓ−2
  • Rotation along {v_i, v_ℓ}: P′ = v_0v_1…v_i v_ℓ v_{ℓ−1}…v_{i+1} is also a path of length ℓ
  • ⇒ the pair {v_0, v_{i+1}} is also a booster
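A rotation is a single list operation: keep the prefix up to v_i and reverse the rest. A minimal sketch (the function name and example path are my own):

```python
def rotate(path, i):
    """Rotate the path v_0..v_l along the chord {v_i, v_l}:
    the result is v_0..v_i, v_l, v_{l-1}, ..., v_{i+1}.
    Same vertex set, same length, new endpoint v_{i+1}."""
    return path[:i + 1] + path[i + 1:][::-1]

P = [0, 1, 2, 3, 4, 5]
# Suppose {2, 5} is an edge: rotating along it makes 3 the new endpoint.
print(rotate(P, 2))  # -> [0, 1, 2, 5, 4, 3]
```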
slide-46
SLIDE 46

Endpoint Neighbourhoods

Proof

  • After rotating along {v_i, v_ℓ}, only v_i and v_ℓ get new neighbours on the path
  • Let v ∈ S
  • Rotate to a path P′ with v as an endpoint
  • Let y ∈ N(v) ∖ S
  • If y ∉ V(P), extend P′ to y ⇒ longer path than P
  • If y ∈ V(P), rotate P′ along the edge {v, y}
  • ⇒ a neighbour x of y on P′ is an endpoint of the new path, so x ∈ S
  • If x is also a neighbour of y on P, then y ∈ N_P(S)
  • Otherwise we must have rotated along an edge incident to y ⇒ y ∈ N_P(S)

∎

Lemma 5.3.7 Let P = v_0v_1…v_ℓ be a longest path in a graph G, and let S be the set of endpoints reachable from v_0 by sequences of rotations. Then N_G(S) ⊆ N_P(S).

slide-47
SLIDE 47

Corollary 5.3.8 Let P be a longest path in G, and let S be the set of endpoints following sequences of rotations. Then |N_G(S)| ≤ 2|S| − 1.

Expanders

Proof

  • Lemma 5.3.7 ⇒ N_G(S) ⊆ N_P(S)
  • Each vertex in S contributes at most two neighbours to N_P(S)
  • The final vertex v_ℓ only contributes one
  • ⇒ |N_P(S)| ≤ 2|S| − 1 ∎

Definition 5.3.9 (Expander) A graph G is a (k,2)-expander if, for every S ⊆ V(G) with |S| ≤ k, we have |N_G(S)| ≥ 2|S|.

slide-48
SLIDE 48

Expanders Have Many Boosters

Proof

  • If G is Hamiltonian, every edge is a booster.
  • Otherwise let P = v_0v_1…v_ℓ be a longest path
  • Fix v_0, and let S_0 be the endpoints after rotations
  • Corollary 5.3.8 ⇒ |N_G(S_0)| ≤ 2|S_0| − 1
  • G a (k,2)-expander ⇒ |S_0| ≥ k + 1
  • Given any y ∈ S_0, rotate to a v_0–y path P′
  • Fix y, and let S_y be the endpoints of paths from y after rotating P′
  • Again, |S_y| ≥ k + 1
  • For each x ∈ S_y, {y, x} is a booster, counted at most twice

∎

Corollary 5.3.10 If G is a connected (k,2)-expander, then G has at least k²/2 boosters.

slide-49
SLIDE 49

Dirac’s Algorithm in Random Graphs

Assumptions

  • G(n,p) is connected – known to be true for p ≥ (1+ε)·log n/n
  • G(n,p) is a (k,2)-expander for k large

Rotation-extension process

  • Start with a longest path P
  • Corollary 5.3.10 ⇒ gives rise to Ω(k²) boosters
  • Each booster is an edge of G(n,p) independently with probability p
  • ⇒ probability none of the boosters appear is (1−p)^{Ω(k²)}
  • ⇒ if p = ω(k^{−2}), then w.h.p. one of the boosters should be in G(n,p)
  • Use it to extend the path, repeat until Hamiltonian
slide-50
SLIDE 50

Multiple Exposures

Recall

  • A longest path gives rise to Ω(k²) boosters
  • Want to show w.h.p. a booster appears in G(n,p)

Problem

  • To find the boosters, we needed to expose the edges within V(P)
  • We might already have found that some boosters are non-edges
  • They do not appear independently with probability p

Solution

  • Split the random graph into independent subgraphs
  • Let p_0, q satisfy 1 − p = (1 − p_0)(1 − q)
  • Then G(n,p) ∼ G(n,p_0) ∪ G(n,q)
  • Use G(n,p_0) to obtain connectivity and expansion properties, and to find boosters
  • Use G(n,q) to show the boosters appear in the random graph w.h.p.
slide-51
SLIDE 51

Random Graphs are Expanders

Proof

  • If not, there is some set S of size s ≔ |S| ≤ n/6 such that |N(S)| < 2s
  • ⇒ ∃W ⊆ V(G) ∖ S, |W| = 2s, such that there are no edges from S to V(G) ∖ (S ∪ W)
  • Probability these edges are missing is (1−p)^{s(n−3s)} ≤ e^{−ps(n−3s)} ≤ e^{−psn/2}
  • Count the number of pairs (S, W)
  • C(n,s) ≤ (ne/s)^s choices for S, C(n−s,2s) ≤ C(n,2s) ≤ (ne/2s)^{2s} choices for W
  • Union bound
  • ℙ(G(n,p) bad) ≤ Σ_{s=1}^{n/6} ((n³e³/4s³)·e^{−pn/2})^s ≤ Σ_{s=1}^{n/6} (e³/(4√n))^s = o(1) ∎

Lemma 5.3.11 If p ≥ 7 log n/n, then G(n,p) is w.h.p. an (n/6, 2)-expander.

slide-52
SLIDE 52

The Hamiltonicity Threshold

Proof

  • Let p_0 = 7 log n/n and q = 73 log n/n²
  • Let G_0 ∼ G(n,p_0), and for i ∈ [n], let G_i ∼ G(n,q) be independent
  • If G = G_0 ∪ (∪_i G_i), then G ∼ G(n,p′) for p′ = 1 − (1−p_0)(1−q)^n ≤ 80 log n/n
  • Lemma 5.3.11 ⇒ G_0 is w.h.p. a connected (n/6, 2)-expander
  • Corollary 5.3.10 ⇒ any supergraph of G_0 has at least n²/72 boosters
  • ⇒ probability G_i does not contain one of the boosters is ≤ (1−q)^{n²/72} ≤ e^{−qn²/72} = o(1/n)
  • ⇒ grow a longest path, using G_i to find a booster in the i-th step

∎

Theorem 5.3.12 (Pósa, 1976) If p ≥ 80 log n/n, then G(n,p) is w.h.p. Hamiltonian.

slide-53
SLIDE 53

Theorem 5.3.13 (Komlós–Szemerédi, 1983) For ε > 0 and p ≥ (1+ε)·log n/n, G(n,p) is w.h.p. Hamiltonian.

Epilogue

  • Even sharper results were later proven

Theorem 5.3.14 (Bollobás, 1984; Ajtai–Komlós–Szemerédi, 1985) In the random graph process, w.h.p. the graph becomes Hamiltonian precisely when the minimum degree is at least two.

  • Hamiltonicity displays a very sharp threshold
slide-54
SLIDE 54

Any questions?

slide-55
SLIDE 55

§4 Martingales

Chapter 5: Concentration The Probabilistic Method

slide-56
SLIDE 56

Threshold for Triangles

Triangular case

  • ℓ = 3: threshold for containing triangles is n^{−1}

Upper tail

  • When p ≫ n^{−1}, how unlikely is G(n,p) to be triangle-free?
  • Proof of Theorem 3.3.1
  • Used Chebyshev's Inequality
  • Gives polynomial error bounds

Theorem 3.3.1 For ℓ ≥ 2, the threshold for K_ℓ ⊆ G(n,p) is p_0(n) = n^{−2/(ℓ−1)}.

slide-57
SLIDE 57

Exponential Dreams

Indicator random variables

  • Let X denote the number of triangles in G ∼ G(n,p)
  • Given T ∈ C([n],3), let X_T be the indicator that G[T] ≅ K_3
  • Then ℙ(X_T = 1) = p³
  • Also X = Σ_T X_T

Stronger concentration

  • Using Chernoff would give ℙ(X = 0) ≤ exp(−½·C(n,3)·p³)
  • Exponentially small error bound
  • Problem: summands X_T not independent
  • X_T, X_{T′} positively correlated when |T ∩ T′| = 2
slide-58
SLIDE 58

Lemma 5.4.1 There exists a family of ⅓·C(n−1,2) pairwise edge-disjoint triangles in K_n.

Sparse Independence

Cheap fix

  • Restrict our attention to mutually independent events
  • Equivalently: consider a family of edge-disjoint triangles

Proof

  • Colour each triangle {i, j, k} with the colour c ≡ i + j + k mod n
  • Each colour class is edge-disjoint
  • Given vertices i, j, the third vertex k ≡ c − i − j is determined
  • Large colour class
  • For some c, the number of c-coloured triangles is at least (1/n)·C(n,3) = ⅓·C(n−1,2)

∎

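The colouring construction in this proof is directly executable. A short sketch (function name and the choice n = 13 are my own) that builds the largest colour class and verifies edge-disjointness and the size guarantee:

```python
from itertools import combinations

def colour_class(n, c):
    """Triangles {i,j,k} of K_n with i+j+k ≡ c (mod n). Two triangles in the
    same class cannot share an edge: the vertices i, j determine the third
    vertex k ≡ c - i - j (mod n)."""
    return [t for t in combinations(range(n), 3) if sum(t) % n == c]

n = 13
best = max((colour_class(n, c) for c in range(n)), key=len)

# Verify pairwise edge-disjointness: all 3*|best| edges are distinct.
edges = [frozenset(e) for t in best for e in combinations(t, 2)]
assert len(edges) == len(set(edges))
print(len(best), (n - 1) * (n - 2) // 6)  # -> 22 22, matching (1/3)*C(n-1,2)
```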

slide-59
SLIDE 59

Don’t Let Your Dreams Be Dreams

Proof

  • Let 𝒯 be the collection of triangles from Lemma 5.4.1
  • If G(n,p) is triangle-free, no triangle in 𝒯 appears
  • These appear independently
  • Probability none appears is (1−p³)^{|𝒯|} ≤ exp(−|𝒯|·p³)

∎

Good news

  • Exponential bound on the error probability

Bad news

  • Exponent ⅓·C(n−1,2)·p³ = Θ(n²p³) is of lower order than expected

Corollary 5.4.2 G(n,p) is triangle-free with probability at most exp(−⅓·C(n−1,2)·p³).

slide-60
SLIDE 60

Postmortem of a Proof

Improving the exponent

  • Need to consider all C(n,3) possible triangles
  • Dependencies are limited – can we recover Chernoff-type bounds?

Revisiting Chernoff

  • S_n = Σ_{i=1}^n X_i
  • Properties of the X_i:
  • Bounded, {−1,1}-variables
  • 𝔼[X_i] = 0
  • X_i mutually independent
  • Using independence:
  • Applied Markov to e^{λS_n}
  • Independence ⇒ 𝔼[e^{λS_n}] = 𝔼[e^{λΣ_i X_i}] = Π_i 𝔼[e^{λX_i}]
slide-61
SLIDE 61

Definition 5.4.3 (Martingale) A martingale is a sequence Z_0, Z_1, …, Z_m of random variables such that, for each 1 ≤ i ≤ m, we have 𝔼[Z_i | Z_j: j < i] = Z_{i−1}. Loosely speaking, given what has previously transpired, we expect nothing to change in the i-th step.

Martingales

Conditional independence

  • What if the X_i are not independent?
  • Recover independence by conditioning on the previous variables
  • Product rule: 𝔼[e^{λΣ_i X_i}] = Π_i 𝔼[e^{λX_i} | X_j: j < i]
  • ⇒ if X_i | (X_j: j < i) has the right properties, can prove Chernoff-type bounds

slide-62
SLIDE 62

Martingales, Tame and Wild

Boring mathsy example

  • Let X_i be independent and uniform on {−1,1}, for 1 ≤ i ≤ m
  • Let Z_i = Σ_{j≤i} X_j
  • 𝔼[Z_i | Z_j: j < i] = 𝔼[Z_{i−1} + X_i | Z_j: j < i] = Z_{i−1} + 𝔼[X_i | Z_j: j < i]
  • 𝔼[X_i | Z_j: j < i] = 𝔼[X_i] = 0
  • ⇒ (Z_i: 0 ≤ i ≤ m) is a martingale

Fun real-world example

  • Gambling on (fair) coin tosses
  • Z_i = cumulative profit/loss after the i-th toss
  • Bet b_i = b_i(Z_0, Z_1, …, Z_{i−1}) on the i-th toss, depending on previous outcomes
  • 𝔼[Z_i | Z_j: j < i] = ½(Z_{i−1} + b_i) + ½(Z_{i−1} − b_i) = Z_{i−1}
  • ⇒ (Z_i: 0 ≤ i ≤ m) is a martingale

Disclaimer: gambling can be addictive and bad for your bank balance
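The fairness claim can be verified exactly by enumerating all coin sequences. The betting strategy below (doubling after each loss) is my own arbitrary illustration; any strategy depending only on the past gives expected profit exactly 0.

```python
from itertools import product

def strategy(history):
    """Arbitrary bet depending only on previous outcomes: double after each loss."""
    bet = 1.0
    for outcome in history:
        bet = bet * 2 if outcome < 0 else 1.0
    return bet

m = 6
# Exact expectation over all 2^m equally likely fair-coin sequences:
# the profit process is a martingale, so E[Z_m] = Z_0 = 0 for any strategy.
total = 0.0
for tosses in product((-1, 1), repeat=m):
    profit, history = 0.0, []
    for x in tosses:
        profit += strategy(history) * x  # win or lose the current bet
        history.append(x)
    total += profit
print(total / 2**m)  # -> 0.0
```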

slide-63
SLIDE 63

Martingale Concentration

Proof

  • Set X_i = Z_i − Z_{i−1}
  • ⇒ |X_i| ≤ 1 and Z_m = Σ_{i=1}^m X_i
  • Martingale ⇒ 𝔼[X_i | Z_j: j < i] = 0
  • For any λ > 0, we have Z_m ≥ a ⇔ e^{λZ_m} ≥ e^{λa}
  • ℙ(e^{λZ_m} ≥ e^{λa}) ≤ 𝔼[e^{λZ_m}]·e^{−λa}
  • 𝔼[e^{λZ_m}] = Π_{i=1}^m 𝔼[e^{λX_i} | Z_j: j < i]

Theorem 5.4.4 (Azuma's Inequality) Let Z_0, Z_1, …, Z_m be a martingale with Z_0 = 0 and |Z_i − Z_{i−1}| ≤ 1 for all 1 ≤ i ≤ m. Then, for any a > 0, we have ℙ(Z_m ≥ a) ≤ exp(−a²/2m).

slide-64
SLIDE 64

A Little Calculus

Proof

  • Let f(y) = (e^λ + e^{−λ})/2 + ((e^λ − e^{−λ})/2)·y = e^λ·(1+y)/2 + e^{−λ}·(1−y)/2
  • ⇒ f represents the chord of g(y) = e^{λy} between y = −1 and y = 1
  • Convexity ⇒ g(y) ≤ f(y) for all y ∈ [−1,1]
  • Thus 𝔼[e^{λY}] = 𝔼[g(Y)] ≤ 𝔼[f(Y)] = (e^λ + e^{−λ})/2 + ((e^λ − e^{−λ})/2)·𝔼[Y] = cosh(λ) ∎

Lemma 5.4.5 If λ > 0 and Y is a random variable with 𝔼[Y] = 0 and |Y| ≤ 1, then 𝔼[e^{λY}] ≤ cosh(λ).

slide-65
SLIDE 65

Completing the Proof

Proof (cont'd)

  • ℙ(e^{λZ_m} ≥ e^{λa}) ≤ 𝔼[e^{λZ_m}]·e^{−λa}
  • 𝔼[e^{λZ_m}] = Π_{i=1}^m 𝔼[e^{λX_i} | Z_j: j < i]
  • By Lemma 5.4.5, 𝔼[e^{λX_i} | Z_j: j < i] ≤ cosh(λ) ≤ e^{λ²/2}
  • ∴ ℙ(Z_m ≥ a) ≤ exp(λ²m/2 − λa)
  • Substitute λ = a/m ∎

Theorem 5.4.4 (Azuma's Inequality) Let Z_0, Z_1, …, Z_m be a martingale with Z_0 = 0 and |Z_i − Z_{i−1}| ≤ 1 for all 1 ≤ i ≤ m. Then, for any a > 0, we have ℙ(Z_m ≥ a) ≤ exp(−a²/2m).

slide-66
SLIDE 66

Graph Martingales

Upper tail for triangles

  • Sample G ∼ G(n,p), X = number of triangles in G
  • X = Σ_{T∈C([n],3)} X_T, with X_T the indicator that G[T] ≅ K_3

Where is the martingale?

  • Natural candidate
  • Order the sets T_1, T_2, …, T_m
  • Let Z_i = Σ_{j≤i} X_{T_j}
  • Problem
  • Positive correlation ⇒ cannot make 𝔼[X_{T_i} | Z_j: j < i] = 0 for all choices of the Z_j
  • Solution
  • Reveal information about G in stages
  • Let Z_i be the expected value of X given the information after i rounds
slide-67
SLIDE 67

The Doob Martingale

General framework

  • Sample G ∼ G(n, p); interested in a graph parameter f(G) ∈ ℝ
  • Example: f(G) = # triangles in G

Revealing G

  • Order the possible edges of K_n as e_1, e_2, …, e_m, for m = C(n, 2)
  • Let S_i = {e_j : j ≀ i}

The martingale

  • Z_i = 𝔼[f(G) ∣ E(G) ∩ S_i] βˆ’ 𝔼[f(G)]
  • Expected value of the parameter given the previously revealed edges
  • Z_0 = 𝔼[f(G)] βˆ’ 𝔼[f(G)] = 0
  • Z_m = 𝔼[f(G) ∣ E(G)] βˆ’ 𝔼[f(G)] = f(G) βˆ’ 𝔼[f(G)]
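A sketch of this construction (ours, not from the slides): the edge-exposure Doob martingale for f = # triangles, computed exactly on G(4, 1/2) by enumerating all 2^6 graphs. The function names and tiny parameters are our own choices.

```python
# Edge-exposure Doob martingale Z_i = E[f(G) | E(G) cap S_i] - E[f(G)]
# for f = number of triangles, computed exactly on G(4, 1/2).
from itertools import combinations, product

n = 4
edge_order = list(combinations(range(n), 2))      # e_1, ..., e_m
m = len(edge_order)

def f(g):                                          # number of triangles
    return sum(all(e in g for e in combinations(t, 2))
               for t in combinations(range(n), 3))

graphs = [frozenset(e for e, b in zip(edge_order, bits) if b)
          for bits in product((0, 1), repeat=m)]   # all graphs, each w.p. 2^-m
Ef = sum(f(g) for g in graphs) / len(graphs)       # E[f(G)] for p = 1/2

def Z(i, g):
    """E[f | status of e_1..e_i agrees with g], minus E[f]."""
    revealed = set(edge_order[:i])
    consistent = [h for h in graphs if h & revealed == g & revealed]
    return sum(f(h) for h in consistent) / len(consistent) - Ef

G = graphs[37]                                     # one arbitrary outcome
print(Z(0, G), Z(m, G), f(G) - Ef)
```

The printout confirms the two endpoint identities from the slide: Z_0 = 0, and Z_m = f(G) βˆ’ 𝔼[f(G)].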

slide-68
SLIDE 68

A Small Example

Framework

  • G ∼ G(3, 1/2) and f(G) = Ο‰(G)
  • [Figure: the edge-exposure tree. The eight graphs on 3 vertices have Ο‰-values 1, 2, 2, 2, 2, 2, 2, 3; averaging over the unrevealed edges gives conditional expectations 1.5, 2, 2, 2.5 after two revealed edges, 1.75 and 2.25 after one, and 𝔼[Ο‰] = 2 at the root.]
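The numbers in this example can be regenerated exactly (our own sketch): reveal the three possible edges of G(3, 1/2) one at a time and average f = Ο‰ over the graphs still consistent with what has been revealed.

```python
# Regenerate the small Doob-martingale example: G(3, 1/2), f = omega.
from itertools import combinations, product

edges3 = list(combinations(range(3), 2))           # the 3 possible edges

def omega(g):                                      # clique number on 3 vertices
    return 3 if len(g) == 3 else (2 if g else 1)

graphs3 = [frozenset(e for e, b in zip(edges3, bits) if b)
           for bits in product((0, 1), repeat=3)]

def cond_mean(i, g):                               # E[omega | first i edges of g]
    rev = set(edges3[:i])
    cons = [h for h in graphs3 if h & rev == g & rev]
    return sum(omega(h) for h in cons) / len(cons)

levels = [sorted({cond_mean(i, g) for g in graphs3}) for i in range(4)]
print(levels)   # [[2.0], [1.75, 2.25], [1.5, 2.0, 2.5], [1.0, 2.0, 3.0]]
```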

slide-69
SLIDE 69

Verifying Martingale-ness

Recall

  • 𝐻 ∼ 𝐻(π‘œ, π‘ž), and we are exploring a graph parameter 𝑔(𝐻)
  • 𝑇𝑗 = 𝑓

π‘˜: π‘˜ ≀ 𝑗

  • π‘Žπ‘— = 𝔽 𝑔 𝐻 𝐹 𝐻 ∩ 𝑇𝑗

Conditional expectations

  • 𝔽 π‘Žπ‘—+1 𝐹 𝐻 ∩ 𝑇𝑗 = 𝔽 𝔽 𝑔 𝐻 𝐹 𝐻 ∩ 𝑇𝑗+1 𝐹 𝐻 ∩ 𝑇𝑗

= 𝔽 𝑔 𝐻 𝐹 𝐻 ∩ 𝑇𝑗 = π‘Žπ‘—

  • β‡’ this is a martingale
slide-70
SLIDE 70

Definition 5.4.6 (c-Lipschitz) Let c > 0. A graph parameter f is c-(edge-)Lipschitz if, for any graph G and any edge e, |f(G) βˆ’ f(G β–³ e)| ≀ c.

Lipschitz Properties

Bounded differences

  • To apply Azuma’s Inequality, we need |Z_i βˆ’ Z_{iβˆ’1}| ≀ 1 for all i
  • Intuitively: changing one edge should not change f(G) by much

Fact 5.4.7 Given a c-Lipschitz parameter f, we have |Z_i βˆ’ Z_{iβˆ’1}| ≀ 1 for the normalised Doob martingale Z_i = (1/c)Β·(𝔼[f(G) ∣ E(G) ∩ S_i] βˆ’ 𝔼[f(G)]).

slide-71
SLIDE 71

Theorem 5.4.4 (Azuma’s Inequality) Let Z_0, Z_1, …, Z_m be a martingale with Z_0 = 0 and |Z_i βˆ’ Z_{iβˆ’1}| ≀ 1 for all 1 ≀ i ≀ m. Then, for any a > 0, we have β„™(Z_m β‰₯ a) ≀ exp(βˆ’aΒ²/2m).

Summary

Remarks

  • The same bound holds for β„™(f(G) ≀ ΞΌ βˆ’ a)
  • Can also use a vertex-exposure martingale
  • Z_i is the expected value of f(G) after exposing the induced subgraph of G on the first i vertices

Corollary 5.4.8 Let f be a c-Lipschitz graph parameter, G ∼ G(n, p), ΞΌ = 𝔼[f(G)], and a > 0. Then β„™(f(G) β‰₯ ΞΌ + a) ≀ exp(βˆ’aΒ²/nΒ²cΒ²).
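The Lipschitz constant used later for triangle counts can be verified by brute force (our own sanity check, not part of the lecture): toggling a single edge changes the number of triangles by at most n βˆ’ 2, with equality on the complete graph.

```python
# Brute-force check that f = #triangles is (n-2)-edge-Lipschitz, here n = 5,
# by trying all 2^10 graphs and toggling each edge (g ^ {e} is G triangle e).
from itertools import combinations, product

n = 5
all_edges = list(combinations(range(n), 2))

def tri(g):
    return sum(all(e in g for e in combinations(t, 2))
               for t in combinations(range(n), 3))

worst = 0
for bits in product((0, 1), repeat=len(all_edges)):
    g = {e for e, b in zip(all_edges, bits) if b}
    for e in all_edges:
        worst = max(worst, abs(tri(g) - tri(g ^ {e})))
print(worst)   # n - 2 = 3
```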

slide-72
SLIDE 72

Any questions?

slide-73
SLIDE 73

Β§5 Triangle-free Graphs

Chapter 5: Concentration The Probabilistic Method

slide-74
SLIDE 74

Theorem 3.3.1 For β„“ β‰₯ 2, the threshold for K_β„“ βŠ† G(n, p) is p_0(n) = n^{βˆ’2/(β„“βˆ’1)}.

A Quick Review

Triangle-freeness

  • β‡’ when p = Ο‰(n^{βˆ’1}), β„™(K_3 ⊈ G(n, p)) = o(1)
  • Error bound from Chebyshev β‡’ only polynomially small

Exponential error bounds

  • Sharper estimates by considering edge-disjoint triangles

Corollary 5.4.2 G(n, p) is triangle-free with probability at most exp(βˆ’(1/3)Β·C(nβˆ’1, 2)Β·pΒ³).

slide-75
SLIDE 75

Corollary 5.4.8’ Let f be a c-Lipschitz graph parameter, G ∼ G(n, p), ΞΌ = 𝔼[f(G)], and a > 0. Then β„™(f(G) ≀ ΞΌ βˆ’ a) ≀ exp(βˆ’aΒ²/nΒ²cΒ²).

Applying Azuma

Counting triangles

  • f(G) = # triangles in G
  • ΞΌ = C(n, 3)Β·pΒ³, and we take a = ΞΌ (so f(G) ≀ ΞΌ βˆ’ a means G is triangle-free)
  • c = n βˆ’ 2

Corollary 5.5.1 G(n, p) is triangle-free with probability at most exp(βˆ’(nβˆ’1)²·p⁢/36).

slide-76
SLIDE 76

Corollary 5.5.2 Let f be a c_v-vertex-Lipschitz parameter, ΞΌ = 𝔼[f(G)], and a > 0. Then, for G ∼ G(n, p), β„™(f(G) ≀ ΞΌ βˆ’ a) ≀ exp(βˆ’aΒ²/2nΒ·c_vΒ²).

Immeasurable Disappointment

Worse exponent

  • The exponent (1/36)Β·(nβˆ’1)²·p⁢ is worse than the (1/3)Β·C(nβˆ’1, 2)Β·pΒ³ from before
  • Problems
  • Long martingale, of length C(n, 2), and large Lipschitz constant, n βˆ’ 2
  • What if we apply vertex-exposure instead?

Vertex-exposure martingale

  • Shorter martingale, of length n, but worse Lipschitz constant, C(nβˆ’1, 2)
  • Yields a worse exponent, Θ(np⁢)
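A side-by-side look at the two exponents (our own numerical comparison, using the forms given on these slides): for any fixed p < 1 the martingale exponent is smaller, i.e. gives a weaker bound, roughly by a factor of pΒ³.

```python
# Compare the error exponents: disjoint-triangles C(n-1,2) p^3 / 3 versus
# the edge-exposure martingale (n-1)^2 p^6 / 36.
from math import comb

n = 10**4
for p in (0.001, 0.01, 0.1):
    e1 = comb(n - 1, 2) * p**3 / 3       # Corollary 5.4.2
    e2 = (n - 1) ** 2 * p**6 / 36        # Corollary 5.5.1
    print(f"p={p}: disjoint-triangles {e1:.3g}, martingale {e2:.3g}")
```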
slide-77
SLIDE 77

Corollary 5.4.8 Let f be a c-Lipschitz graph parameter, G ∼ G(n, p), ΞΌ = 𝔼[f(G)], and a > 0. Then β„™(f(G) β‰₯ ΞΌ + a) ≀ exp(βˆ’aΒ²/nΒ²cΒ²).

A Judicious Parameter

Reducing the Lipschitz constant

  • Need to decrease the influence a single edge can have
  • Idea: edge-disjoint triangles
  • Let f(G) = maximum number of pairwise edge-disjoint triangles in G

New bound

  • This choice of f is 1-Lipschitz
  • Still have G triangle-free ⇔ f(G) = 0, so take a = 𝔼[f(G)]
  • β‡’ β„™(G triangle-free) ≀ exp(βˆ’π”Ό[f(G)]Β²/nΒ²)
  • How do we bound this expectation?
slide-78
SLIDE 78

Edge-Disjoint Triangles

Proof

  • Let 𝒯 be the collection of all X triangles in G
  • Let 𝒬′ βŠ† 𝒯 be a q-random subcollection
  • Each triangle T ∈ 𝒬′ with probability q, independently of all other triangles
  • Let Yβ€² be the number of pairs of overlapping triangles in 𝒬′
  • From each pair in 𝒬′ sharing an edge, remove one of the two triangles
  • β‡’ the resulting 𝒬 βŠ† 𝒬′ is pairwise edge-disjoint
  • 𝔼[|𝒬|] β‰₯ 𝔼[|𝒬′|] βˆ’ 𝔼[Yβ€²] = qX βˆ’ qΒ²Y ∎

Lemma 5.5.3 Let q ∈ [0, 1], and let G be a graph with X triangles and Y pairs of triangles sharing an edge. Then G has a collection of m pairwise edge-disjoint triangles, for some m β‰₯ qX βˆ’ qΒ²Y.
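The deletion argument can be run mechanically (our own toy experiment, not from the slides): keep each triangle with probability q, then delete one triangle from every surviving edge-sharing pair. Pathwise, the result is edge-disjoint and has size at least |𝒬′| minus the number of surviving bad pairs, whose expectation is qX βˆ’ qΒ²Y.

```python
# Lemma 5.5.3 as an experiment on a small random graph; n, p, q are arbitrary.
import random
from itertools import combinations

random.seed(1)
n, p, q = 12, 0.5, 0.1
edges_g = {e for e in combinations(range(n), 2) if random.random() < p}
tris = [t for t in combinations(range(n), 3)
        if all(e in edges_g for e in combinations(t, 2))]
X = len(tris)
pairs = [(s, t) for s, t in combinations(tris, 2)
         if set(combinations(s, 2)) & set(combinations(t, 2))]
Y = len(pairs)

def deletion_round():
    kept = {t for t in tris if random.random() < q}          # Q'
    bad = [(s, t) for s, t in pairs if s in kept and t in kept]
    lower = len(kept) - len(bad)                              # |Q'| - Y'
    for s, t in bad:
        kept.discard(t)                                       # delete one per pair
    return kept, lower

samples = [deletion_round() for _ in range(300)]
avg = sum(len(k) for k, _ in samples) / len(samples)
print(f"X={X}, Y={Y}, qX - q^2 Y = {q*X - q*q*Y:.2f}, observed average = {avg:.2f}")
```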

slide-79
SLIDE 79

Corollary 5.5.4 Let G ∼ G(n, p) for p β‰₯ 1/√(3n). Then 𝔼[f(G)] β‰₯ (1/36 βˆ’ o(1))Β·nΒ²p.

Random Edge-Disjoint Triangles

Random graph setting

  • Let G ∼ G(n, p), X = # triangles, Y = # overlapping pairs of triangles
  • Lemma 5.5.3 β‡’ f(G) β‰₯ qX βˆ’ qΒ²Y for all q ∈ [0, 1]
  • β‡’ 𝔼[f(G)] β‰₯ q·𝔼[X] βˆ’ q²·𝔼[Y]

Choosing values

  • We have 𝔼[X] = C(n, 3)Β·pΒ³ and 𝔼[Y] = C(n, 2)Β·C(nβˆ’2, 2)Β·p⁡
  • Calculus β‡’ optimal q = 𝔼[X]/(2𝔼[Y]) β‰ˆ 1/(3npΒ²); the assumption p β‰₯ 1/√(3n) ensures q ≀ 1
slide-80
SLIDE 80

Theorem 5.5.5 Let p β‰₯ 1/√(3n) and let G ∼ G(n, p). Then

β„™(K_3 ⊈ G) ≀ exp(βˆ’Ξ©(nΒ²pΒ²)).

Immeasurable Joy

Recall

  • G ∼ G(n, p)
  • f(G) = maximum number of pairwise edge-disjoint triangles in G
  • Corollary 5.5.4 β‡’ if p β‰₯ 1/√(3n), then 𝔼[f(G)] = Ξ©(nΒ²p)
  • Corollary 5.4.8 β‡’ β„™(G triangle-free) ≀ exp(βˆ’π”Ό[f(G)]Β²/nΒ²)
  • Improves the previous exponent when cΒ·n^{βˆ’1/2} ≀ p ≀ cβ€², for suitable constants c, cβ€² > 0
slide-81
SLIDE 81

Any questions?

slide-82
SLIDE 82

Β§6 Chromatic Number

Chapter 5: Concentration The Probabilistic Method

slide-83
SLIDE 83

Introducing the Problem

General bounds

  • What makes the chromatic number large?
  • Ο‡(G) β‰₯ Ο‰(G)
  • Ο‡(G) β‰₯ n/Ξ±(G)

Complexity

  • Computing the chromatic number of a graph is NP-hard
  • Even deciding whether a graph is 3-colourable is NP-complete

Typical behaviour

  • What can we say about Ο‡(G(n, 1/2))?

slide-84
SLIDE 84

Colouring Random Graphs

Question

  • What is Ο‡(G(n, 1/2))?

Applying general bounds

  • Homework: with high probability, Ο‰(G(n, 1/2)) ∼ 2 logβ‚‚ n
  • β‡’ Ο‡(G(n, 1/2)) β‰₯ (2 + o(1)) logβ‚‚ n
  • Symmetry β‡’ Ξ±(G(n, 1/2)) ∼ 2 logβ‚‚ n
  • β‡’ Ο‡(G(n, 1/2)) β‰₯ (1 + o(1))Β·n/(2 logβ‚‚ n)
  • Homework: will show this bound is sharp
slide-85
SLIDE 85

Lemma 5.6.1 The parameter Ο‡(G) is 1-vertex-Lipschitz.

Honing In

  • Can we further narrow down the likely values of Ο‡(G(n, 1/2))?

Proof

  • Let v ∈ V(G) be arbitrary, and let H = G[V βˆ– {v}]
  • Chromatic number is monotone increasing
  • β‡’ Ο‡(G) β‰₯ Ο‡(H)
  • Can always assign v a new colour
  • β‡’ Ο‡(G) ≀ Ο‡(H) + 1
  • β‡’ changing G at v can change Ο‡(G) by at most one ∎
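This proof can also be checked exhaustively on small graphs (our own sanity check): for every graph G on 4 vertices and every vertex v, Ο‡(G βˆ’ v) ≀ Ο‡(G) ≀ Ο‡(G βˆ’ v) + 1, which is exactly 1-vertex-Lipschitz-ness.

```python
# Exhaustive check of Lemma 5.6.1 on all graphs with 4 vertices.
from itertools import combinations, product

def chi(verts, es):                 # brute-force chromatic number
    for k in range(1, len(verts) + 1):
        for col in product(range(k), repeat=len(verts)):
            c = dict(zip(verts, col))
            if all(c[a] != c[b] for a, b in es):
                return k
    return len(verts)

verts = list(range(4))
possible = list(combinations(verts, 2))
ok = True
for bits in product((0, 1), repeat=len(possible)):
    es = [e for e, b in zip(possible, bits) if b]
    c = chi(verts, es)
    for v in verts:
        rest = [u for u in verts if u != v]
        cv = chi(rest, [e for e in es if v not in e])   # chi(G - v)
        ok = ok and (cv <= c <= cv + 1)
print(ok)   # True
```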

slide-86
SLIDE 86

Colouring with Martingales

Proof

  • Apply the vertex-exposure martingale to the parameter Ο‡(G)
  • Z_i = 𝔼[Ο‡(G) ∣ G[{1, …, i}]] βˆ’ 𝔼[Ο‡(G)], for 0 ≀ i ≀ n
  • Lemma 5.6.1: Ο‡(G) is 1-vertex-Lipschitz
  • Azuma’s Inequality: β„™(|Z_n| β‰₯ a) ≀ 2 exp(βˆ’aΒ²/2n)
  • If a = √(2n ln(2/Ξ΅)), the right-hand side is Ξ΅
  • β‡’ can take I_n = [ΞΌ βˆ’ a, ΞΌ + a], where ΞΌ = 𝔼[Ο‡(G)] ∎

Theorem 5.6.2 For Ξ΅ > 0 there is a constant C = C(Ξ΅) such that for every n there is an interval I_n βŠ† [n] of length C√n such that, for G ∼ G(n, 1/2),

β„™(Ο‡(G) βˆ‰ I_n) ≀ Ξ΅.

slide-87
SLIDE 87

Reflections on our Results

Narrow window

  • Previously saw that Ο‡(G) β‰₯ (1 + o(1))Β·n/(2 logβ‚‚ n) = n^{1βˆ’o(1)} almost surely
  • β‡’ a margin of error of O(√n) is relatively small
  • The theorem doesn’t say anything about where this interval is

Sparse random graphs

  • Never used that G ∼ G(n, 1/2)
  • Proof applies to G ∼ G(n, p) for any p = p(n)
  • However, the result is trivial for sparse graphs
  • e.g.: if p = o(1/n), then G is bipartite with high probability
  • If p ≀ c/√n, then with high probability Ξ”(G) ≀ C√n β‡’ Ο‡(G) ≀ C√n + 1

slide-88
SLIDE 88

Colouring Subgraphs of Sparse Graphs

Proof

  • If H is d-degenerate, then Ο‡(H) ≀ d + 1
  • β‡’ if Ο‡(G[S]) > 3 for some S, then G[S] is not 2-degenerate
  • β‡’ G contains some subgraph H with v(H) ≀ c√n and Ξ΄(H) β‰₯ 3
  • β‡’ e(H) β‰₯ (3/2)Β·v(H)
  • Hence it suffices to show G is unlikely to contain such a subgraph

Proposition 5.6.3 Fix Ξ² > 5/6 and c > 0. Then, if p = n^{βˆ’Ξ²} and G ∼ G(n, p), with high probability G has the property that, for every set S of c√n vertices, Ο‡(G[S]) ≀ 3.

slide-89
SLIDE 89

Subgraphs of Sparse Random Graphs are Sparse

Goal

  • Show that no subgraph H βŠ† G on t ≀ c√n vertices has e(H) β‰₯ (3/2)Β·t edges

Proof (cont’d)

  • Number of choices for V(H): C(n, t) ≀ (en/t)^t
  • Number of choices of the 3t/2 edges of H: C(C(t, 2), 3t/2) ≀ (C(t, 2)Β·e/(3t/2))^{3t/2} ≀ (te/3)^{3t/2}
  • β‡’ β„™(βˆƒ bad H on t vertices) ≀ (en/t)^t Β· (te/3)^{3t/2} Β· p^{3t/2} ≀ (eΒ·n^{1βˆ’3Ξ²/2}Β·t^{1/2})^t
  • Since t < c√n, this is at most (cβ€²Β·n^{5/4βˆ’3Ξ²/2})^t
  • As Ξ² > 5/6, the exponent of n is negative
  • β‡’ summing over all t, β„™(βˆƒ bad H) = o(1) ∎
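Both counting estimates in this proof are instances of the standard inequality C(a, b) ≀ (eΒ·a/b)^b; here is a quick numerical confirmation of it (our own, not in the slides).

```python
# Spot-check C(a, b) <= (e a / b)^b for the two uses above: choosing t of n
# vertices, and choosing 3t/2 of the C(t, 2) possible edges.
from math import comb, e

checks = []
for n, t in [(100, 4), (1000, 10), (10**4, 30)]:
    b = 3 * t // 2
    checks.append(comb(n, t) <= (e * n / t) ** t)                    # vertex sets
    checks.append(comb(comb(t, 2), b) <= (e * comb(t, 2) / b) ** b)  # edge sets
print(all(checks))   # True
```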

slide-90
SLIDE 90

Wow, Much Precise

Proof idea

  • Enough to focus on likely values of Ο‡(G)
  • Consider the smallest u such that β„™(Ο‡(G) ≀ u) = Ξ©(1)
  • Show that one can colour most vertices of G with u colours
  • Use Proposition 5.6.3 for the rest

Theorem 5.6.4 (Shamir–Spencer, 1987) Fix Ξ² > 5/6 and set p = n^{βˆ’Ξ²}. There is some u = u(n, p) such that if G ∼ G(n, p), then almost surely u ≀ Ο‡(G) ≀ u + 3.

slide-91
SLIDE 91

A Wise Choice of Graph Parameter

Proof

  • Suffices to show that for any Ξ΅ > 0, there is u = u(n, p, Ξ΅) such that

β„™(u ≀ Ο‡(G) ≀ u + 3) β‰₯ 1 βˆ’ 3Ξ΅

  • Define u = u(n, p, Ξ΅) to be the smallest integer such that β„™(Ο‡(G) ≀ u) β‰₯ Ξ΅
  • β‡’ β„™(Ο‡(G) ≀ u βˆ’ 1) < Ξ΅
  • Now wish to show that most vertices can be u-coloured
  • Define f(G) = minimum size of a set S βŠ† V(G) such that Ο‡(G[V βˆ– S]) ≀ u
  • β„™(f(G) = 0) = β„™(Ο‡(G) ≀ u) β‰₯ Ξ΅

Theorem 5.6.4 Fix Ξ² > 5/6 and set p = n^{βˆ’Ξ²}. There is some u = u(n, p) such that if G ∼ G(n, p), then almost surely u ≀ Ο‡(G) ≀ u + 3.

slide-92
SLIDE 92

Setting Up Azuma

Recall

  • u: least integer such that β„™(Ο‡(G) ≀ u) β‰₯ Ξ΅
  • f(G): minimum size of a set S such that Ο‡(G[V βˆ– S]) ≀ u

Lipschitz

  • Fix a vertex v ∈ V(G)
  • Choose a minimum set Sβ€² whose removal from G[V βˆ– {v}] leaves a graph with chromatic number at most u
  • Worst case: can always take S = Sβ€² βˆͺ {v}
  • β‡’ f is 1-vertex-Lipschitz

Martingale

  • Run the vertex-exposure martingale on f(G)
slide-93
SLIDE 93

Completing the Proof

Recall

  • f(G): minimum size of a set S such that Ο‡(G[V βˆ– S]) ≀ u; let ΞΌ = 𝔼[f(G)]
  • β„™(f(G) = 0) β‰₯ Ξ΅

Concentration

  • Azuma’s Inequality β‡’ β„™(f(G) ≀ ΞΌ βˆ’ a) ≀ exp(βˆ’aΒ²/2n)
  • β‡’ Ξ΅ ≀ β„™(f(G) = 0) ≀ exp(βˆ’ΞΌΒ²/2n)
  • β‡’ ΞΌ ≀ √(2n ln(1/Ξ΅))
  • Azuma’s Inequality β‡’ β„™(f(G) β‰₯ ΞΌ + a) ≀ exp(βˆ’aΒ²/2n)
  • β‡’ β„™(f(G) β‰₯ ΞΌ + √(2n ln(1/Ξ΅))) ≀ Ξ΅
  • β‡’ β„™(f(G) β‰₯ 2√(2n ln(1/Ξ΅))) ≀ Ξ΅

And voilΓ 

  • β‡’ with probability at least 1 βˆ’ 2Ξ΅, can remove c√n vertices and u-colour the rest
  • Proposition 5.6.3 β‡’ can 3-colour the removed vertices with probability 1 βˆ’ Ξ΅ ∎

slide-94
SLIDE 94

Epilogue

Location of interval

  • Again, the proof only shows concentration
  • The actual value of the chromatic number is not needed
  • Concern: didn’t our choice of u depend on Ξ΅?
  • Suppose u = u(n, p, Ξ΅) and uβ€² = u(n, p, Ξ΅β€²)
  • We proved β„™(Ο‡(G) ∈ [u, u + 3]) β‰₯ 1 βˆ’ Ξ΅ and β„™(Ο‡(G) ∈ [uβ€², uβ€² + 3]) β‰₯ 1 βˆ’ Ξ΅β€²
  • β‡’ β„™(Ο‡(G) ∈ [u, u + 3] ∩ [uβ€², uβ€² + 3]) β‰₯ 1 βˆ’ Ξ΅ βˆ’ Ξ΅β€²
  • β‡’ different u’s give an even stronger concentration inequality

Further results

  • Alon–Krivelevich (1997): if Ξ² > 1/2 and p = n^{βˆ’Ξ²}, there is some u = u(n, p) such that Ο‡(G(n, p)) ∈ {u, u + 1} with high probability
  • Heckel–Riordan (2020+): if I βŠ† [n] is an interval such that Ο‡(G(n, 1/2)) ∈ I with high probability, then |I| β‰₯ n^{1/2βˆ’o(1)}

slide-95
SLIDE 95

Any questions?