Chapter 5: Concentration
The Probabilistic Method, Summer 2020, Freie Universität Berlin
Chapter Overview
- Prove some strong concentration inequalities
- Improve bounds on Ramsey numbers
- Study Hamiltonicity and the chromatic number of random graphs
Homework exercise
Degrees in G(n,p): if, for every vertex v, ℙ[deg(v) < (n−1)p − a] ≤ 1/(2n), then by the union bound, with probability at least 1/2 we have δ(G(n,p)) ≥ (n−1)p − a.

Corollary 2.2.5: Let G be an n-vertex graph with δ(G) ≥ δ. Then G has a dominating set U ⊆ V(G) with |U| ≤ ((ln(δ+1) + 1)/(δ+1))·n.
Concentration inequalities
How large must the deviation a be before the failure probability drops to 1/(2n)?
- Markov: ℙ[X ≥ a] ≤ 𝔼[X]/a, giving 1/(2n) only for a = Ω(n)
- Chebyshev: ℙ[|X − 𝔼[X]| ≥ a] ≤ Var(X)/a², giving 1/(2n) for a = Ω(n√p)
- Chernoff-type: ℙ[|X − 𝔼[X]| ≥ a] ≤ exp(−c·a²/Var(X)), giving 1/(2n) already for a = Ω(√(np·log n))
Definition 5.1.1: Let X_n = Σ_{i=1}^n X_i, where the X_i are independent and uniformly distributed on {−1, 1}.

Asymptotics
By the Central Limit Theorem, X_n/√n is approximately standard normal, so deviations of order √n are typical.

Binomial connection
(X_n + n)/2 ∼ Bin(n, 1/2), i.e. X_n = 2·Bin(n, 1/2) − n.

Goal
Theorem 5.1.2 (Symmetric Chernoff Bound): For every a > 0, we have ℙ[X_n ≥ a] ≤ exp(−a²/(2n)).
Remarks
Since Bin(n, 1/2) = (X_n + n)/2, tail bounds for X_n translate directly to the binomial distribution:

Corollary 5.1.3: For every a > 0, we have ℙ[|Bin(n, 1/2) − n/2| ≥ a] ≤ 2·exp(−2a²/n).
Proof
For λ > 0, consider the moment generating function. We have e^{λX_n} = Π_{i=1}^n e^{λX_i}, so by independence
𝔼[e^{λX_n}] = Π_{i=1}^n 𝔼[e^{λX_i}] = ((e^λ + e^{−λ})/2)^n = cosh^n(λ).
Theorem 5.1.2 (Symmetric Chernoff Bound): For every a > 0, we have ℙ[X_n ≥ a] ≤ exp(−a²/(2n)).

Recall: 𝔼[e^{λX_n}] = cosh^n(λ).
A little calculus
From e^y = 1 + y + y²/2 + y³/6 + y⁴/24 + ⋯ we get
cosh(y) = (e^y + e^{−y})/2 = 1 + y²/2 + y⁴/24 + y⁶/720 + ⋯ ≤ 1 + y²/2 + y⁴/8 + y⁶/48 + ⋯ = e^{y²/2},
comparing term by term, since (2k)! ≥ 2^k·k!.
Finishing the proof
By Markov's inequality, for any λ > 0,
ℙ[X_n ≥ a] ≤ e^{−λa}·𝔼[e^{λX_n}] ≤ exp(½nλ² − λa).
Optimising with λ = a/n gives ℙ[X_n ≥ a] ≤ exp(−a²/(2n)). ∎
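Theorem 5.1.2 can be sanity-checked numerically: via the binomial connection, the exact tail of X_n is a binomial sum, and it should never exceed the Chernoff bound. The following Python sketch (function names are ours, not part of the notes) computes both sides exactly.

```python
from math import comb, exp, ceil

def rademacher_tail(n, a):
    """Exact P[X_n >= a] for X_n a sum of n uniform ±1 variables.

    Via X_n = 2*Bin(n, 1/2) - n: X_n >= a iff Bin(n, 1/2) >= (n + a)/2.
    """
    threshold = ceil((n + a) / 2)
    return sum(comb(n, k) for k in range(threshold, n + 1)) / 2 ** n

def chernoff_bound(n, a):
    """The bound exp(-a^2 / (2n)) from Theorem 5.1.2."""
    return exp(-a ** 2 / (2 * n))

# The exact tail never exceeds the Chernoff bound.
for n in [10, 50, 200]:
    for a in [2, 6, 10]:
        assert rademacher_tail(n, a) <= chernoff_bound(n, a)
```

For n = a = 10 the exact tail is 2^{−10} = 1/1024, while the bound gives exp(−5) ≈ 0.0067, illustrating that Chernoff is correct but not tight in the large-deviation regime.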
Shortcomings
The symmetric bound only covers uniform ±1 steps; in applications we need sums of general indicator variables.

Wider framework
Let p_1, …, p_n ∈ [0,1], and let the X_i be independent with ℙ[X_i = 1 − p_i] = p_i and ℙ[X_i = −p_i] = 1 − p_i. Set X = Σ_{i=1}^n X_i and p = (1/n)·Σ_{i=1}^n p_i.
Theorem 5.1.4 (Asymmetric Chernoff Bound): Let a > 0 and let X and p be as above. Then
ℙ[X ≤ −a] ≤ exp(−a²/(2pn)) and ℙ[X ≥ a] ≤ exp(−a²/(2pn) + a³/(2(pn)²)).
Special case
Taking a = δpn, the exponents become −δ²pn/2 and −(δ²/2 − δ³/2)·pn, both of the form −c_δ·pn.

Corollary 5.1.5: For every δ > 0 there is some c_δ > 0 such that, if X is the sum of mutually independent indicator random variables and μ = 𝔼[X], then ℙ[|X − μ| ≥ δμ] ≤ 2·exp(−c_δ·μ).
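The lower-tail half of Theorem 5.1.4 can be checked against the exact binomial distribution. A minimal sketch (our own helper names; exact tail computed with math.comb):

```python
from math import comb, exp, floor

def binomial_lower_tail(n, p, a):
    """Exact P[Bin(n, p) <= p*n - a]."""
    cutoff = floor(p * n - a)
    if cutoff < 0:
        return 0.0
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(cutoff + 1))

def chernoff_lower(n, p, a):
    """The lower-tail bound exp(-a^2 / (2*p*n)) from Theorem 5.1.4."""
    return exp(-a ** 2 / (2 * p * n))

# The theorem's bound dominates the exact tail in every tested regime.
for n, p in [(50, 0.1), (100, 0.3), (200, 0.5)]:
    for a in [1.0, 3.0, 8.0]:
        assert binomial_lower_tail(n, p, a) <= chernoff_lower(n, p, a)
```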
Ramsey numbers
Goal
Bounds on the Ramsey numbers R(3,k).

Upper bound
R(3,k) ≤ (k+1 choose 2) = O(k²).

Lower bounds
The deletion method gives R(3,k) = Ω((k/log k)^{3/2}); in this section we improve this to R(3,k) = Ω((k/log k)²).
Proof sketch
Optimising the deletion method yields the Ω((k/log k)^{3/2}) bound:

Theorem 2.1.2 (ℓ = 3): For every n, k ∈ ℕ and p ∈ [0,1], we have
R(3,k) > n − (n choose 3)·p³ − (n choose k)·(1 − p)^{(k choose 2)}.
Vertex removal vs. edge removal
Instead of deleting a vertex from each triangle and each large independent set, we delete edges.

Detriangulation
Fix a maximal collection of pairwise edge-disjoint triangles in G(n,p) and delete all of their edges; the resulting graph G′ is triangle-free, since any surviving triangle would be edge-disjoint from the collection, contradicting maximality.

Independent sets
An independent set in G′ must have lost all of its edges, but a set S of s vertices expects (s choose 2)·p edges.
Theorem 5.1.4 (Asymmetric Chernoff Bound, lower tail): Let a > 0 and let X and p be as before. Then ℙ[X ≤ −a] ≤ exp(−a²/(2pn)).
Local edge counts
For a fixed set S of s vertices, e(S) ∼ Bin((s choose 2), p). If S expects (s choose 2)·p edges, how likely are we to see at most half of that?

Applying Chernoff
Take X = e(S) − (s choose 2)·p and a = ½(s choose 2)·p:
ℙ[e(S) ≤ ½(s choose 2)p] ≤ exp(−⅛(s choose 2)p).
Recall
ℙ[e(S) ≤ ½(s choose 2)p] ≤ exp(−⅛(s choose 2)p) for each fixed s-set S.

Union bound
ℙ[some s-set spans at most ½(s choose 2)p edges] ≤ (n choose s)·exp(−⅛(s choose 2)p) ≤ exp(s·ln n − ⅛(s choose 2)p).
Setting parameters
We need s·ln n to be dominated by ⅛(s choose 2)p, say s·ln n ≤ (1/10)(s choose 2)p. Taking p = 20·ln n/(s − 1) achieves this, and then ½(s choose 2)p = 5s·ln n.

Recall
With p = 20·ln n/(s − 1): almost surely, every s-set spans at least 5s·ln n edges.
New independent sets
If S keeps at least one edge after detriangulation, it is not independent in G′.

How many edges do we lose?
Each deleted triangle removes at most three edges from S, and the deleted triangles are pairwise edge-disjoint. The expected number of triangles with an edge inside S is at most
((s choose 3) + (s choose 2)(n − s))·p³ ≈ (4000/3)·ln³n + 4000·((n − s)/s)·ln³n = O((n/s)·ln³n).

Recall
μ = O((n/s)·ln³n), summing ℙ[triangle present] over all triangles with an edge in S.
Setting more parameters
We want the number of edges deleted inside S to stay below 5s·ln n, for which it suffices (see the next slides) that C·(n/s)·ln³n ≤ s·ln n, i.e. s ≥ C′·√n·ln n. With s = Θ(√n·ln n) and α(G′) < s, solving k = s gives n = Θ((k/log k)²).
Large deviations
We must bound the probability that many triangles meet a fixed s-set, and the triangle events are not independent.

Lemma 5.2.1 (Erdős–Tetali, 1990): Let B_1, B_2, …, B_m be a collection of events and set μ = Σ_{i=1}^m ℙ[B_i]. For any t,
ℙ[B_{i_1} ∩ B_{i_2} ∩ ⋯ ∩ B_{i_t} holds for some mutually independent B_{i_1}, B_{i_2}, …, B_{i_t}] ≤ μ^t/t!.

Concentration inequalities
A naive union bound has m^t summands and is useless when the events are individually rare but numerous.

Saving grace
For independent events the probabilities multiply, and the t! accounts for the orderings.
Proof
ℙ[some t mutually independent events all hold]
 ≤ Σ_{{i_1,…,i_t} ind.} ℙ[B_{i_1} ∩ ⋯ ∩ B_{i_t}]
 = (1/t!)·Σ_{(i_1,…,i_t) ind., ordered} ℙ[B_{i_1} ∩ ⋯ ∩ B_{i_t}]
 = (1/t!)·Σ_{(i_1,…,i_t) ind., ordered} Π_{j=1}^t ℙ[B_{i_j}]
 ≤ (1/t!)·Σ_{(i_1,…,i_t) ∈ [m]^t} Π_{j=1}^t ℙ[B_{i_j}]
 = (1/t!)·(Σ_{i ∈ [m]} ℙ[B_i])^t = μ^t/t!. ∎
Lemma 5.2.1 (Erdős–Tetali, 1990), recall: with μ = Σ_{i=1}^m ℙ[B_i], the probability that some t mutually independent events among the B_i all hold is at most μ^t/t!.

Calculation
Since t! ≥ (t/e)^t, we have μ^t/t! ≤ (eμ/t)^t ≤ e^{−t} = e^{−s·ln n} = n^{−s} for t = s·ln n, provided μ ≤ t/e².
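The two estimates in this calculation, t! ≥ (t/e)^t and the resulting n^{−s} bound, are easy to confirm numerically; a quick sketch under the same parameter choices (the concrete n, s below are ours):

```python
from math import factorial, exp, log

# Stirling-type estimate: t! >= (t/e)^t, hence mu^t / t! <= (e*mu/t)^t.
for t in range(1, 60):
    assert factorial(t) >= (t / exp(1)) ** t

# With t = s*ln(n) and mu <= t / e^2, the bound (e*mu/t)^t equals e^{-t} = n^{-s}.
n, s = 10 ** 6, 40
t = s * log(n)
mu = t / exp(2)
assert (exp(1) * mu / t) ** t <= n ** (-s) * 1.01  # small slack for rounding
```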
Theorem 5.2.2 (Erdős, 1961; Krivelevich, 1995): As k → ∞, R(3,k) = Ω((k/log k)²).

Union bound
There are (n choose s) < n^s s-sets, and each fails with probability at most n^{−s}, so w.h.p. every s-set meets at most t = s·ln n pairwise edge-disjoint triangles, and hence loses fewer than 3t < 5s·ln n edges to the detriangulation.

Alteration
With n = c·(k/log k)², s = C′·√n·ln n ≤ k and p = 20·ln n/(s − 1), the detriangulated graph G′ is triangle-free with α(G′) < s ≤ k, so R(3,k) > n. ∎

Theorem 1.5.5 (Erdős–Szekeres, 1935): For all ℓ, k ∈ ℕ, R(ℓ, k) ≤ (ℓ+k−2 choose ℓ−1) = O(k^{ℓ−1}). In particular, R(3,k) = O(k²).
Lower bounds
R(3,k) = Ω((k/log k)²). How close is this to the O(k²) upper bound?

Narrowing the gap
Better upper bounds come from showing that triangle-free graphs have large independent sets.

Proposition 5.2.3: If G is an n-vertex triangle-free graph, then α(G) ≥ √n − 1.

Proof
If Δ(G) ≥ √n − 1, the neighbourhood of a maximum-degree vertex is independent (G is triangle-free), so α(G) ≥ √n − 1. Otherwise Δ(G) + 1 < √n, and the greedy algorithm gives α(G) ≥ n/(Δ(G)+1) > √n. ∎

Ramsey numbers
This recovers R(3,k) = O(k²) but no more; to do better we need a stronger bound on the independence number.

Theorem 5.2.4 (Ajtai–Komlós–Szemerédi, 1980; Shearer, 1995): If G is an n-vertex triangle-free graph with maximum degree Δ, then α(G) ≥ n·log Δ/(8Δ).
Proof (of Corollary 5.2.5)
Set n = 8k²/log k and let G be an n-vertex triangle-free graph. If Δ(G) > k, a maximum-degree neighbourhood is an independent set of size greater than k. Otherwise, since (log x)/x is decreasing, Theorem 5.2.4 gives
α(G) ≥ n·log Δ/(8Δ) ≥ n·log k/(8k) = k. ∎

Corollary 5.2.5: As k → ∞, R(3,k) ≤ 8k²/log k.
Randomness
To prove Theorem 5.2.4, let I be a uniformly random independent set in G. Writing Y_w = 1_{w ∈ I}, we have |I| = Σ_w Y_w, so 𝔼[|I|] = Σ_w 𝔼[Y_w] = Σ_w ℙ[w ∈ I].

Neighbourhoods
Local variables: for each vertex w, set X_w = Δ·1_{w ∈ I} + |N(w) ∩ I|.

Lemma 5.2.6: If Δ ≥ 16, we have 𝔼[X_w] ≥ (log Δ)/4 for every w.

Proof (of Theorem 5.2.4, given the lemma)
If Δ < 16, the greedy bound α(G) ≥ n/(Δ+1) already exceeds n·log Δ/(8Δ). Otherwise,
Σ_w X_w = Δ·|I| + Σ_{v ∈ I} deg(v) ≤ 2Δ·|I|,
so α(G) ≥ 𝔼[|I|] ≥ (1/(2Δ))·Σ_w 𝔼[X_w] ≥ (1/(2Δ))·n·(log Δ)/4 = n·log Δ/(8Δ). ∎
Theorem 5.2.4 (recall): If G is an n-vertex triangle-free graph with maximum degree Δ, then α(G) ≥ n·log Δ/(8Δ).

Proof (of Lemma 5.2.6)
Condition on everything outside {w} ∪ N(w): it suffices to show 𝔼[X_w ∣ I ∖ ({w} ∪ N(w)) = J] ≥ (log Δ)/4 for every independent set J avoiding {w} ∪ N(w).

Lemma 5.2.6 (recall): If Δ ≥ 16, we have 𝔼[X_w] ≥ (log Δ)/4 for every w.
Goal
𝔼[X_w ∣ J] ≥ (log Δ)/4, with logarithms base 2.

Available neighbours
Let A ⊆ N(w) be the set of neighbours of w with no neighbour in J, and set a = |A|.

Independent extensions
Given J, the independent set I consists of J together with either {w} or a subset of A. Since G is triangle-free, N(w) is independent, so all 2^a subsets of A are available, and I is uniform over these 2^a + 1 extensions.

Conditional expectation
With probability 1/(2^a + 1) we have w ∈ I, contributing Δ; otherwise I ∩ A is a uniform subset of A, contributing a/2 on average. Hence
𝔼[X_w ∣ J] = Δ/(2^a + 1) + a·2^{a−1}/(2^a + 1).
Contradiction
Suppose 𝔼[X_w ∣ J] < (log Δ)/4 for some J (if a = 0 the expectation is Δ/2, which is fine, so a ≥ 1). The first term gives Δ/(2^a + 1) < (log Δ)/4, i.e. 2^a + 1 > 4Δ/log Δ. The second term gives a·2^{a−1} < ((log Δ)/4)·(2^a + 1); since 2^{a−1} ≥ (2^a + 1)/3 for a ≥ 1, this forces a < (3/4)·log Δ, so 2^a + 1 ≤ Δ^{3/4} + 1. But 4Δ/log Δ > Δ^{3/4} + 1 for all Δ ≥ 16, a contradiction. ∎
Theorem 5.2.7 (Kim, 1995): As k → ∞, R(3,k) = Ω(k²/log k).

What we know
Ω(k²/log²k) = R(3,k) = O(k²/log k) from the results above.

Remarks
Combined with Corollary 5.2.5, Kim's theorem determines the order of magnitude: R(3,k) = Θ(k²/log k).
Hamiltonicity
Definition 5.3.1: A Hamiltonian cycle in a graph G is a cycle passing through every vertex.

Questions
When is a graph Hamiltonian, and can we decide this efficiently?

Theorem 5.3.2 (Karp, 1972): Deciding whether a graph is Hamiltonian is NP-complete.

Theorem 5.3.3 (Dirac, 1952): Every n-vertex graph G with minimum degree δ(G) ≥ n/2 is Hamiltonian.

Optimal bound
Dirac's theorem together with degree concentration gives the corollary below, but this is far from optimal for random graphs.

Corollary 5.3.4: For every ε > 0 and p ≥ 1/2 + ε, G(n,p) is Hamiltonian w.h.p.

Proposition 5.3.5: For every ε > 0 and p ≤ (1 − ε)·log n/n, G(n,p) is w.h.p. not Hamiltonian.
First moment
There are (n−1)!/2 = n^{(1+o(1))n} possible Hamiltonian cycles, each present with probability p^n, so the expected number is ((n−1)!/2)·p^n ≤ ((1+o(1))·np/e)^n. If p ≤ (1−ε)/n, this tends to 0, so G(n,p) has no Hamiltonian cycles w.h.p.
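The first-moment computation is easy to evaluate directly. The following sketch (our own illustration, with arbitrary concrete parameters) also shows its limitation: already at p = 10/n the expected count is enormous, even though the true Hamiltonicity threshold sits at log n/n.

```python
from math import factorial

def expected_hamilton_cycles(n, p):
    """E[number of Hamiltonian cycles in G(n,p)] = ((n-1)!/2) * p^n."""
    return (factorial(n - 1) / 2) * p ** n

n = 60
# Well below 1/n the first moment vanishes: no Hamiltonian cycle w.h.p.
assert expected_hamilton_cycles(n, 1 / (2 * n)) < 1e-12
# But at p = 10/n the expectation is already huge, even though such a
# graph w.h.p. has isolated vertices: expectation alone is not enough.
assert expected_hamilton_cycles(n, 10 / n) > 1e30
```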
Connectivity
For p ≤ (1 − ε)·log n/n, G(n,p) w.h.p. has isolated vertices, and hence no Hamiltonian cycle; this proves Proposition 5.3.5.

Proof (of Dirac's theorem, sketch)
Let P = w_0w_1…w_u be a longest path. All neighbours of w_0 and w_u lie on P, as otherwise the path could be extended. Since deg(w_0) + deg(w_u) ≥ n, some index i has w_{i+1} ∈ N(w_0) and w_i ∈ N(w_u), yielding a cycle on V(P). If this cycle were not Hamiltonian, connectivity (δ(G) ≥ n/2) would give a longer path, a contradiction. ∎
More than existential
For sparser p we must build the cycle rather than merely certify its existence.

Random setting
Definition 5.3.6 (Booster): Given a graph G, a booster is a potential edge e such that G + e contains a longer path than G, or a Hamiltonian cycle.

Goal
Show the graph always has many boosters, then add them one at a time using fresh randomness.

Rotations
Given a longest path P = w_0w_1…w_u and an edge w_uw_i, the path w_0…w_iw_uw_{u−1}…w_{i+1} is a new longest path with endpoint w_{i+1}: a rotation.

Lemma 5.3.7: Let P = w_0w_1…w_u be a longest path in a graph G, and let S be the set of endpoints obtainable from P by sequences of rotations fixing w_0. Then N_G(S) ⊆ N_P(S), the set of neighbours along P of vertices of S.

Corollary 5.3.8: Let P be a longest path in G, and let S be the set of endpoints obtainable by sequences of rotations. Then |N_G(S)| ≤ 2|S| − 1.

Proof
Each vertex of S has at most two neighbours along P, with one slot saved at the end of the path, so |N_G(S)| ≤ |N_P(S)| ≤ 2|S| − 1. ∎
Definition 5.3.9 (Expander): A graph G is a (k, 2)-expander if, for every S ⊆ V(G) with |S| ≤ k, we have |N_G(S)| ≥ 2|S|.

Proof (of Corollary 5.3.10)
If G is a (k, 2)-expander, the endpoint set S_0 of any longest path satisfies |S_0| > k: otherwise |N_G(S_0)| ≤ 2|S_0| − 1 would contradict expansion. Rotating from each of more than k endpoints produces more than k partners for each, and every such non-adjacent pair is a booster (an edge between endpoints of a longest path closes a cycle, which extends by connectivity). This gives at least ½k² boosters. ∎

Corollary 5.3.10: If G is a connected (k, 2)-expander, then G has at least ½k² boosters.
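The expander condition is concrete enough to brute-force on a small example. The sketch below (our own illustration, not from the notes) checks the Petersen graph: it is a (2, 2)-expander, but sets of three vertices along a path have only five external neighbours, so k = 3 fails.

```python
from itertools import combinations

# Petersen graph: outer 5-cycle, inner pentagram, spokes.
edges = (
    [(i, (i + 1) % 5) for i in range(5)]            # outer cycle
    + [(5 + i, 5 + (i + 2) % 5) for i in range(5)]  # inner pentagram
    + [(i, i + 5) for i in range(5)]                # spokes
)
adj = {v: set() for v in range(10)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def is_k2_expander(adj, k):
    """Check |N(S)| >= 2|S| for every set S with |S| <= k,
    where N(S) is the external neighbourhood of S."""
    for size in range(1, k + 1):
        for S in combinations(adj, size):
            nbhd = set().union(*(adj[v] for v in S)) - set(S)
            if len(nbhd) < 2 * size:
                return False
    return True

expands_2 = is_k2_expander(adj, 2)  # True
expands_3 = is_k2_expander(adj, 3)  # False: e.g. the path 0-1-2
```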
Assumptions
We aim for p only a constant factor above (1 + ε)·log n/n; below log n/n the graph is disconnected, so no smaller p can work.

Rotation-Extension process
Repeatedly: take a longest path, rotate to locate boosters, and add a booster to extend. After at most n boosters we obtain a Hamiltonian cycle.

Problem
Reusing the same random edges both to find boosters and to add them destroys independence.

Solution
Sprinkling: write 1 − p ≤ (1 − p₁)(1 − p₂), so that G(n,p) contains a union of independent copies G(n,p₁) ∪ G(n,p₂).
Proof
Suppose some set S of size t ≤ n/6 has |N(S)| < 2t: then there is a set T of size 2t with N(S) ⊆ T, and no edges from S to the n − 3t ≥ n/2 vertices outside S ∪ T. Union bound:
- at most (n choose t) ≤ (en/t)^t choices for S,
- at most (n choose 2t) ≤ (en/2t)^{2t} choices for T,
- ℙ[no edges from S to the outside] ≤ (1 − p)^{t·n/2} ≤ e^{−pnt/2}.
Hence
ℙ[not an (n/6, 2)-expander] ≤ Σ_{t=1}^{n/6} ((en/t)·(en/2t)²·e^{−pn/2})^t = Σ_{t=1}^{n/6} ((e³n³/(4t³))·e^{−pn/2})^t ≤ Σ_{t=1}^{n/6} ((e³/4)·n^{−1/2})^t = o(1),
using e^{−pn/2} ≤ n^{−7/2} for p ≥ 7·log n/n. ∎
Lemma 5.3.11: If p ≥ 7·log n/n, then G(n,p) is w.h.p. an (n/6, 2)-expander.

Proof (of Theorem 5.3.12)
Let p₁ = 7·log n/n and p₂ = 73·log n/n; then (1 − p₁)(1 − p₂) ≥ 1 − 80·log n/n ≥ 1 − p, so we may couple G(n,p) ⊇ G(n,p₁) ∪ G(n,p₂). W.h.p. G(n,p₁) is a connected (n/6, 2)-expander (connectivity holds since p₁ > log n/n), and adding edges preserves both properties, so whenever the current graph is not yet Hamiltonian it has at least (n/6)²/2 = n²/72 boosters. Sprinkle G(n,p₂) in n independent rounds, each edge appearing with probability q ≥ p₂/n. A round misses every booster with probability at most
(1 − q)^{n²/72} ≤ e^{−p₂n/72} = n^{−73/72} = o(1/n),
so w.h.p. every round adds a booster, and after at most n boosters the graph is Hamiltonian. ∎
Theorem 5.3.12 (Pósa, 1976): If p ≥ 80·log n/n, then G(n,p) is w.h.p. Hamiltonian.

Theorem 5.3.13 (Komlós–Szemerédi, 1983): For ε > 0 and p ≥ (1 + ε)·log n/n, G(n,p) is w.h.p. Hamiltonian.

Theorem 5.3.14 (Bollobás, 1984; Ajtai–Komlós–Szemerédi, 1985): In the random graph process, w.h.p. the graph becomes Hamiltonian precisely when the minimum degree reaches two.
Triangle counts and martingales
Triangular case
How concentrated is the number of triangles in G(n,p)?

Upper tail
Theorem 3.3.1 (recall): For ℓ ≥ 2, the threshold for K_ℓ ⊆ G(n,p) is p₀(n) = n^{−2/(ℓ−1)}.

Indicator random variables
For T ∈ ([n] choose 3), let X_T be the indicator that G[T] ≅ K_3; then X = Σ_T X_T counts triangles.

Stronger concentration
We would like exponential bounds, ideally ℙ[X = 0] ≤ exp(−c·(n choose 3)p³).

Lemma 5.4.1: There exists a family of ⅓(n−1 choose 2) pairwise edge-disjoint triangles in K_n.

Cheap fix
Restrict attention to a family 𝒯 of pairwise edge-disjoint triangles: their presences in G(n,p) are mutually independent events.

Proof (of Lemma 5.4.1, sketch)
Over ℤ_n, the triangles {x, y, z} with x + y + z ≡ 0 (mod n) are pairwise edge-disjoint, since any edge {x, y} determines z; for suitable n there are (1/n)·(n choose 3) = ⅓(n−1 choose 2) of them. ∎

Proof (of Corollary 5.4.2)
ℙ[G(n,p) triangle-free] ≤ ℙ[no triangle of 𝒯 present] = (1 − p³)^{|𝒯|} ≤ exp(−|𝒯|·p³). ∎
Good news
The bound is exponentially small.

Bad news
The exponent ⅓(n−1 choose 2)p³ = Θ(n²p³) is of lower order than the expected number of triangles, (n choose 3)p³ = Θ(n³p³).

Corollary 5.4.2: G(n,p) is triangle-free with probability at most exp(−⅓(n−1 choose 2)p³).

Improving the exponent
Can we make use of all (n choose 3) possible triangles, not just an edge-disjoint family?
Revisiting Chernoff
The Chernoff proofs never used full independence, only control of the exponential moments of the increments given the past.

Definition 5.4.3 (Martingale): A martingale is a sequence Z_0, Z_1, …, Z_n of random variables such that, for each 1 ≤ i ≤ n, we have 𝔼[Z_i ∣ Z_j : j < i] = Z_{i−1}. Loosely speaking, given what has previously transpired, we expect nothing to change in the i-th step.

Conditional independence
If 𝔼[e^{λ(Z_i − Z_{i−1})} ∣ Z_j : j < i] has the right properties, we can prove Chernoff-type bounds.
Boring mathsy example
If the X_i are independent with 𝔼[X_i] = 0, then Z_i = Σ_{j ≤ i} X_j is a martingale:
𝔼[Z_i ∣ Z_j : j < i] = 𝔼[Z_{i−1} + X_i ∣ Z_j : j < i] = Z_{i−1} + 𝔼[X_i] = Z_{i−1}.

Fun real-world example
A gambler repeatedly stakes b_i on a fair coin; their fortune Z_i satisfies
𝔼[Z_i ∣ Z_j : j < i] = ½(Z_{i−1} + b_i) + ½(Z_{i−1} − b_i) = Z_{i−1},
no matter how the stakes b_i depend on the history.

Disclaimer: gambling can be addictive and bad for your bank balance.
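The gambler example can be verified by exact enumeration: for any history-dependent staking rule, averaging over the next fair flip returns the previous fortune. A small sketch, using a hypothetical "double after every loss" strategy of our own choosing:

```python
from itertools import product

def fortune(flips, strategy):
    """Fortune after fair ±1 flips; the stake for each round is chosen
    by `strategy` from the history of earlier flips only."""
    z = 0.0
    for i in range(len(flips)):
        z += strategy(flips[:i]) * flips[i]
    return z

def double_after_loss(history):
    """Illustrative strategy: double the stake after every loss."""
    bet = 1.0
    for f in history:
        bet = 2 * bet if f == -1 else 1.0
    return bet

n = 6
for i in range(1, n + 1):
    for history in product([-1, 1], repeat=i - 1):
        z_prev = fortune(history, double_after_loss)
        avg = sum(fortune(history + (f,), double_after_loss)
                  for f in (-1, 1)) / 2
        # martingale property: E[Z_i | history] = Z_{i-1}
        assert abs(avg - z_prev) < 1e-12
```

Note that this strategy has unbounded stakes, which is exactly what the bounded-differences hypothesis of Azuma's inequality below rules out.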
Proof
Let Y_i = Z_i − Z_{i−1}, so 𝔼[Y_i ∣ Z_j : j < i] = 0 and |Y_i| ≤ 1. As before we bound 𝔼[e^{λZ_n}] = 𝔼[Π_{i=1}^n e^{λY_i}], peeling off the last factor via the conditional expectation 𝔼[e^{λY_n} ∣ Z_j : j < n].
Theorem 5.4.4 (Azuma's Inequality): Let Z_0, Z_1, …, Z_n be a martingale with Z_0 = 0 and |Z_i − Z_{i−1}| ≤ 1 for all 1 ≤ i ≤ n. Then, for any a > 0, we have ℙ[Z_n ≥ a] ≤ exp(−a²/(2n)).
Proof (of Lemma 5.4.5)
For y ∈ [−1, 1], convexity of the exponential gives the chord bound
e^{λy} ≤ (e^λ + e^{−λ})/2 + y·(e^λ − e^{−λ})/2.
Taking expectations and using 𝔼[Y] = 0:
𝔼[e^{λY}] ≤ (e^λ + e^{−λ})/2 = cosh(λ). ∎

Lemma 5.4.5: If λ > 0 and Y is a random variable with 𝔼[Y] = 0 and |Y| ≤ 1, then 𝔼[e^{λY}] ≤ cosh(λ).
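Lemma 5.4.5 is easy to test on concrete distributions; a minimal sketch over a few mean-zero laws supported on [−1, 1] (the particular distributions are our own examples):

```python
from math import exp, cosh

def mgf(values, probs, lam):
    """E[e^{lam * Y}] for a discrete random variable Y."""
    return sum(p * exp(lam * v) for v, p in zip(values, probs))

# Mean-zero distributions supported on [-1, 1].
distributions = [
    ([-1.0, 1.0], [0.5, 0.5]),           # extreme points: equality case
    ([-0.5, 0.5], [0.5, 0.5]),
    ([-1.0, 0.25], [0.2, 0.8]),
    ([-0.3, 0.6], [2 / 3, 1 / 3]),
]
for values, probs in distributions:
    assert abs(sum(p * v for v, p in zip(values, probs))) < 1e-12
    for lam in [0.1, 0.5, 1.0, 3.0]:
        assert mgf(values, probs, lam) <= cosh(lam) + 1e-12
```

The first distribution attains equality, showing the lemma is tight exactly when Y is uniform on {−1, 1}.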
Proof (cont'd)
Applying Lemma 5.4.5 to each conditional factor,
𝔼[e^{λZ_n}] = 𝔼[Π_i e^{λY_i}] ≤ cosh^n(λ) ≤ e^{nλ²/2}.
By Markov's inequality, ℙ[Z_n ≥ a] ≤ exp(λ²n/2 − λa); optimising with λ = a/n gives exp(−a²/(2n)). ∎
Theorem 5.4.4 (Azuma's Inequality, recall): Let Z_0, Z_1, …, Z_n be a martingale with Z_0 = 0 and |Z_i − Z_{i−1}| ≤ 1 for all 1 ≤ i ≤ n. Then, for any a > 0, we have ℙ[Z_n ≥ a] ≤ exp(−a²/(2n)).
Upper tail for triangles
We have X = Σ_{T ∈ ([n] choose 3)} X_T, with X_T the indicator that G[T] ≅ K_3.

Where is the martingale?
The X_T are neither independent nor mean-zero, so the partial sums of the X_T themselves do not form a martingale for any ordering of the triples T.
General framework

Revealing G
Expose the potential edges one at a time: order ([n] choose 2) as e_1, e_2, …, e_m, where m = (n choose 2), and after i steps record which of e_1, …, e_i are present.

The martingale
For a graph parameter f, define the Doob martingale Z_i = 𝔼[f(G) ∣ e_j : j ≤ i] − 𝔼[f(G)]. Averaging over the i-th edge recovers Z_{i−1}, so this is indeed a martingale, with Z_0 = 0 and Z_m = f(G) − 𝔼[f(G)].

Framework
[Worked example omitted: for p = ½ on a small graph, a table shows the conditional expectations 𝔼[f(G) ∣ e_j : j ≤ i] interpolating between 𝔼[f(G)] and f(G) as edges are revealed.]

Recall
Z_i depends only on the outcomes of e_1, …, e_i.

Conditional expectations
Each Z_i is computed by averaging f over all graphs H that agree with the edges revealed so far.
Definition 5.4.6 (c-Lipschitz): Let c > 0. A graph parameter f is c-(edge-)Lipschitz if, for any graphs G, G△e differing in a single edge e, |f(G) − f(G△e)| ≤ c.

Bounded differences
Fact 5.4.7: Given a c-Lipschitz parameter f, we have |M_i − M_{i−1}| ≤ 1 for the normalised Doob martingale
M_i = (1/c)·(𝔼[f(G) ∣ e_j : j ≤ i] − 𝔼[f(G)]).

Theorem 5.4.4 (Azuma's Inequality, recall): a martingale with Z_0 = 0 and increments bounded by 1 satisfies ℙ[Z_n ≥ a] ≤ exp(−a²/(2n)).

Remarks
Applying Azuma to ±M over the m = (n choose 2) ≤ n²/2 steps and unnormalising:

Corollary 5.4.8: Let f be a c-Lipschitz graph parameter, G ∼ G(n,p), μ = 𝔼[f(G)], and a > 0. Then ℙ[f(G) ≥ μ + a] ≤ exp(−a²/(c²n²)).
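Fact 5.4.7 can be verified exhaustively on a tiny example. The sketch below (entirely our own construction) builds the edge-exposure Doob martingale on the 4-vertex random graph with p = ½ for the maximum-matching size, a 1-edge-Lipschitz parameter, and checks both the martingale property and the bounded differences.

```python
from itertools import combinations, product

edge_list = list(combinations(range(4), 2))  # fixed exposure order, m = 6
m = len(edge_list)

def max_matching(present):
    """Maximum matching size, by brute force over edge subsets."""
    edges = [e for e, bit in zip(edge_list, present) if bit]
    for r in range(len(edges), 0, -1):
        for sub in combinations(edges, r):
            if len({v for e in sub for v in e}) == 2 * r:  # disjoint edges
                return r
    return 0

def doob(i, prefix):
    """E[f | first i edges revealed as prefix], uniform over the rest."""
    vals = [max_matching(prefix + rest)
            for rest in product([0, 1], repeat=m - i)]
    return sum(vals) / len(vals)

for omega in product([0, 1], repeat=m):
    for i in range(1, m + 1):
        prev = doob(i - 1, omega[:i - 1])
        cur = doob(i, omega[:i])
        assert abs(cur - prev) <= 1 + 1e-9   # Fact 5.4.7: c = 1
        avg = (doob(i, omega[:i - 1] + (0,))
               + doob(i, omega[:i - 1] + (1,))) / 2
        assert abs(avg - prev) < 1e-9        # martingale property
```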
Lower tails
Theorem 3.3.1 (recall): For ℓ ≥ 2, the threshold for K_ℓ ⊆ G(n,p) is p₀(n) = n^{−2/(ℓ−1)}.

Triangle-freeness
At the threshold p = Θ(1/n), the probability of triangle-freeness is a constant; above it we want exponential error bounds.

Exponential error bounds
Corollary 5.4.2 (recall): G(n,p) is triangle-free with probability at most exp(−⅓(n−1 choose 2)p³).

Corollary 5.4.8′: Let f be a c-Lipschitz graph parameter, G ∼ G(n,p), μ = 𝔼[f(G)], and a > 0. Then ℙ[f(G) ≤ μ − a] ≤ exp(−a²/(c²n²)).
Counting triangles
The triangle count X is (n−2)-edge-Lipschitz, so Corollary 5.4.8′ with a = μ = (n choose 3)p³ gives:

Corollary 5.5.1: G(n,p) is triangle-free with probability at most exp(−(n−1)²p⁶/36).

Corollary 5.5.2: Let f be a c-vertex-Lipschitz parameter, μ = 𝔼[f(G)], and a > 0. Then, for G ∼ G(n,p), using the n-step vertex-exposure martingale, ℙ[f(G) ≤ μ − a] ≤ exp(−a²/(2n·c²)).
Worse exponent
The exponent (1/36)(n−1)²p⁶ is worse than the ⅓(n−1 choose 2)p³ from before: we paid for all (n choose 2) exposure steps and for a large Lipschitz constant, n − 2.

Vertex-exposure martingale
One fix is to expose vertices instead of edges, cutting the number of steps to n (Corollary 5.5.2).

Corollary 5.4.8 (recall): for a c-Lipschitz parameter, ℙ[f(G) ≥ μ + a] ≤ exp(−a²/(c²n²)).

Reducing the Lipschitz constant
Better: change the parameter. Let D(G) be the maximum number of pairwise edge-disjoint triangles in G; this is 1-edge-Lipschitz, so the exponent improves to a²/n².
Proof (of Lemma 5.5.3)
Keep each triangle independently with probability δ, and delete one triangle from every surviving conflicting pair; what remains is pairwise edge-disjoint. The expected number remaining is at least
𝔼[#kept − #conflicting pairs kept] = δm − δ²q,
so some outcome achieves r ≥ δm − δ²q. ∎

Lemma 5.5.3: Let δ ∈ [0,1], and let G be a graph with m triangles and q pairs of triangles sharing an edge. Then G has a collection of r pairwise edge-disjoint triangles, for some r ≥ δm − δ²q.

Corollary 5.5.4: Let G ∼ G(n,p) for p ≥ 1/√(3n). Then 𝔼[D(G)] ≥ (1/36 − o(1))·n²p.
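Lemma 5.5.3 can be sanity-checked on a concrete graph. The sketch below (our own example on K_5) counts m and q, evaluates the bound at the optimal δ = m/(2q), and confirms that even a naive greedy selection of edge-disjoint triangles meets the guarantee.

```python
from itertools import combinations

def triangle_stats(n, edges):
    edges = set(map(frozenset, edges))
    tris = [t for t in combinations(range(n), 3)
            if all(frozenset(e) in edges for e in combinations(t, 2))]
    # pairs of distinct triangles sharing an edge (two common vertices)
    q = sum(1 for t1, t2 in combinations(tris, 2)
            if len(set(t1) & set(t2)) == 2)
    return tris, q

def greedy_edge_disjoint(tris):
    used, count = set(), 0
    for t in tris:
        tedges = {frozenset(e) for e in combinations(t, 2)}
        if not tedges & used:
            used |= tedges
            count += 1
    return count

n = 5
tris, q = triangle_stats(n, combinations(range(n), 2))  # K_5
m = len(tris)                # 10 triangles
delta = m / (2 * q)          # maximises delta*m - delta^2*q
bound = delta * m - delta ** 2 * q
assert greedy_edge_disjoint(tris) >= bound
```

On K_5 the bound evaluates to 5/6, and the greedy pass finds 2 edge-disjoint triangles, comfortably above it.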
Random graph setting
In G(n,p), Lemma 5.5.3 gives 𝔼[D] ≥ δ·𝔼[m] − δ²·𝔼[q], with D the largest edge-disjoint triangle family.

Choosing values
𝔼[m] = (n choose 3)p³ and 𝔼[q] = (n choose 2)(n−2 choose 2)p⁵ (choose the shared edge and the two apex vertices). Taking δ = 1/(3np²) ≤ 1 yields 𝔼[D] ≥ (1/36 − o(1))·n²p, proving Corollary 5.5.4.

Theorem 5.5.5: Let p ≥ 1/√(3n) and let G ∼ G(n,p). Then ℙ[K_3 ⊄ G] ≤ exp(−Ω(n²p²)).

Recall
D is 1-edge-Lipschitz with μ = 𝔼[D] = Ω(n²p), so Corollary 5.4.8′ with a = μ gives ℙ[D = 0] ≤ exp(−μ²/n²) = exp(−Ω(n²p²)).
The chromatic number
General bounds
For any graph G we have χ(G) ≥ n/α(G) and χ(G) ≥ ω(G).

Complexity
Computing χ(G) is NP-hard, so we settle for typical behaviour.

Typical behaviour
What is χ(G(n, ½))?

Question
How concentrated is χ(G(n, ½)) around its typical value?

Applying general bounds
α(G(n, ½)) ∼ 2·log₂ n, and w.h.p. α(G(n, ½)) ≤ (2 + o(1))·log₂ n, so
χ(G(n, ½)) ≥ (1 + o(1))·n/(2·log₂ n).
Lemma 5.6.1: The parameter χ(G) is 1-vertex-Lipschitz.
Proof
Rewiring a single vertex changes χ by at most 1: a proper colouring of one graph yields a proper colouring of the other after giving the rewired vertex a brand-new colour. ∎
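Lemma 5.6.1 can be checked exhaustively for small graphs. The sketch below (our own brute force) iterates over every graph on 4 vertices and every rewiring of a single vertex, confirming the chromatic number moves by at most 1.

```python
from itertools import combinations, product

def chromatic_number(n, adj):
    """Smallest k admitting a proper k-colouring (brute force)."""
    for k in range(1, n + 1):
        for col in product(range(k), repeat=n):
            if all(col[u] != col[v] for u, v in adj):
                return k
    return n

n = 4
pairs = list(combinations(range(n), 2))
for bits in product([0, 1], repeat=len(pairs)):
    adj = [e for e, b in zip(pairs, bits) if b]
    chi = chromatic_number(n, adj)
    others = lambda v: [e for e in adj if v not in e]
    for v in range(n):
        v_pairs = [e for e in pairs if v in e]
        # rewire vertex v in every possible way
        for rebits in product([0, 1], repeat=len(v_pairs)):
            new_adj = others(v) + [e for e, b in zip(v_pairs, rebits) if b]
            assert abs(chromatic_number(n, new_adj) - chi) <= 1
```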
Proof (of Theorem 5.6.2)
Consider the vertex-exposure martingale Z_i = 𝔼[χ(G) ∣ first i vertices] − 𝔼[χ(G)], 0 ≤ i ≤ n. By Lemma 5.6.1 and Azuma's inequality,
ℙ[|χ(G) − 𝔼[χ(G)]| ≥ a] ≤ 2·exp(−a²/(2n)).
Taking a = √(2n·ln(2/ε)), the right-hand side is ε. ∎

Theorem 5.6.2: For ε > 0 there is a constant C = C(ε) such that for every n there is an interval I_n ⊆ ℕ of length C·√n such that, for G ∼ G(n, ½),
ℙ[χ(G) ∉ I_n] ≤ ε.
Narrow window
χ(G(n, ½)) ≥ (1 + o(1))·n/(2·log₂ n) = n^{1−o(1)} almost surely, so the O(√n) window is relatively small compared to the (unknown) location of the interval.

Sparse random graphs
For smaller p the window shrinks further. (For instance, if p = o(1/n), then G is w.h.p. a forest, hence bipartite; more generally, in the sparse range one controls χ through degeneracy: if every subgraph has a vertex of degree at most D, then χ(G) ≤ D + 1.)

Proposition 5.6.3: Fix β > 5/6 and ε > 0. Then, if p = n^{−β} and G ∼ G(n,p), with high probability G has the property that, for every set U of at most C·√n vertices (C = C(ε)), χ(G[U]) ≤ 3.

Proof
If χ(G[U]) > 3 for some such U, then a minimal such U induces a subgraph that is not 2-degenerate: δ(G[U]) ≥ 3, and hence e(G[U]) ≥ (3/2)|U|.
Goal
Show that w.h.p. no set U of u ≤ C·√n vertices spans (3/2)u edges.

Proof (cont'd)
Union bound over u: there are (n choose u) ≤ (en/u)^u choices for U, and
ℙ[e(U) ≥ (3/2)u] ≤ ((u choose 2) choose (3/2)u)·p^{(3/2)u} ≤ ((eu/3)·p)^{(3/2)u}.
Hence
ℙ[some U fails] ≤ Σ_u (en/u)^u·((eup/3))^{(3/2)u} ≤ Σ_u (C′·n·u^{1/2}·p^{3/2})^u = Σ_u (C′·n^{1−3β/2}·u^{1/2})^u.
Since u ≤ C·√n, we have n^{1−3β/2}·u^{1/2} ≤ C^{1/2}·n^{5/4−3β/2}, and for β > 5/6 the exponent of n is negative, so the sum is o(1). ∎
Proof idea
Combine the vertex-exposure concentration with Proposition 5.6.3: we show ℙ[v ≤ χ(G) ≤ v + 3] ≥ 1 − 3ε for every ε > 0.

Theorem 5.6.4 (Shamir–Spencer, 1987): Fix β > 5/6 and set p = n^{−β}. There is some v = v(n, p) such that if G ∼ G(n,p), then almost surely v ≤ χ(G) ≤ v + 3.

Proof
Fix ε > 0 and let v be the least integer with ℙ[χ(G) ≤ v] > ε; by minimality, ℙ[χ(G) ≥ v] ≥ 1 − ε.
Recall
Let f(G) be the minimum size of a set W ⊆ V(G) such that χ(G − W) ≤ v, and let μ = 𝔼[f(G)].

Lipschitz
f is 1-vertex-Lipschitz: rewiring one vertex changes f by at most 1, since that vertex can always be added to W.

Martingale
By the vertex-exposure martingale and Azuma, ℙ[|f(G) − μ| ≥ a] ≤ 2·exp(−a²/(2n)).

Concentration
Take a = √(2n·ln(1/ε)). Since ℙ[f = 0] = ℙ[χ(G) ≤ v] > ε, we must have μ ≤ a, and so w.h.p. f(G) ≤ 2a = 2·√(2n·ln(1/ε)) ≤ C·√n.

And voilà
W.h.p. there is a set W with |W| ≤ C·√n and χ(G − W) ≤ v; by Proposition 5.6.3, χ(G[W]) ≤ 3, so colouring W with 3 fresh colours gives χ(G) ≤ v + 3. ∎
Location of interval
The proof is non-constructive: for each ε we get ℙ[χ(G) ∈ [v, v + 3]] ≥ 1 − ε without knowing where v = v(n,p) lies. For two thresholds ε, ε′ the intervals must overlap, since ℙ[χ(G) ∈ [v, v+3]] ≥ 1 − ε and ℙ[χ(G) ∈ [v′, v′+3]] ≥ 1 − ε′ give ℙ[both] ≥ 1 − ε − ε′.

Further results
- For β > 1/2 and p = n^{−β}, there is some v = v(n, p) such that χ(G(n,p)) ∈ {v, v + 1} with high probability: two-point concentration.
- In contrast, concentration fails for dense random graphs: if χ(G(n, ½)) lies in an interval I with high probability, then |I| = n^{1/2−o(1)}.