  1. Chapter 3: The Second Moment. The Probabilistic Method, Summer 2020, Freie Universität Berlin

  2. Chapter Overview
     • Introduce the second moment method
     • Survey applications in graph theory and number theory

  3. §1 Concentration Inequalities Chapter 3: The Second Moment The Probabilistic Method

  4. What Does the Expectation Mean?
     Basic fact
     • Both {X ≤ 𝔼[X]} and {X ≥ 𝔼[X]} have positive probability
     • Often we want more quantitative information:
       • What are these positive probabilities?
       • How far below/above the expectation can the random variable be?
     Limit laws
     • Law of large numbers: the average of independent trials tends to the expectation
     • Central limit theorem: the average is approximately normally distributed
     Not always applicable
     • We often have only a single instance, or lack independence
     • We can still make use of more general bounds

  5. Markov’s Inequality
     Theorem 3.1.1 (Markov’s Inequality)
     Let X be a non-negative random variable, and let a > 0. Then ℙ(X ≥ a) ≤ 𝔼[X]/a.
     Proof
     • Let f be the density function of the distribution of X
     • 𝔼[X] = ∫₀^∞ x f(x) dx = ∫₀^a x f(x) dx + ∫_a^∞ x f(x) dx
     • ≥ ∫_a^∞ x f(x) dx ≥ ∫_a^∞ a f(x) dx = a ∫_a^∞ f(x) dx = a ℙ(X ≥ a) ∎
     Moral: 𝔼[X] small ⇒ X typically small
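Markov's inequality is easy to check by simulation. A quick Monte Carlo sanity check in Python (the choice of an Exp(1) random variable, with 𝔼[X] = 1, is just for illustration):

```python
import random

# Monte Carlo sanity check of Markov's inequality, ℙ(X ≥ a) ≤ 𝔼[X]/a,
# for a non-negative random variable X ~ Exp(1).
random.seed(0)
trials = 100_000
samples = [random.expovariate(1.0) for _ in range(trials)]
mean = sum(samples) / trials  # empirical 𝔼[X], close to 1

a = 3.0
empirical = sum(x >= a for x in samples) / trials  # empirical ℙ(X ≥ a)
markov_bound = mean / a

assert empirical <= markov_bound
```

Here the true probability e⁻³ ≈ 0.05 is far below the bound 1/3, which matches the general fact that Markov's inequality is crude but always valid.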

  6. Chebyshev’s Inequality
     Converse? Does 𝔼[X] large ⇒ X typically large?
     • Not necessarily; e.g. X = n² with probability 1/n, and 0 otherwise
     • But such random variables have large variance…
     Theorem 3.1.2 (Chebyshev’s Inequality)
     Let X be a random variable, and let a > 0. Then ℙ(|X − 𝔼[X]| ≥ a) ≤ Var(X)/a².
     Proof
     • {|X − 𝔼[X]| ≥ a} = {(X − 𝔼[X])² ≥ a²}
     • Let Y = (X − 𝔼[X])²; then 𝔼[Y] = Var(X)
     • Apply Markov’s Inequality to Y ∎
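The counterexample on this slide can be made concrete. A short exact computation (variable names are my own) showing that for this distribution the mean is large while Chebyshev's bound is vacuous:

```python
# The heavy-tailed example from the slide: X = n² with probability 1/n,
# and X = 0 otherwise. 𝔼[X] = n is large, yet X = 0 almost always.
n = 1000
mean = n**2 / n                      # 𝔼[X] = n
second_moment = n**4 / n             # 𝔼[X²] = n³
variance = second_moment - mean**2   # Var(X) = n³ − n²

# Chebyshev gives nothing useful here: Var(X)/𝔼[X]² = n − 1 ≫ 1
chebyshev_bound = variance / mean**2
assert mean == n
assert chebyshev_bound == n - 1
```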

  7. Using Chebyshev
     Moral
     • 𝔼[X] large and Var(X) small ⇒ X typically large
     • Special case: showing X is nonzero
     Corollary 3.1.3
     If Var(X) = o(𝔼[X]²), then ℙ(X = 0) = o(1).
     Proof
     • {X = 0} ⊆ {|X − 𝔼[X]| ≥ 𝔼[X]}
     • Chebyshev ⇒ ℙ(|X − 𝔼[X]| ≥ 𝔼[X]) ≤ Var(X)/𝔼[X]² = o(1) ∎
     • In fact, in this case X = (1 + o(1))𝔼[X] with high probability
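A quick illustration of this concentration (my own choice of example): a binomial X ∼ Bin(N, q) satisfies the hypothesis, since Var(X) = Nq(1 − q) = o((Nq)²) when 𝔼[X] = Nq → ∞, so every sample should be close to its mean.

```python
import random

# Corollary 3.1.3 in action: X ~ Bin(N, q) with large mean is
# (1 + o(1))·𝔼[X] with high probability.
random.seed(3)
N, q = 10_000, 0.5
trials = 200
deviations = []
for _ in range(trials):
    x = sum(random.random() < q for _ in range(N))  # one Bin(N, q) sample
    deviations.append(abs(x / (N * q) - 1))         # relative deviation

assert max(deviations) < 0.05  # every sample within 5% of 𝔼[X]
```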

  8. Typical Application
     Set-up
     • E_i: events, occurring with probabilities p_i
     • X_i = 1_{E_i}: their indicator random variables
     • X = Σ_i X_i: their sum, the number of events that occur
     Goal
     • Show that with high probability, some event occurs
     Applying Chebyshev
     • Need to show Var(X) = o(𝔼[X]²)
     Expand the variance
     • Var(X) = Var(Σ_i X_i) = Σ_i Var(X_i) + Σ_{i≠j} Cov(X_i, X_j)

  9. Some Simplification
     Estimating the summands
     • Var(X) = Σ_i Var(X_i) + Σ_{i≠j} Cov(X_i, X_j)
     • Var(X_i) = p_i(1 − p_i) ≤ p_i
     • ∴ Σ_i Var(X_i) ≤ Σ_i p_i = Σ_i 𝔼[X_i] = 𝔼[X]
     • Cov(X, Y) = 𝔼[XY] − 𝔼[X]𝔼[Y]
     • Cov(X, Y) = 0 if X and Y are independent
     • Otherwise Cov(X_i, X_j) ≤ 𝔼[X_i X_j] = ℙ(E_i ∧ E_j)
     Corollary 3.1.4
     Let (E_i) be a sequence of events with probabilities p_i, and let X count the number of events that occur. Write i ∼ j if i ≠ j and the events E_i and E_j are not independent, and let Δ = Σ_{i∼j} ℙ(E_i ∧ E_j). If 𝔼[X] → ∞ and Δ = o(𝔼[X]²), then ℙ(X = 0) = o(1).
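As a worked example of Corollary 3.1.4 (the setting is my own choice): let X count triangles in G(n, p) with p = 10/n. Two distinct triangles are dependent exactly when they share an edge, in which case their union spans five edges, so each dependent pair contributes p⁵ to Δ. The ratio Δ/𝔼[X]² can be computed exactly with rational arithmetic:

```python
from fractions import Fraction
from math import comb

def second_moment_ratio(n):
    """Exact Δ/𝔼[X]² for triangle counts in G(n, p) with p = 10/n."""
    p = Fraction(10, n)
    mean = comb(n, 3) * p**3  # 𝔼[X] = C(n,3) p³
    # Ordered dependent pairs: pick the shared edge, then the two
    # remaining (distinct) apex vertices; both triangles need 5 edges.
    delta = comb(n, 2) * (n - 2) * (n - 3) * p**5
    return delta / mean**2

# The ratio is Θ(1/(n²p)) = Θ(1/n), so it vanishes as n grows:
assert second_moment_ratio(10_000) < second_moment_ratio(100) < 1
```

Since the ratio tends to 0 while 𝔼[X] → ∞, the corollary gives ℙ(X = 0) = o(1) in this range of p.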

  10. Any questions?

  11. §2 Thresholds Chapter 3: The Second Moment The Probabilistic Method

  12. Monotone Properties
     Graph properties
     • Say a graph property 𝒫 is monotone (increasing) if adding edges preserves 𝒫
     • e.g.: containing a fixed subgraph H ⊆ G, having α(G) < k, connectivity, …
     Lemma 3.2.1
     If 𝒫 is a monotone increasing graph property, then ℙ(G(n, p) ∈ 𝒫) is monotone increasing in p.
     Proof (coupling)
     • Sampling G(n, p):
       • Assign to each pair of vertices {u, v} an independent uniform U_{u,v} ∼ Unif[0, 1]
       • Add the edge {u, v} to G iff U_{u,v} ≤ p
       • Each edge then appears independently with probability p
     • If p ≤ p′, then G(n, p) ⊆ G(n, p′) under this coupling ⇒ if G(n, p) ∈ 𝒫, then G(n, p′) ∈ 𝒫 ∎
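The coupling in this proof translates directly into code. A sketch (function and variable names are my own) that samples both graphs from one set of uniforms, so containment holds sample by sample:

```python
import random
from itertools import combinations

def coupled_graphs(n, p_small, p_large, rng):
    """Sample G(n, p_small) and G(n, p_large) from the same uniforms
    U_{u,v}, so the smaller-p graph is always a subgraph of the other."""
    u = {e: rng.random() for e in combinations(range(n), 2)}
    g_small = {e for e, x in u.items() if x <= p_small}
    g_large = {e for e, x in u.items() if x <= p_large}
    return g_small, g_large

rng = random.Random(0)
g1, g2 = coupled_graphs(20, 0.2, 0.5, rng)
assert g1 <= g2  # the coupling guarantees containment, not just in distribution
```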

  13. Thresholds
     Transitions
     • A monotone property 𝒫 is nontrivial if it is not satisfied by the edgeless graph and is satisfied by the complete graph
     • ⇒ ℙ(G(n, 0) ∈ 𝒫) = 0 and ℙ(G(n, 1) ∈ 𝒫) = 1
     • Lemma 3.2.1 ⇒ ℙ(G(n, p) ∈ 𝒫) increases from 0 to 1 as p does
     • How quickly does this increase happen?
     Definition 3.2.2 (Thresholds)
     Given a nontrivial monotone graph property 𝒫, p₀(n) is a threshold for 𝒫 if
     ℙ(G(n, p) ∈ 𝒫) → 0 if p ≪ p₀(n), and → 1 if p ≫ p₀(n).

  14. A Cyclic Example
     Proposition 3.2.3
     The threshold for G(n, p) to contain a cycle is p₀(n) = 1/n.
     Proof (lower bound)
     • Let X = # cycles in G(n, p)
     • For ℓ ≥ 3, let X_ℓ = # copies of C_ℓ in G(n, p)
     • ⇒ X = Σ_{ℓ=3}^n X_ℓ
     • Linearity of expectation: 𝔼[X_ℓ] ≤ n^ℓ p^ℓ
     • ⇒ 𝔼[X] ≤ Σ_{ℓ=3}^n (np)^ℓ < (np)³ Σ_{ℓ=0}^∞ (np)^ℓ = (np)³/(1 − np)
     • ⇒ 𝔼[X] = o(1) if p ≪ 1/n
     • Markov: ℙ(G(n, p) has a cycle) = ℙ(X ≥ 1) ≤ 𝔼[X] → 0 ∎

  15. Cycles Continued
     Proposition 3.2.3
     The threshold for G(n, p) to contain a cycle is p₀(n) = 1/n.
     Proof (upper bound)
     • Let p = 4/(n − 1) and set Y = e(G(n, p)), the number of edges
     • Then Y ∼ Bin(C(n, 2), p)
     • ⇒ 𝔼[Y] = C(n, 2) p = 2n
     • ⇒ Var(Y) = C(n, 2) p(1 − p) < 2n
     • ∴ Var(Y) = o(𝔼[Y]²)
     • Chebyshev: ℙ(Y < n) → 0
     • Any graph with n vertices and at least n edges contains a cycle
     • ⇒ ℙ(G(n, p) has a cycle) ≥ ℙ(e(G(n, p)) ≥ n) → 1 ∎
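The transition at 1/n shows up clearly in simulation. A sketch using union-find cycle detection (all parameters here are illustrative choices, not from the proof):

```python
import random
from itertools import combinations

def has_cycle(n, p, rng):
    """Sample G(n, p) edge by edge; union-find detects the first cycle."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in combinations(range(n), 2):
        if rng.random() <= p:
            ru, rv = find(u), find(v)
            if ru == rv:
                return True  # edge inside a component closes a cycle
            parent[ru] = rv
    return False

rng = random.Random(1)
n, trials = 100, 200
below = sum(has_cycle(n, 0.1 / n, rng) for _ in range(trials)) / trials
above = sum(has_cycle(n, 10 / n, rng) for _ in range(trials)) / trials
assert below < 0.2 < 0.8 < above  # rare below 1/n, near-certain above it
```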

  16. Existence of Thresholds
     Theorem 3.2.4 (Bollobás-Thomason, 1987)
     Every nontrivial monotone graph property has a threshold.
     Proof (upper bound)
     • Let p₀ = p₀(n) be such that ℙ(G(n, p₀) ∈ 𝒫) = 1/2
     • Let G ∼ G₁ ∪ G₂ ∪ ⋯ ∪ G_m, where the G_i ∼ G(n, p₀) are independent
     • ⇒ G ∼ G(n, p) for p ≔ 1 − (1 − p₀)^m ≤ m p₀
     • Property is monotone:
       • ℙ(G ∈ 𝒫) ≥ ℙ(some G_i ∈ 𝒫) = 1 − ℙ(every G_i ∉ 𝒫)
     • Graphs are independent:
       • ℙ(every G_i ∉ 𝒫) = Π_i ℙ(G_i ∉ 𝒫)
     • Since G_i ∼ G(n, p₀), ℙ(G_i ∉ 𝒫) = 1/2
     • ∴ ℙ(G ∈ 𝒫) ≥ 1 − 2^{−m} → 1 as m → ∞ (i.e. if p ≫ p₀) ∎
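The one inequality the proof relies on, 1 − (1 − p₀)^m ≤ m·p₀ (a consequence of Bernoulli's inequality), can be spot-checked numerically:

```python
# The union of m independent copies of G(n, p0) is G(n, r) with
# r = 1 − (1 − p0)^m; check that r ≤ m·p0 on a grid of values.
for m in range(1, 20):
    for p0 in (0.001, 0.01, 0.1, 0.5):
        r = 1 - (1 - p0) ** m
        assert r <= m * p0 + 1e-12  # tolerance for float rounding
```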

  17. Below the Threshold
     Theorem 3.2.4 (Bollobás-Thomason, 1987)
     Every nontrivial monotone graph property has a threshold.
     Proof (lower bound)
     • Let G ∼ G₁ ∪ G₂ ∪ ⋯ ∪ G_m as before, but with G_i ∼ G(n, p) for p = p₀/m
     • ⇒ G ∼ G(n, q) for q = 1 − (1 − p)^m ≤ mp = p₀
     • ⇒ ℙ(G ∉ 𝒫) ≥ 1/2
     • Monotonicity: if G ∉ 𝒫, then every G_i ∉ 𝒫, so as before ℙ(G ∉ 𝒫) ≤ ℙ(G(n, p) ∉ 𝒫)^m
     • ⇒ ℙ(G(n, p) ∉ 𝒫) ≥ (1/2)^{1/m}
     • ⇒ ℙ(G(n, p) ∈ 𝒫) ≤ 1 − (1/2)^{1/m} → 0 as m → ∞ (i.e. if p ≪ p₀) ∎

  18. Closing Remarks
     Random graph theory
     • Fundamental problem: given a graph property 𝒫, what is its threshold?
     At the threshold
     • We showed what happens for probabilities much smaller than the threshold, and much larger than the threshold
     • What if p = Θ(p₀(n))? Some properties have a much quicker transition
     Definition 3.2.5 (Sharp thresholds)
     We say p₀(n) is a sharp threshold for 𝒫 if, for all constants 0 < c₁ < 1 < c₂,
     ℙ(G(n, p) ∈ 𝒫) → 0 if p ≤ c₁ p₀(n), and → 1 if p ≥ c₂ p₀(n).

  19. Any questions?

  20. §3 Subgraphs of G(n, p) Chapter 3: The Second Moment The Probabilistic Method

  21. Returning to Ramsey
     Theorem 1.5.7
     Given ℓ, k, n ∈ ℕ and p ∈ (0, 1), if C(n, ℓ) p^{C(ℓ,2)} + C(n, k) (1 − p)^{C(k,2)} < 1, then R(ℓ, k) > n.
     Choosing parameters
     • Want to choose n as large as possible
     • Need to avoid large independent sets ⇒ would like to make the edge probability p large
     • Limitation: need to avoid K_ℓ
     Question: What is the threshold for K_ℓ ⊆ G(n, p)?

  22. A Lower Bound
     Goal
     • Let X count the number of copies of K_ℓ in G(n, p)
     • For which p do we have ℙ(X ≥ 1) = o(1)?
     First moment
     • 𝔼[X] = C(n, ℓ) p^{C(ℓ,2)} = Θ(n^ℓ p^{C(ℓ,2)})
     • Markov’s Inequality: ℙ(X ≥ 1) ≤ 𝔼[X]
     Threshold bound
     • 𝔼[X] = Θ(n^ℓ p^{C(ℓ,2)}) ≪ 1
     • ⇔ p^{C(ℓ,2)} ≪ n^{−ℓ} ⇔ p ≪ n^{−2/(ℓ−1)}
     • ⇒ p₀(n) ≥ n^{−2/(ℓ−1)}
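For ℓ = 3 the bound says triangles should not appear when p ≪ n⁻²⁄⁽ℓ⁻¹⁾ = n⁻¹. A small simulation around that value (parameters and function names are my own choices):

```python
import random
from itertools import combinations

def has_triangle(n, p, rng):
    """Sample G(n, p) and check whether it contains a copy of K₃."""
    adj = [set() for _ in range(n)]
    for u, v in combinations(range(n), 2):
        if rng.random() <= p:
            adj[u].add(v)
            adj[v].add(u)
    # A triangle is an edge {u, v} plus a common neighbour.
    return any(adj[u] & adj[v] for u in range(n) for v in adj[u] if u < v)

rng = random.Random(2)
n, trials = 100, 100
low = sum(has_triangle(n, 0.1 / n, rng) for _ in range(trials)) / trials
high = sum(has_triangle(n, 10 / n, rng) for _ in range(trials)) / trials
assert low < 0.2   # well below n^(-1): triangles are rare
assert high > 0.8  # well above n^(-1): triangles are near-certain
```

The matching statement above the threshold (that K_ℓ does appear when p ≫ n⁻²⁄⁽ℓ⁻¹⁾) is where the second moment method of §1 comes in.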
