

  1. Advanced Algorithms (VI), Shanghai Jiao Tong University, Chihao Zhang, April 13, 2020

  2. Martingale. Let $\{X_t\}_{t \ge 0}$ be a sequence of random variables and let $\{\mathcal{F}_t\}_{t \ge 0}$ be a sequence of $\sigma$-algebras such that $\mathcal{F}_0 \subseteq \mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots$ (a filtration). A martingale is a sequence of pairs $\{X_t, \mathcal{F}_t\}_{t \ge 0}$ such that
     • for all $t \ge 0$, $X_t$ is $\mathcal{F}_t$-measurable;
     • for all $t \ge 0$, $E[X_{t+1} \mid \mathcal{F}_t] = X_t$.
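  As an illustration (not from the original slides), here is a minimal Python sketch that checks the defining property $E[X_{t+1} \mid \mathcal{F}_t] = X_t$ empirically for the simplest example, the partial sums of fair $\pm 1$ steps; the sample size and time indices are arbitrary choices.

      # Empirical check of the martingale property for X_t = Z_1 + ... + Z_t
      # with fair +/-1 steps.  For this walk, conditioning on X_5 alone is
      # enough, since the future depends on the past only through the
      # current value.
      import numpy as np

      rng = np.random.default_rng(0)
      steps = rng.choice([-1, 1], size=(200_000, 6))   # Z_1, ..., Z_6 per path
      X = steps.cumsum(axis=1)                         # X_1, ..., X_6 per path

      # E[X_6 | X_5 = x] should be (approximately) x for every reachable x.
      for x in np.unique(X[:, 4]):
          mask = X[:, 4] == x
          print(f"X_5 = {x:+d}:  E[X_6 | X_5] ~ {X[mask, 5].mean():+.3f}")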

  3. Stopping Time. A stopping time is a random variable $\tau \in \mathbb{N} \cup \{\infty\}$ such that $[\tau \le t]$ is $\mathcal{F}_t$-measurable for all $t$: "whether to stop can be determined by looking at the outcomes seen so far."
     • The first time a gambler wins five games in a row is a stopping time (see the sketch below).
     • The last time a gambler wins five games in a row is not.
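  As an illustrative sketch (not from the slides), the first example can be computed online from the prefix of outcomes seen so far, which is exactly what makes it a stopping time:

      # First time a gambler wins five games in a row: a stopping time,
      # because the decision to stop at time t uses only outcomes 1..t.
      import random

      def first_five_wins(results):
          streak = 0
          for t, win in enumerate(results, start=1):
              streak = streak + 1 if win else 0
              if streak == 5:
                  return t      # decided from the prefix seen so far
          return None           # tau = infinity on this sample path

      games = [random.random() < 0.5 for _ in range(1_000)]
      print(first_five_wins(games))
      # The *last* such time cannot be computed this way: deciding it
      # requires knowing that no later streak of five wins occurs.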

  4. A basic property of a martingale $\{X_t, \mathcal{F}_t\}_{t \ge 0}$ is that $E[X_t] = E[X_0]$ for any $t \ge 0$. Proof. For all $t \ge 1$, $E[X_t] = E[E[X_t \mid \mathcal{F}_{t-1}]] = E[X_{t-1}]$. Does $E[X_\tau] = E[X_0]$ hold for a (randomized) stopping time $\tau$? Not in general: let $\tau$ be the first time a gambler playing a fair game wins 100 dollars; then $X_\tau = 100$ with probability 1, while $E[X_0] = 0$.
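  A quick numerical illustration of the counterexample (a sketch, not from the slides; it uses a target of 10 instead of 100 and caps the number of steps, purely so the simulation finishes quickly):

      # Stop the fair walk the first time it is 10 ahead.  On (almost) every
      # path that stops, X_tau = 10, so E[X_tau] is far from E[X_0] = 0.
      import numpy as np

      rng = np.random.default_rng(1)
      target, max_steps, n_paths = 10, 200_000, 500
      stopped = []
      for _ in range(n_paths):
          walk = rng.choice([-1, 1], size=max_steps).cumsum()
          hit = np.argmax(walk >= target)      # first index at the target (0 if never)
          if walk[hit] >= target:
              stopped.append(walk[hit])        # always exactly +10
      print("paths that stopped:", len(stopped), "of", n_paths)
      print("average X_tau     :", np.mean(stopped))   # close to 10, not 0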

  5. Optional Stopping Theorem. For a stopping time $\tau$, $E[X_\tau] = E[X_0]$ holds if
     • $\Pr[\tau < \infty] = 1$,
     • $E[|X_\tau|] < \infty$,
     • $\lim_{t \to \infty} E[X_t \cdot 1_{[\tau > t]}] = 0$.

  6. The following conditions are stronger, but easier to verify:
     1. there is a fixed $n$ such that $\tau \le n$ a.s.;
     2. $\Pr[\tau < \infty] = 1$ and there is a fixed $M$ such that $|X_t| \le M$ for all $t \le \tau$;
     3. $E[\tau] < \infty$ and there is a fixed $c$ such that $|X_{t+1} - X_t| \le c$ for all $t < \tau$.
     OST applies when at least one of the above holds.

  7. Proof of the Optional Stopping Theorem

  8. Applications of OST

  9. Random Walk in 1-D. Let $Z_t \in \{-1, +1\}$ u.a.r. and $X_t = \sum_{i=1}^{t} Z_i$. The random walk stops when it hits $-a < 0$ or $b > 0$. Let $\tau$ be the time it stops; $\tau$ is a stopping time. What is $E[\tau]$?
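  Before deriving the answer, one can estimate it by simulation (a sketch, not from the slides; $a = 3$ and $b = 7$ are arbitrary example values):

      # Monte-Carlo estimate of E[tau] and of the probability of ending at -a.
      import random

      def absorbed_walk(a, b):
          x, t = 0, 0
          while -a < x < b:
              x += random.choice((-1, 1))
              t += 1
          return t, x

      a, b = 3, 7
      runs = [absorbed_walk(a, b) for _ in range(20_000)]
      print("estimated E[tau]    :", sum(t for t, _ in runs) / len(runs))
      print("estimated Pr[at -a] :", sum(x == -a for _, x in runs) / len(runs))
      # The next slides show these equal a*b = 21 and b/(a+b) = 0.7.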

  10. The random walk stops when one of the two ends is reached. We first determine $p_a$, the probability that the walk ends at $-a$, using OST: $E[X_\tau] = p_a \cdot (-a) + (1 - p_a) \cdot b = E[X_0] = 0$, which gives $p_a = \frac{b}{a+b}$.

  11. Now define a random variable $Y_t = X_t^2 - t$. Claim. $\{Y_t\}_{t \ge 0}$ is a martingale: $E[Y_{t+1} \mid \mathcal{F}_t] = E[(X_t + Z_{t+1})^2 - (t+1) \mid \mathcal{F}_t] = E[X_t^2 + 2 Z_{t+1} X_t + Z_{t+1}^2 - t - 1 \mid \mathcal{F}_t] = X_t^2 - t = Y_t$, since $E[Z_{t+1} \mid \mathcal{F}_t] = 0$ and $Z_{t+1}^2 = 1$.
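  The claim can also be checked numerically (an illustrative sketch, not from the slides): averaging $Y_6$ over paths that share the same $X_5$ should reproduce $Y_5 = X_5^2 - 5$.

      # Empirical check that Y_t = X_t^2 - t is a martingale.
      import numpy as np

      rng = np.random.default_rng(2)
      steps = rng.choice([-1, 1], size=(200_000, 6))
      X = steps.cumsum(axis=1)
      Y = X ** 2 - np.arange(1, 7)             # Y_1, ..., Y_6 per path
      for x in np.unique(X[:, 4]):
          mask = X[:, 4] == x
          print(f"X_5 = {x:+d}:  Y_5 = {x * x - 5:3d},  E[Y_6 | X_5] ~ {Y[mask, 5].mean():.3f}")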

  12. $\{Y_t\}$ satisfies the conditions for OST, so $E[Y_\tau] = E[X_\tau^2] - E[\tau] = E[Y_0] = 0$. On the other hand, we have $E[X_\tau^2] = p_a \cdot a^2 + (1 - p_a) \cdot b^2 = ab$. This implies $E[\tau] = ab$.
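  For instance, with the example values $a = 3$ and $b = 7$ used in the simulation sketch above, $p_a = \frac{7}{3+7} = 0.7$ and $E[\tau] = 3 \cdot 7 = 21$, matching the estimates.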

  13. Wald's Equation. Recall that in Week two we considered the sum $\sum_{i=1}^{N} X_i$, where the $X_i$ are independent with mean $E[X_i] = \mu$ and $N$ is a random variable. We are now ready to prove the general case!

  14. Assume $E[N]$ is finite and let $Y_t = \sum_{i=1}^{t} (X_i - \mu)$. Then $\{Y_t\}$ is a martingale and the stopping time $N$ satisfies the conditions for OST, so $E[Y_N] = E\left[\sum_{i=1}^{N} (X_i - \mu)\right] = E\left[\sum_{i=1}^{N} X_i\right] - E\left[\sum_{i=1}^{N} \mu\right] = E\left[\sum_{i=1}^{N} X_i\right] - E[N] \cdot \mu = 0$, i.e., $E\left[\sum_{i=1}^{N} X_i\right] = E[N] \cdot \mu$.
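  A numerical check of Wald's equation (a sketch, not from the slides; the distribution and the stopping rule are arbitrary choices): take $X_i \sim \mathrm{Exp}(1)$, so $\mu = 1$, and let $N$ be the first index $i$ with $X_i > 2$, a stopping time since it depends only on $X_1, \ldots, X_i$.

      # Check E[sum_{i<=N} X_i] = E[N] * mu for a stopping time N.
      import random

      def one_run():
          total, n = 0.0, 0
          while True:
              x = random.expovariate(1.0)   # mean mu = 1
              total += x
              n += 1
              if x > 2:                     # stop: decided from values seen so far
                  return total, n

      runs = [one_run() for _ in range(100_000)]
      avg_sum = sum(s for s, _ in runs) / len(runs)
      avg_n = sum(n for _, n in runs) / len(runs)
      print("E[sum X_i] ~", round(avg_sum, 3), "   E[N] * mu ~", round(avg_n, 3))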

  15. Waiting Time for Patterns. Fix a pattern $P =$ "00110". How many fair coins does one need to toss to see $P$ for the first time (in expectation)? The number can be calculated using OST (Shuo-Yen Robert Li, 李碩彥).

  16. Let the pattern be $P = p_1 p_2 \ldots p_k$. We draw a random string $B = b_1 b_2 b_3 \ldots$. Imagine that for each $j \ge 1$ there is a gambler $G_j$. At time $j$, $G_j$ bets 1 dollar on "$b_j = p_1$". If he wins, he bets 2 dollars on "$b_{j+1} = p_2$", and so on. He keeps doubling the money until he loses.

  17. The money of $G_j$ is a martingale (w.r.t. $B$). Let $X_t$ be the total money of all gamblers at time $t$; $\{X_t\}_{t \ge 1}$ is also a martingale. Let $\tau$ be the first time that we meet $P$ in $B$. Then $\tau$ and $\{X_t\}$ meet the conditions for OST, so $E[X_\tau] = 0$.

  18. Now we can compute the money of each $G_j$ at time $\tau$:
     • all gamblers before $G_{\tau - k + 1}$ must lose;
     • the gambler $G_{\tau - k + 1}$ wins $2^k - 1$;
     • can any other gambler win? A gambler $G_{\tau - j + 1}$ wins iff $p_1 p_2 \ldots p_j = p_{k-j+1} p_{k-j+2} \ldots p_k$, and if $G_{\tau - j + 1}$ wins, he wins $2^j - 1$.

  19. For any $P = p_1 p_2 \ldots p_k$ and $1 \le j \le k$, let $\chi_j$ be the indicator that $p_1 \ldots p_j = p_{k-j+1} \ldots p_k$. Then $X_\tau = -\left(\tau - \sum_{j=1}^{k} \chi_j\right) + \sum_{j=1}^{k} \chi_j \cdot (2^j - 1)$, where the first term is the contribution of the losers and the second the contribution of the winners. This implies $E[\tau] = \sum_{j=1}^{k} \chi_j \cdot 2^j$.
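  For the slide's pattern $P = 00110$, only $j = 1$ and $j = 5$ give $\chi_j = 1$, so $E[\tau] = 2^1 + 2^5 = 34$. Below is a sketch (not from the slides) that evaluates the formula for any pattern and compares it with a simulation:

      # E[tau] = sum of 2^j over the j with chi_j = 1, checked by simulation.
      import random

      def expected_wait(pattern):
          k = len(pattern)
          # chi_j = 1 iff the length-j prefix equals the length-j suffix
          return sum(2 ** j for j in range(1, k + 1) if pattern[:j] == pattern[k - j:])

      def simulate_wait(pattern):
          window, t = "", 0
          while not window.endswith(pattern):
              window = (window + random.choice("01"))[-len(pattern):]
              t += 1
          return t

      P = "00110"
      print("OST formula:", expected_wait(P))                    # 2 + 32 = 34
      n = 50_000
      print("simulation :", sum(simulate_wait(P) for _ in range(n)) / n)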

  20. Proof of OST: shown on the board. Read Chapter 8 of "Notes on Randomized Algorithms" for more details: https://arxiv.org/abs/2003.01902
