Randomness in Computing, Lecture 24



1. Randomness in Computing, Lecture 24
Last time:
• Probabilistic method
• Algorithmic LLL
• Applications of LLL
Today:
• Markov chains
4/21/2020 Sofya Raskhodnikova; Randomness in Computing; based on slides by Baranasuriya et al.

2. Drunkard's walk problem
Tipsy the drunkard walks on positions 0, 1, …, n: home is at position n, the river at position 0. From position k, he steps to k+1 or k−1, each with probability 1/2, until he reaches 0 or n.
• q_k = Pr[Tipsy goes home | he started at position k]
• q_n = 1, q_0 = 0

3. Drunkard's walk: probability
• q_k = Pr[Tipsy goes home | he started at position k]
• q_n = 1, q_0 = 0

4. Drunkard's walk: probability
• q_k = Pr[Tipsy goes home | he started at position k]
• q_n = 1, q_0 = 0
• for k ∈ [1, n−1]: q_k = q_{k−1}/2 + q_{k+1}/2
• Solution: q_k = k/n

5. Drunkard's walk: probability
• Pr[Tipsy goes home | he started at position k] = k/n
• Pr[Tipsy falls into the river | he started at position k] = (n−k)/n

6. Drunkard's walk: expected time
• t_k = expected number of steps to finish the walk, starting at position k
• t_0 = 0, t_n = 0
• for k ∈ [1, n−1]: t_k = 1 + t_{k−1}/2 + t_{k+1}/2
• Solution: t_k = k(n−k)
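Both closed forms (q_k = k/n and t_k = k(n−k)) can be checked by simulation. A minimal Monte Carlo sketch; the choices n = 10, k = 3, the trial count, and the seed are arbitrary:

```python
import random

random.seed(1)  # fixed seed so results are reproducible

def drunkards_walk(k, n):
    """Simulate one walk from position k; return (went_home, steps)."""
    steps = 0
    while 0 < k < n:
        k += random.choice((-1, 1))  # step left or right with probability 1/2
        steps += 1
    return k == n, steps

def estimate(k, n, trials=50_000):
    """Estimate Pr[goes home] and the expected number of steps from k."""
    home = 0
    total_steps = 0
    for _ in range(trials):
        went_home, steps = drunkards_walk(k, n)
        home += went_home
        total_steps += steps
    return home / trials, total_steps / trials

n, k = 10, 3
p_home, avg_steps = estimate(k, n)
print(p_home, avg_steps)  # close to q_k = k/n = 0.3 and t_k = k(n-k) = 21
```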

7. Markov Chains
• A (discrete-time) stochastic process is a (finite or countably infinite) collection of random variables X_0, X_1, X_2, … representing the evolution of some random process over time.
• A discrete-time stochastic process is a Markov chain if ∀t ≥ 1 and ∀ values a_0, a_1, …, a_t,
  Pr[X_t = a_t | X_{t−1} = a_{t−1}, X_{t−2} = a_{t−2}, …, X_0 = a_0] = Pr[X_t = a_t | X_{t−1} = a_{t−1}]   (Markov property, or memoryless property)
  = P_{a_{t−1}, a_t}   (time-homogeneous property)

8. Terminology
• state space: the set of values the RVs can take, e.g. {0, 1, 2, …}
• X_1, X_2, …: states visited by the chain
• P_{a_{t−1}, a_t}: transition probability from a_{t−1} to a_t
Memoryless property:
• X_t depends on X_{t−1}, but not on how the process arrived at state X_{t−1}.
• It does not imply that X_t is independent of X_0, …, X_{t−2} (only that this dependency is captured by X_{t−1}).

9. Representation: directed weighted graph
• Set of vertices = state space
• Directed edge (i, j) iff P_{i,j} > 0; the edge weight is P_{i,j}
[Figure: the four-state example chain (states 0-3) drawn as a weighted directed graph.]

10. Representation: Transition Matrix
• Entry P_{i,j} in matrix P is the transition probability from i to j
• For every row i, the sum Σ_{j≥0} P_{i,j} = 1

    P = | 0    1/4  0    3/4 |
        | 1/2  0    1/3  1/6 |
        | 0    0    1    0   |
        | 0    1/2  1/4  1/4 |
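Operationally, row i of P is the distribution of the next state given current state i. A sketch that samples a trajectory from this matrix (plain Python; `random.choices` does the weighted draw):

```python
import random

# Transition matrix of the four-state example; each row sums to 1.
P = [
    [0,   1/4, 0,   3/4],
    [1/2, 0,   1/3, 1/6],
    [0,   0,   1,   0],
    [0,   1/2, 1/4, 1/4],
]

def step(state):
    """Sample the next state from row `state` of P."""
    return random.choices(range(len(P)), weights=P[state])[0]

def trajectory(start, length):
    """Sample `length` transitions starting from state `start`."""
    states = [start]
    for _ in range(length):
        states.append(step(states[-1]))
    return states
```

Note that state 2 is absorbing (P_{2,2} = 1): once a trajectory enters state 2, it stays there.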

11. Distribution of states
• Let p_i(t) be the probability that the process is at state i at time t. By the Law of Total Probability,
  p_i(t) = Σ_{j≥0} p_j(t−1) · P_{j,i}
• Let p̄(t) = (p_0(t), p_1(t), …) be the vector giving the distribution of the chain at time t. Then
  p̄(t) = p̄(t−1) · P
• For all m ≥ 0, we define the m-step transition probability
  P^(m)_{i,j} = Pr[X_{t+m} = j | X_t = i]
• Conditioning on the first transition from i, by the Law of Total Probability,
  P^(m)_{i,j} = Σ_{k≥0} P_{i,k} · P^(m−1)_{k,j}
4/21/2020 Sofya Raskhodnikova; Randomness in Computing

12. Distribution of states at time t + m
  P^(m)_{i,j} = Σ_{k≥0} P_{i,k} · P^(m−1)_{k,j}
• Let P^(m) be the matrix whose entry (i, j) is the m-step transition probability P^(m)_{i,j}. Then
  P^(m) = P · P^(m−1)
  By induction on m, P^(m) = P^m (the m-th power of P).
• For all t ≥ 0 and m ≥ 1,
  p̄(t + m) = p̄(t) · P^m
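The identities P^(m) = P^m and p̄(t+m) = p̄(t) · P^m are easy to check with exact arithmetic. A sketch using the standard-library `fractions` module (the helper names are mine, not from the slides):

```python
from fractions import Fraction

def mat_mul(A, B):
    """Multiply two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, m):
    """Compute P^m by repeated multiplication (m >= 0)."""
    n = len(P)
    R = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]  # identity
    for _ in range(m):
        R = mat_mul(R, P)
    return R

def distribution_after(p, P, m):
    """p(t+m) = p(t) * P^m for a row vector p."""
    Pm = mat_pow(P, m)
    return [sum(p[i] * Pm[i][j] for i in range(len(p))) for j in range(len(p))]
```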

13. Example
What is the probability of going from state 0 to state 3 in exactly three steps?

    P = | 0    1/4  0    3/4 |
        | 1/2  0    1/3  1/6 |
        | 0    0    1    0   |
        | 0    1/2  1/4  1/4 |

14. Example
What is the probability of going from state 0 to state 3 in exactly three steps? The only three-step paths from 0 to 3 with nonzero probability are:
• 0-1-0-3
• 0-1-3-3
• 0-3-1-3
• 0-3-3-3

15. Example
• We calculate the probability of the four events:
  0-1-0-3 | Pr = 3/32
  0-1-3-3 | Pr = 1/96
  0-3-1-3 | Pr = 1/16
  0-3-3-3 | Pr = 3/64
• Since they are mutually exclusive, the total probability is
  Pr = 3/32 + 1/96 + 1/16 + 3/64 = 41/192

16. Example
Alternatively, we can calculate P^3 and read off entry (0, 3):

    P^3 = | 3/16  7/48   29/64    41/192 |
          | 5/48  5/24   79/144   5/36   |
          | 0     0      1        0      |
          | 1/16  13/96  107/192  47/192 |

17. Example 2
What is the probability of ending up in state 3 after three steps if we start in a uniformly random state?
Solution:
• Calculate (1/4, 1/4, 1/4, 1/4) · P^3 = (17/192, 47/384, 737/1152, 43/288)
• Answer: the entry for state 3, namely 43/288
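Both examples can be reproduced with exact rational arithmetic; a self-contained sketch in plain Python:

```python
from fractions import Fraction as F

# Transition matrix of the four-state example.
P = [
    [F(0),    F(1, 4), F(0),    F(3, 4)],
    [F(1, 2), F(0),    F(1, 3), F(1, 6)],
    [F(0),    F(0),    F(1),    F(0)],
    [F(0),    F(1, 2), F(1, 4), F(1, 4)],
]

def mat_mul(A, B):
    """Multiply two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P3 = mat_mul(mat_mul(P, P), P)
print(P3[0][3])  # Fraction(41, 192): state 0 to state 3 in exactly three steps

# Uniform starting distribution: p(3) = p(0) * P^3
p0 = [F(1, 4)] * 4
p3 = [sum(p0[i] * P3[i][j] for i in range(4)) for j in range(4)]
print(p3[3])  # Fraction(43, 288): probability of ending in state 3
```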

18. Application: Algorithm for 2SAT
Recall: a 2CNF formula is an AND of clauses.
• Each clause is an OR of literals.
• Each literal is a Boolean variable or its negation.
• E.g. (x_1 ∨ x_2) ∧ (x_2 ∨ x_3) ∧ (x_3 ∨ ¬x_4) ∧ (x_1 ∨ x_4) ∧ (x_2 ∨ x_4)
2SAT Problem (search version): given a 2CNF formula, find a satisfying assignment if it is satisfiable.

19. Randomized Algorithm for 2SAT
Input: a 2CNF formula φ on n variables; parameter R.
1. Start with an arbitrary truth assignment, e.g., all 0's.
2. Repeat R times, terminating if φ is satisfied:
   a) Choose an arbitrary clause C that is not satisfied.
   b) Pick a uniformly random literal in C and flip its assignment.
3. If a satisfying assignment is found, return it.
4. Otherwise, return "unsatisfiable".
Example: φ = (x_1 ∨ x_2) ∧ (x_2 ∨ x_3) ∧ (x_3 ∨ ¬x_4) ∧ (x_1 ∨ x_4) ∧ (x_2 ∨ x_4)
• Initial assignment: x_1 = 0, x_2 = 0, x_3 = 0, x_4 = 0
• Unsatisfied clause: C = x_1 ∨ x_4
• Pick x_1 or x_4 and flip its value: x_1 = 0, x_2 = 0, x_3 = 0, x_4 = 1
• New unsatisfied clause: C = x_3 ∨ ¬x_4
• Pick x_3 or x_4 and flip its value: x_1 = 0, x_2 = 0, x_3 = 1, x_4 = 1
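A sketch of the algorithm above in Python. The clause encoding (a clause is a pair of nonzero integers whose sign gives the literal's polarity) and the choice R = 100·n² are my own; the slides leave R as a parameter:

```python
import random

def sat_lit(lit, assign):
    """True iff literal `lit` (+v or -v for variable v) is satisfied."""
    v = assign[abs(lit)]
    return v if lit > 0 else not v

def two_sat(clauses, n, R):
    """Randomized local search for 2SAT; returns an assignment dict or None."""
    assign = {v: False for v in range(1, n + 1)}  # start with all 0's
    for _ in range(R):
        unsat = [c for c in clauses if not any(sat_lit(l, assign) for l in c)]
        if not unsat:
            return assign                  # phi is satisfied
        clause = unsat[0]                  # an arbitrary unsatisfied clause
        lit = random.choice(clause)        # a uniformly random literal in it
        assign[abs(lit)] = not assign[abs(lit)]
    if all(any(sat_lit(l, assign) for l in c) for c in clauses):
        return assign
    return None  # report "unsatisfiable" (wrong with small probability)

# The example formula, with x3 OR (NOT x4) encoded as (3, -4):
phi = [(1, 2), (2, 3), (3, -4), (1, 4), (2, 4)]
result = two_sat(phi, 4, R=100 * 4 * 4)
```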

20. When can the algorithm fail?
• Only if φ is satisfiable, but we did not find a satisfying assignment in R iterations (steps).
• We will analyze the number of steps necessary.
• Each step can be implemented to run in O(n^2) time, since there are O(n^2) clauses.

21. Analysis of the number of steps
• Let S = a satisfying assignment of φ.
• A_i = the assignment to φ's variables after i steps.
• X_i = number of variables that have the same value in A_i and S.
When X_i = n, the algorithm terminates with a satisfying assignment. (It could terminate before X_i = n if it finds another satisfying assignment.)
• If X_i = 0 then X_{i+1} = 1:
  Pr[X_{i+1} = 1 | X_i = 0] = 1
• If X_i ∈ [1, n−1], then A_i disagrees with S on 1 or 2 literals of the chosen clause C:
  Pr[X_{i+1} = j + 1 | X_i = j] ≥ 1/2   (it is 1/2 or 1)
  Pr[X_{i+1} = j − 1 | X_i = j] ≤ 1/2
X_0, X_1, X_2, … is not necessarily a Markov chain, since the probability that X_{i+1} > X_i depends on whether A_i and S disagree on 1 or 2 literals of C (which could depend on previous choices, not just X_i).

22. Creating a true Markov chain
• Define a Markov chain Z_0, Z_1, Z_2, …:
  Z_0 = X_0
  Pr[Z_{i+1} = 1 | Z_i = 0] = 1
  Pr[Z_{i+1} = j + 1 | Z_i = j] = 1/2
  Pr[Z_{i+1} = j − 1 | Z_i = j] = 1/2
• This is a "pessimistic version" of the stochastic process X_0, X_1, X_2, …: the expected time to reach n is larger for Z_0, Z_1, Z_2, … than for X_0, X_1, X_2, …
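The chain Z is a random walk on {0, …, n} that reflects at 0 and stops at n. A recurrence of the same style as the drunkard's-walk one gives expected hitting time n² − k² from start k, hence at most n² from anywhere. A simulation sketch checking the k = 0 case (the values n = 8 and the trial count are arbitrary):

```python
import random

def z_hitting_time(k, n):
    """Steps for the Z chain to reach n from k (reflecting barrier at 0)."""
    steps = 0
    while k < n:
        if k == 0:
            k = 1                        # Pr[Z_{i+1} = 1 | Z_i = 0] = 1
        else:
            k += random.choice((-1, 1))  # up or down with probability 1/2
        steps += 1
    return steps

random.seed(2)
n, trials = 8, 20_000
avg = sum(z_hitting_time(0, n) for _ in range(trials)) / trials
print(avg)  # close to n^2 = 64, the exact expectation from position 0
```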
