  1. Advanced Algorithms (X), Shanghai Jiao Tong University, Chihao Zhang, May 11, 2020

  2. Estimate π. One can design a Monte Carlo algorithm to estimate the value of π:
     • Sample $X_1, \dots, X_n$ independently and uniformly from $[-1,1] \times [-1,1]$
     • Let $Z_n = \sum_{i=1}^{n} \mathbf{1}[\,\lVert X_i \rVert \le 1\,]$
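     As a minimal illustration of this estimator (a sketch, not part of the slides; the function name is ours), the following Python snippet draws $n$ uniform points from $[-1,1]^2$, counts how many land in the unit disc, and returns $4 Z_n / n$ as the estimate of π.

        import random

        def estimate_pi(n):
            # Z_n counts samples that land inside the unit disc
            z = 0
            for _ in range(n):
                x = random.uniform(-1.0, 1.0)
                y = random.uniform(-1.0, 1.0)
                if x * x + y * y <= 1.0:     # 1[ ||X_i|| <= 1 ]
                    z += 1
            return 4.0 * z / n               # E[Z_n] = (pi/4) * n, so 4 Z_n / n estimates pi

        print(estimate_pi(1_000_000))        # typically prints a value close to 3.14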

  3. Each indicator $\mathbf{1}[\lVert X_i \rVert \le 1] \sim \mathrm{Ber}\!\left(\frac{\pi}{4}\right)$, so $\mathbb{E}[Z_n] = \frac{\pi}{4} \cdot n$. Therefore, by the Chernoff bound,
     $$\Pr\!\left[\,\Bigl|Z_n - \frac{\pi}{4} \cdot n\Bigr| \ge \varepsilon \cdot \frac{\pi}{4} \cdot n\,\right] \le 2 \exp\!\left(-\frac{\varepsilon^2 \pi n}{12}\right).$$
     If $n \ge \frac{12}{\varepsilon^2 \pi} \log \frac{2}{\delta}$, we obtain a $(1 \pm \varepsilon)$-approximation of π with probability at least $1 - \delta$.
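     For concreteness (numbers not from the slides, with $\log$ read as the natural logarithm): taking $\varepsilon = 0.1$ and $\delta = 0.01$ gives $n \ge \frac{12}{0.01\,\pi} \ln 200 \approx 382 \times 5.3 \approx 2{,}024$, so roughly two thousand samples already suffice for a 10% relative error with 99% confidence.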

  4. Rejection Sampling. This method is often called rejection sampling. It is useful for estimating the size of a good set $A$ inside a large set $B$. The number of samples required is proportional to $\frac{|B|}{|A|}$.
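     A generic sketch of this pattern (names are illustrative; `sample_B` and `in_A` stand for whatever uniform sampler and membership test the application provides): the empirical hit frequency estimates $|A|/|B|$.

        def estimate_ratio(sample_B, in_A, n):
            # sample_B(): draws a uniform element of the large set B
            # in_A(x):    membership test for the good set A (a subset of B)
            hits = sum(1 for _ in range(n) if in_A(sample_B()))
            return hits / n                  # estimates |A| / |B|

     If $|A|/|B|$ is tiny, almost every sample is rejected, which is exactly why the number of samples needed grows like $|B|/|A|$.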

  5. Counting DNF. A DNF formula is $\varphi = C_1 \vee C_2 \vee \cdots \vee C_m$ with $C_i = \bigwedge_{j=1}^{\ell_i} x_{ij}$ (each $x_{ij}$ a literal). Take $A$ = the satisfying assignments of $\varphi$ and $B$ = all assignments. Since $\varphi$ may have only polynomially many solutions, $\frac{|B|}{|A|}$ can be exponentially large, and the Monte Carlo method using rejection sampling is slow!

  6. For each clause $C_i$, define the set $S_i :=$ the set of assignments satisfying $C_i$. We want to estimate $\bigl|\bigcup_{1 \le i \le m} S_i\bigr|$. Take $B = \biguplus_{1 \le i \le m} S_i$ (disjoint union) and $A = \bigcup_{1 \le i \le m} S_i$, identifying each satisfying assignment with its copy in the first $S_i$ that contains it.
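     A hedged sketch of the resulting estimator (a standard Karp-Luby-style implementation; the clause encoding and names are our assumptions, with each variable appearing at most once per clause): sample $(i, \sigma)$ uniformly from $B$ by first picking clause $i$ with probability $|S_i|/|B|$ and then a uniform $\sigma \in S_i$, and count the pair only when $i$ is the first clause that $\sigma$ satisfies.

        import random

        def count_dnf(clauses, n_vars, samples=100_000):
            # clauses: list of clauses; each clause is a list of (var_index, value) literals
            sizes = [2 ** (n_vars - len(c)) for c in clauses]   # |S_i|: free variables are unconstrained
            total = sum(sizes)                                   # |B|, size of the disjoint union

            def satisfies(assignment, clause):
                return all(assignment[v] == val for v, val in clause)

            hits = 0
            for _ in range(samples):
                # sample (i, sigma) uniformly from B
                i = random.choices(range(len(clauses)), weights=sizes)[0]
                sigma = [random.random() < 0.5 for _ in range(n_vars)]
                for v, val in clauses[i]:
                    sigma[v] = val
                # good iff i is the first clause sigma satisfies, so each
                # satisfying assignment is counted exactly once
                if not any(satisfies(sigma, clauses[j]) for j in range(i)):
                    hits += 1
            return total * hits / samples                        # |B| * estimate of |A|/|B|

        # e.g. phi = (x0 ∧ x1) ∨ (¬x1 ∧ x2) has 4 satisfying assignments:
        # count_dnf([[(0, True), (1, True)], [(1, False), (2, True)]], 3)

     Since every satisfying assignment appears in at least one and at most $m$ of the $S_i$, the hit probability is at least $1/m$, so polynomially many samples suffice here, unlike the naive scheme of the previous slide.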

  7. How about CNF? We consider a very special case: monotone 2-CNF, e.g. $\varphi = (x \vee y) \wedge (x \vee z) \wedge (x \vee w) \wedge (y \vee w)$. View each variable as a vertex and each clause as an edge; an assignment such as $x = \texttt{true}$, $y = \texttt{false}$, $z = \texttt{false}$, $w = \texttt{true}$ satisfies $\varphi$ exactly when the variables set to false form an independent set. Hence $\#\varphi$ = # of independent sets of this graph.
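     To make the correspondence concrete, a brute-force check on the four-variable example (illustrative only; feasible because the formula is tiny):

        from itertools import product

        # edges of the example graph, one per clause of phi
        edges = [("x", "y"), ("x", "z"), ("x", "w"), ("y", "w")]
        variables = ["x", "y", "z", "w"]

        count = 0
        for values in product([True, False], repeat=len(variables)):
            assignment = dict(zip(variables, values))
            # phi is satisfied iff no clause/edge has both endpoints false,
            # i.e. the variables set to false form an independent set
            if all(assignment[u] or assignment[v] for u, v in edges):
                count += 1
        print(count)  # 7 satisfying assignments = 7 independent sets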

  8. Sampling seems to be harder than in the DNF case… Rejection sampling is correct but inefficient. A natural idea is to resample the variables of violated edges… Unfortunately, this is not correct: think about a small example on three vertices $x$, $y$, $z$.

  9. Partial Rejection Sampling. Guo, Jerrum and Liu (JACM, 2019) proposed the following fix: "Resample violated vertices and their neighbors." We will prove the correctness and analyze its efficiency next week.
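     A sketch of the stated rule (an informal reading of the slide, not Guo, Jerrum and Liu's precise algorithm; correctness is deferred to next week, as the slide says). Here an edge/clause is violated when both of its variables are false, and all endpoints of violated edges plus their neighbors are resampled in each round.

        import random

        def partial_rejection_sample(variables, edges):
            # informal sketch of "resample violated vertices and their neighbors"
            neighbors = {v: set() for v in variables}
            for u, v in edges:
                neighbors[u].add(v)
                neighbors[v].add(u)

            assignment = {v: random.random() < 0.5 for v in variables}
            while True:
                violated = {w for u, v in edges
                            if not assignment[u] and not assignment[v]
                            for w in (u, v)}
                if not violated:
                    # no clause is violated: the false variables form an independent set
                    return assignment
                to_resample = violated | {n for w in violated for n in neighbors[w]}
                for w in to_resample:
                    assignment[w] = random.random() < 0.5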

  10. From Sampling to Counting

  11. We will show that, in many cases, if one can sample from a space, then one can also estimate the size of the space. Consider independent sets again: let $G = (V, E)$ with $E = \{e_1, e_2, \ldots, e_m\}$. We want to estimate $|I(G)|$, the number of independent sets in $G$.

  12. Define $G_0 = G$ and $G_i = G_{i-1} - e_i$. Then
      $$|I(G)| = |I(G_0)| = \frac{|I(G_0)|}{|I(G_1)|} \cdot \frac{|I(G_1)|}{|I(G_2)|} \cdots \frac{|I(G_{m-1})|}{|I(G_m)|} \cdot |I(G_m)|,$$
      where $|I(G_m)| = 2^n$ since $G_m$ has no edges. Each ratio is estimated by rejection sampling with good set $A = I(G_i)$ inside $B = I(G_{i+1})$, and the ratio can't be too large: $|I(G_{i+1})| \le 2\,|I(G_i)|$.
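      A sketch of this reduction (illustrative; `sample_is` stands for the assumed sampling oracle that returns a uniform independent set of a given graph): each ratio $|I(G_i)|/|I(G_{i+1})|$ is estimated by sampling from $I(G_{i+1})$ and checking whether the sample stays independent once $e_{i+1}$ is put back.

        def estimate_num_independent_sets(vertices, edges, sample_is, samples_per_ratio=10_000):
            # sample_is(vertices, edge_list): assumed oracle returning a uniform random
            # independent set (a Python set of vertices) of the given graph
            estimate = float(2 ** len(vertices))          # |I(G_m)| = 2^n (the empty graph)
            for i in range(len(edges)):                   # estimate |I(G_i)| / |I(G_{i+1})|
                u, v = edges[i]                           # e_{i+1}: the edge removed from G_i
                g_next = edges[i + 1:]                    # edge set of G_{i+1}
                hits = sum(1 for _ in range(samples_per_ratio)
                           if not {u, v} <= sample_is(vertices, g_next))
                estimate *= hits / samples_per_ratio      # each ratio lies in [1/2, 1]
            return estimate

      Because each ratio is at least $1/2$, a constant fraction of the samples are hits, so every factor can be estimated accurately with a modest number of samples.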

  13. From Counting to Sampling

  14. On the other hand, one can sample an independent set by deciding the vertices one by one, as long as $\Pr[v \in I]$ (conditioned on the decisions made so far) is known. This value can be obtained via a counting oracle. The above two reductions require the problem to satisfy the "self-reducibility" property.
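     A sketch of this direction (illustrative; `count_is` stands for the assumed counting oracle returning the exact number of independent sets of a graph): vertices are decided one by one, and the conditional probability $\Pr[v \in I]$ is read off from two oracle calls on residual graphs.

        import random

        def sample_independent_set(vertices, edges, count_is):
            # count_is(vs, es): assumed oracle returning the exact number of
            # independent sets of the graph (vs, es)
            adj = {v: set() for v in vertices}
            for u, w in edges:
                adj[u].add(w)
                adj[w].add(u)

            def count_given(chosen, excluded):
                # consistent independent sets = chosen plus any independent set of
                # the graph induced on the still-free vertices
                free = {v for v in vertices
                        if v not in chosen and v not in excluded and not (adj[v] & chosen)}
                sub_edges = [(u, w) for u, w in edges if u in free and w in free]
                return count_is(list(free), sub_edges)

            chosen, excluded = set(), set()
            for v in vertices:
                if adj[v] & chosen:                 # v is blocked by a chosen neighbor
                    excluded.add(v)
                    continue
                # Pr[ v in I | decisions made so far ]
                p = count_given(chosen | {v}, excluded) / count_given(chosen, excluded)
                if random.random() < p:
                    chosen.add(v)
                else:
                    excluded.add(v)
            return chosen

     The key point is that after fixing the status of a vertex, the residual problem is again an independent-set problem on a smaller graph; this is the self-reducibility that both reductions rely on.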
