  1. Advanced Algorithms (X) Shanghai Jiao Tong University Chihao Zhang May 11, 2020

  2. Estimate π

  3-5. Estimate π
  One can design a Monte-Carlo algorithm to estimate the value of π:
  sample X_1, …, X_n ∈ [−1,1] × [−1,1] independently and uniformly at random, and let
  Z_n = ∑_{i=1}^{n} 1[∥X_i∥ ≤ 1].
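
A minimal Python sketch of this estimator (the function name and the sample size below are illustrative, not from the slides):

```python
import random

def estimate_pi(n: int) -> float:
    """Sample n points uniformly from [-1,1] x [-1,1] and count how many
    land inside the unit disk; Z_n has mean (pi/4)*n, so 4*Z_n/n estimates pi."""
    z_n = 0
    for _ in range(n):
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:        # the indicator 1[||X_i|| <= 1]
            z_n += 1
    return 4 * z_n / n

print(estimate_pi(1_000_000))         # typically close to 3.14
```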

  6-9. Each indicator 1[∥X_i∥ ≤ 1] ∼ Ber(π/4), so E[Z_n] = (π/4)·n.
  Therefore, by the Chernoff bound,
  Pr[ |Z_n − (π/4)·n| ≥ ε·(π/4)·n ] ≤ 2·exp(−ε²πn/12).
  If n ≥ (12/(ε²π))·log(2/δ), we have a (1 ± ε)-approximation of π with probability at least 1 − δ.
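
To get a feel for the bound, here is a small sketch that computes the required sample size (the bound as written involves π itself; in practice any crude lower bound such as 3 would do):

```python
import math

def samples_needed(eps: float, delta: float) -> int:
    """Smallest n with n >= 12/(eps^2 * pi) * log(2/delta), which makes the
    Chernoff bound 2*exp(-eps^2 * pi * n / 12) at most delta."""
    return math.ceil(12 / (eps ** 2 * math.pi) * math.log(2 / delta))

# a (1 ± 0.01)-approximation of pi with probability at least 0.99
print(samples_needed(0.01, 0.01))     # roughly 2 * 10^5 samples
```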

  10. Rejection Sampling

  11-14. Rejection Sampling
  The method is often called rejection sampling.
  It is useful for estimating the size of a good set B sitting inside a large set A.
  The number of samples needed is proportional to |A| / |B|.
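
A generic Python sketch of this estimator (the helper names `sample_from_A` and `in_B` are illustrative): draw uniform samples from A and scale the hit rate by |A|.

```python
import random

def estimate_size(sample_from_A, in_B, size_A: float, n: int) -> float:
    """Estimate |B| as |A| times the fraction of uniform samples from A
    that land in the good set B."""
    hits = sum(in_B(sample_from_A()) for _ in range(n))
    return size_A * hits / n

# Example: the pi estimator again, with A = [-1,1]^2 (area 4) and B = the unit disk.
print(estimate_size(
    sample_from_A=lambda: (random.uniform(-1, 1), random.uniform(-1, 1)),
    in_B=lambda p: p[0] ** 2 + p[1] ** 2 <= 1,
    size_A=4.0,
    n=1_000_000,
))
```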

  15. Counting DNF

  16-22. Counting DNF
  A DNF formula φ = C_1 ∨ C_2 ∨ ⋯ ∨ C_m, where each clause C_i = ⋀_{j=1}^{ℓ_i} x_{i_j} is a conjunction of ℓ_i literals.
  B = satisfying assignments; A = all assignments.
  φ may contain only polynomially many solutions, so the Monte Carlo method using rejection sampling is slow!

  23-29. For each clause C_i, define the set S_i := the set of assignments satisfying C_i.
  We want to estimate |⋃_{1 ≤ i ≤ m} S_i|.
  Take B = ⋃_{1 ≤ i ≤ m} S_i and A = ⨄_{1 ≤ i ≤ m} S_i (the disjoint union, in which an assignment appears once for every clause it satisfies).
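
This is the classical Karp-Luby estimator. A Python sketch under the assumption that a clause is encoded as a dict mapping a variable index to the Boolean value its literal requires (the encoding and function name are illustrative): sample a uniform element (i, σ) of the disjoint union A, and count it only when i is the first clause that σ satisfies, so each element of B is counted exactly once.

```python
import random

def count_dnf(clauses, n_vars, samples=100_000):
    """Estimate the number of satisfying assignments of a DNF formula.
    Each clause is a dict {variable index: required Boolean value}."""
    sizes = [2 ** (n_vars - len(c)) for c in clauses]      # |S_i|
    size_A = sum(sizes)                                     # |A|, the disjoint union
    hits = 0
    for _ in range(samples):
        # uniform element of A: clause i with prob. |S_i|/|A|, then uniform sigma in S_i
        i = random.choices(range(len(clauses)), weights=sizes)[0]
        sigma = [random.random() < 0.5 for _ in range(n_vars)]
        for var, val in clauses[i].items():
            sigma[var] = val
        # sigma is counted for its first satisfied clause only
        first = next(j for j, c in enumerate(clauses)
                     if all(sigma[v] == b for v, b in c.items()))
        hits += (first == i)
    return size_A * hits / samples                          # estimates |B|

# phi = x_0 ∨ x_1 over 3 variables has exactly 6 satisfying assignments
print(count_dnf([{0: True}, {1: True}], n_vars=3))
```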

  30. How about CNF?

  31-36. How about CNF? We consider a very special case: monotone 2-CNF.
  φ = (x ∨ y) ∧ (x ∨ z) ∧ (x ∨ w) ∧ (y ∨ w)
  View the clauses as the edges of a graph on the vertices x, y, z, w.
  For example, x = true, y = false, z = false, w = true is a satisfying assignment; the variables set to false form an independent set of the graph.
  #φ = # of independent sets.
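
A quick brute-force check of this correspondence for the example formula (the variable numbering is mine): the two counts agree because an assignment satisfies φ exactly when its set of false variables is independent.

```python
from itertools import product

# phi = (x ∨ y) ∧ (x ∨ z) ∧ (x ∨ w) ∧ (y ∨ w), with x,y,z,w numbered 0..3;
# the clauses are exactly the edges of the graph
edges = [(0, 1), (0, 2), (0, 3), (1, 3)]
n = 4

num_sat = sum(all(a[u] or a[v] for u, v in edges)
              for a in product([False, True], repeat=n))
num_ind = sum(all(not (s[u] and s[v]) for u, v in edges)
              for s in product([False, True], repeat=n))
print(num_sat, num_ind)   # both are 7 for this graph
```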

  37-42. Sampling seems to be harder than in the DNF case…
  Rejection sampling is correct but inefficient.
  A natural idea is to resample those violated edges…
  Unfortunately, this is not correct. Think about a small graph on the three vertices x, y, z.
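
To see the problem concretely, here is a small simulation (my own illustration: I take the three-vertex example to be the path on x, y, z with edges {x, y} and {y, z}, i.e. φ = (x ∨ y) ∧ (y ∨ z), which has 5 satisfying assignments). Resampling only the variables of violated clauses gives a visibly non-uniform output; for instance the assignment with y alone true comes out with probability about 1/7 rather than 1/5.

```python
import random
from collections import Counter

# Illustrative example (not from the slides): the path x, y, z,
# i.e. phi = (x ∨ y) ∧ (y ∨ z), with 5 satisfying assignments.
edges = [(0, 1), (1, 2)]

def naive_resample():
    a = [random.random() < 0.5 for _ in range(3)]
    while True:
        violated = [e for e in edges if not (a[e[0]] or a[e[1]])]
        if not violated:
            return tuple(a)
        for u, v in violated:             # resample only the violated edges
            a[u] = random.random() < 0.5
            a[v] = random.random() < 0.5

trials = 200_000
freq = Counter(naive_resample() for _ in range(trials))
for assignment in sorted(freq):
    print(assignment, freq[assignment] / trials)   # uniform would be 0.2 each
```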

  43. Partial Rejection Sampling

  44-46. Partial Rejection Sampling
  Guo, Jerrum and Liu (JACM, 2019) proposed the following fix:
  "Resample violated vertices and their neighbors."
  We will prove the correctness and analyze its efficiency next week.
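
A literal Python transcription of the quoted rule, in the independent-set view of a monotone 2-CNF (an edge is violated when both of its endpoints are currently chosen). This is only a sketch of the rule as stated on the slide; whether and why it yields the correct distribution is exactly what will be proved next week, so do not read it as the definitive Guo-Jerrum-Liu algorithm.

```python
import random

def partial_rejection_sample(n, edges):
    """Repeatedly resample the vertices of violated edges together with all of
    their neighbors, until no edge has both endpoints chosen."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    chosen = [random.random() < 0.5 for _ in range(n)]
    while True:
        violated = {w for u, v in edges if chosen[u] and chosen[v] for w in (u, v)}
        if not violated:
            return [v for v in range(n) if chosen[v]]        # an independent set
        to_resample = violated | {w for v in violated for w in adj[v]}
        for v in to_resample:
            chosen[v] = random.random() < 0.5

# the 4-vertex example graph from the monotone 2-CNF slides
print(partial_rejection_sample(4, [(0, 1), (0, 2), (0, 3), (1, 3)]))
```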

  47. From Sampling to Counting

  48-51. We will show that, in many cases, if one can sample from a space, then one can also estimate the size of the space.
  Consider independent sets again: G = (V, E) with E = {e_1, e_2, …, e_m}.
  We want to estimate |I(G)|, the number of independent sets in G.

  52-61. Define G_0 = G and G_i = G_{i−1} − e_i.
  |I(G)| = |I(G_0)| = (|I(G_0)| / |I(G_1)|) · (|I(G_1)| / |I(G_2)|) ⋯ (|I(G_{m−1})| / |I(G_m)|) · |I(G_m)|,
  where |I(G_m)| = 2^n, since G_m has no edges.
  Each factor |I(G_i)| / |I(G_{i+1})| is estimated by rejection sampling, with the large set A = I(G_{i+1}) and the good set B = I(G_i).
  |A| / |B| can't be too large: |I(G_{i+1})| ≤ 2·|I(G_i)|, because removing one edge at most doubles the number of independent sets.
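
A Python sketch of the whole reduction (the uniform sampler here is a brute-force stand-in for the sampling oracle; in the lecture's setting it would be replaced by an efficient sampler, e.g. partial rejection sampling):

```python
import random
from itertools import combinations

def is_independent(s, edges):
    return not any(u in s and v in s for u, v in edges)

def brute_force_sampler(n, edges):
    """Stand-in sampling oracle: returns a function sampling a uniform independent set."""
    all_is = [frozenset(s) for r in range(n + 1)
              for s in combinations(range(n), r)
              if is_independent(set(s), edges)]
    return lambda: random.choice(all_is)

def estimate_num_independent_sets(n, edges, samples=20_000):
    """|I(G)| = 2^n * prod_i |I(G_i)| / |I(G_{i+1})|, each ratio estimated by
    sampling from I(G_{i+1}) and checking membership in I(G_i)."""
    estimate = 2.0 ** n                       # |I(G_m)|: the empty graph
    for i in range(len(edges)):
        edges_Gi = edges[i:]                  # G_i     = G minus e_1, ..., e_i
        edges_Gi1 = edges[i + 1:]             # G_{i+1} = G_i minus e_{i+1}
        sample = brute_force_sampler(n, edges_Gi1)
        hits = sum(is_independent(sample(), edges_Gi) for _ in range(samples))
        estimate *= hits / samples            # ≈ |I(G_i)| / |I(G_{i+1})|
    return estimate

# the 4-vertex example graph has exactly 7 independent sets
print(estimate_num_independent_sets(4, [(0, 1), (0, 2), (0, 3), (1, 3)]))
```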

  62. From Counting to Sampling

  63-64. On the other hand, one can sample an independent set vertex by vertex, as long as Pr[v ∈ I] is known at each step.
  This value can be obtained via a counting oracle.
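
A Python sketch of this direction (the brute-force counter stands in for the counting oracle; names are illustrative): decide each vertex in turn with the conditional probability Pr[v ∈ I | decisions so far], which by the chain rule yields a uniform independent set.

```python
import random
from itertools import combinations

def count_is(n, edges, include, exclude):
    """Stand-in counting oracle: number of independent sets containing every
    vertex of `include` and avoiding every vertex of `exclude`."""
    rest = [v for v in range(n) if v not in include and v not in exclude]
    total = 0
    for r in range(len(rest) + 1):
        for extra in combinations(rest, r):
            s = set(include) | set(extra)
            if not any(u in s and v in s for u, v in edges):
                total += 1
    return total

def sample_is(n, edges):
    """Sample a uniform independent set by deciding each vertex with
    probability Pr[v in I | decisions made so far]."""
    include, exclude = set(), set()
    for v in range(n):
        p = count_is(n, edges, include | {v}, exclude) / count_is(n, edges, include, exclude)
        (include if random.random() < p else exclude).add(v)
    return include

print(sample_is(4, [(0, 1), (0, 2), (0, 3), (1, 3)]))
```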
