

  1. Lecture 4: Goemans-Williamson Algorithm for MAX-CUT

  2. Lecture Outline
  • Part I: Analyzing semidefinite programs
  • Part II: Analyzing Goemans-Williamson
  • Part III: Tight examples for Goemans-Williamson
  • Part IV: Impressiveness of Goemans-Williamson and open problems

  3. Part I: Analyzing semidefinite programs

  4. Goemans-Williamson Program
  • Recall the Goemans-Williamson program: maximize $\sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$ subject to $M \succeq 0$ and $\forall i,\ M_{ii} = 1$
  • Theorem: Goemans-Williamson gives a .878 approximation for MAX-CUT
  • How do we analyze Goemans-Williamson and other semidefinite programs?
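One concrete way to solve this program is with an off-the-shelf SDP solver. The sketch below is a minimal illustration, assuming the cvxpy library and a hypothetical 5-cycle instance; neither is part of the lecture.

```python
import cvxpy as cp

# Hypothetical instance: a 5-cycle, whose max cut is 4
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
n = 5

M = cp.Variable((n, n), PSD=True)                  # the constraint M >= 0
constraints = [cp.diag(M) == 1]                    # M_ii = 1 for all i
objective = cp.Maximize(sum((1 - M[i, j]) / 2 for i, j in edges))

prob = cp.Problem(objective, constraints)
prob.solve()
print(prob.value)   # about 4.52 for the 5-cycle, above the true max cut of 4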

  5. Vector Solutions
  • Want: a matrix $M$ such that $M_{ij} = x_i x_j$ where the $x_i$ are the problem variables.
  • Semidefinite program: assigns a vector $v_i$ to each $x_i$, giving the matrix $M$ where $M_{ij} = v_i \cdot v_j$
  • Note: this is a relaxation of the problem. To obtain an actual $\pm 1$ solution, we need a rounding algorithm for the vector solution.

  6. Vector Solution Justification
  • Theorem: $M \succeq 0$ if and only if there are vectors $v_i$ such that $M_{ij} = v_i \cdot v_j$
  • Example: $M = \begin{pmatrix} 1 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{pmatrix}$, with $v_1 = (1,0,0)$, $v_2 = (-1,1,0)$, $v_3 = (1,0,1)$
  • One way to see this: take a "square root" of $M$
  • A second way to see this: the Cholesky decomposition

  7. Square Root of a PSD Matrix
  • If there are vectors $\{v_i\}$ such that $M_{ij} = v_i \cdot v_j$, take $V$ to be the matrix with rows $v_1, \dots, v_n$. Then $M = VV^T \succeq 0$.
  • Conversely, if $M \succeq 0$ then $M = \sum_{i=1}^{n} \lambda_i u_i u_i^T$ where $\lambda_i \geq 0$ for all $i$. Taking $V$ to be the matrix with columns $\sqrt{\lambda_i}\, u_i$, $VV^T = M$. Taking $v_i$ to be the $i$-th row of $V$, $M_{ij} = v_i \cdot v_j$.
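A short numerical illustration of the converse direction, using numpy's eigendecomposition on the example matrix from slide 6 (the variable names are my own):

```python
import numpy as np

M = np.array([[1, -1, 1], [-1, 2, -1], [1, -1, 2]], dtype=float)

lam, U = np.linalg.eigh(M)            # M = sum_i lam_i u_i u_i^T, lam_i >= 0
lam = np.clip(lam, 0.0, None)         # guard against round-off negatives
V = U * np.sqrt(lam)                  # column i of V is sqrt(lam_i) u_i
print(np.allclose(V @ V.T, M))        # True: row i of V is v_i, M_ij = v_i . v_j
```

The rows of $V$ obtained this way differ from slide 6's vectors $(1,0,0), (-1,1,0), (1,0,1)$ by a rotation, which leaves all inner products, and hence $M$, unchanged.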

  8. Cholesky Decomposition
  • Cholesky decomposition: $M = CC^T$ where $C$ is a lower triangular matrix.
  • $v_i = \sum_a C_{ia} e_a$ is the $i$-th row of $C$
  • We can find the entries of $C$ one by one.

  9. Cholesky Decomposition Example
  • Example: $M = \begin{pmatrix} 1 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{pmatrix}$
  • $v_1 = (1,0,0)$
  • Need $C_{21} = -1$ so that $v_2 \cdot v_1 = -1$: $v_2 = (-1, C_{22}, 0)$
  • Taking $C_{22} = 1$ gives $v_2 \cdot v_2 = 2$: $v_2 = (-1,1,0)$
  • Need $C_{31} = 1$ and $C_{32} = 0$ so that $v_3 \cdot v_1 = 1$ and $v_3 \cdot v_2 = -1$: $v_3 = (1, 0, C_{33})$
  • Taking $C_{33} = 1$ gives $v_3 \cdot v_3 = 2$: $v_3 = (1,0,1)$

  10. Cholesky Decomposition Example
  • $\begin{pmatrix} 1 & -1 & 1 \\ -1 & 2 & -1 \\ 1 & -1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & -1 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$
  • $v_1 = (1,0,0)$, $v_2 = (-1,1,0)$, $v_3 = (1,0,1)$

  11. Cholesky Decomposition Formulas
  • $\forall i < k$, take $C_{ki} = \frac{M_{ik} - \sum_{a=1}^{i-1} C_{ka} C_{ia}}{C_{ii}}$
  • Take $C_{ki} = 0$ if $C_{ii} = 0 = M_{ik} - \sum_{a=1}^{i-1} C_{ka} C_{ia}$
  • Note that $v_k \cdot v_i = \sum_{a=1}^{i-1} C_{ka} C_{ia} + C_{ki} C_{ii} = M_{ik}$
  • $\forall k$, take $C_{kk} = \sqrt{M_{kk} - \sum_{a=1}^{k-1} C_{ka}^2}$
  • These formulas are the basis for the Cholesky-Banachiewicz algorithm and the Cholesky-Crout algorithm (these algorithms differ only in the order in which the entries are evaluated)

  12. Cholesky Decomposition Failure
  1. $\forall i < k$, $C_{ki} = \frac{M_{ik} - \sum_{a=1}^{i-1} C_{ka} C_{ia}}{C_{ii}}$
  2. $\forall k$, $C_{kk} = \sqrt{M_{kk} - \sum_{a=1}^{k-1} C_{ka}^2}$
  • If the Cholesky decomposition succeeds, it gives us vectors $v_i$ such that $M_{ij} = v_i \cdot v_j$
  • The formulas can fail in two ways:
  1. $M_{kk} - \sum_{a=1}^{k-1} C_{ka}^2 < 0$ for some $k$
  2. $C_{ii} = 0$ and $M_{ik} - \sum_{a=1}^{i-1} C_{ka} C_{ia} \neq 0$ for some $i < k$
  • Failure implies $M$ is not PSD (see problem set); the sketch below implements both checks.
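A minimal sketch of the Cholesky-Banachiewicz recursion with both failure checks; the function name and the numerical tolerance are my additions, not from the lecture.

```python
import numpy as np

def cholesky_vectors(M, tol=1e-9):
    """Return lower-triangular C with M = C C^T, or raise if M is not PSD.
    Row i of C is the vector v_i with M_ij = v_i . v_j."""
    n = M.shape[0]
    C = np.zeros((n, n))
    for k in range(n):
        for i in range(k):
            r = M[i, k] - C[k, :i] @ C[i, :i]
            if C[i, i] > tol:
                C[k, i] = r / C[i, i]
            elif abs(r) > tol:        # failure 2: C_ii = 0 but the residual isn't
                raise ValueError("M is not PSD")
        d = M[k, k] - C[k, :k] @ C[k, :k]
        if d < -tol:                  # failure 1: negative value under the sqrt
            raise ValueError("M is not PSD")
        C[k, k] = np.sqrt(max(d, 0.0))
    return C

M = np.array([[1, -1, 1], [-1, 2, -1], [1, -1, 2]], dtype=float)
print(cholesky_vectors(M))            # rows (1,0,0), (-1,1,0), (1,0,1), as on slide 9
```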

  13. Part II: Analyzing Goemans-Williamson

  14. Vectors for Goemans-Williamson
  • Goemans-Williamson: maximize $\sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$ subject to $M \succeq 0$ and $\forall i,\ M_{ii} = 1$
  • The semidefinite program gives us vectors $v_i$ where $v_i \cdot v_j = M_{ij}$
  [Figure: a 5-vertex graph $G$ with a vector $v_i$ drawn for each vertex $i$.]

  15. Rounding Vectors
  • Beautiful idea: map each vector $v_i$ to $\pm 1$ by taking a random vector $w$ and setting $x_i = 1$ if $w \cdot v_i > 0$ and $x_i = -1$ if $w \cdot v_i < 0$
  [Figure: the vectors $v_1, \dots, v_5$ from the previous slide together with a random vector $w$; the resulting assignment is $x_1 = x_4 = 1$, $x_2 = x_3 = x_5 = -1$.]
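A minimal sketch of this rounding step, assuming numpy; a standard Gaussian vector serves as $w$ because its direction is uniformly distributed, and the example rows are the vectors from slide 9 (the function name is my own).

```python
import numpy as np

def hyperplane_round(V, rng):
    w = rng.standard_normal(V.shape[1])   # random vector w
    return np.where(V @ w > 0, 1, -1)     # x_i = 1 if w . v_i > 0, else -1

V = np.array([[1, 0, 0], [-1, 1, 0], [1, 0, 1]], dtype=float)
print(hyperplane_round(V, np.random.default_rng()))   # a random +/-1 assignment
```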

  16. Expected Cut Value
  • Consider $E\left[\sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-x_i x_j}{2}\right]$
  • For each $i,j$ such that $i<j$ and $(i,j) \in E(G)$, $E\left[\frac{1-x_i x_j}{2}\right] = \frac{\theta}{\pi}$ where $\theta \in [0,\pi]$ is the angle between $v_i$ and $v_j$ (the edge is cut exactly when the random hyperplane separates $v_i$ and $v_j$, which happens with probability $\theta/\pi$)
  • On the other hand, $\frac{1-M_{ij}}{2} = \frac{1-\cos\theta}{2}$
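A quick Monte Carlo check (my addition, not from the lecture) that a random hyperplane separates two unit vectors at angle $\theta$ with probability $\theta/\pi$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0
v1 = np.array([1.0, 0.0])
v2 = np.array([np.cos(theta), np.sin(theta)])       # unit vectors at angle theta

w = rng.standard_normal((100_000, 2))               # many random hyperplanes
frac = np.mean(np.sign(w @ v1) != np.sign(w @ v2))  # fraction separating v1, v2
print(frac, theta / np.pi)                          # both about 0.6366
```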

  17. Approximation Factor
  • Goemans-Williamson gives a cut with expected value at least $\left(\min_{\theta} \frac{\theta/\pi}{(1-\cos\theta)/2}\right) \sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$
  • The first term is $\approx .878$, attained at $\theta_{crit} \approx 134°$
  • $\sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$ is an upper bound on the max cut size, so we have a .878 approximation.
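The constant can be reproduced numerically; this sketch simply scans $\theta$ over a fine grid:

```python
import numpy as np

theta = np.linspace(1e-6, np.pi, 10**6)              # scan theta over (0, pi]
ratio = (theta / np.pi) / ((1 - np.cos(theta)) / 2)
k = np.argmin(ratio)
print(ratio[k], np.degrees(theta[k]))                # ~0.8786 at ~133.56 degrees
```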

  18. Part III: Tight Examples

  19. Showing Tightness
  • How can we show this analysis is tight?
  • We give two examples where we obtain a cut of value $\approx .878 \sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$
  • In one example, $\sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$ is the value of the maximum cut. In the other example, $.878 \sum_{i,j:\, i<j,\ (i,j)\in E(G)} \frac{1-M_{ij}}{2}$ is the value of the maximum cut.

  20. Example 1: Hypercube
  • Have one vertex for each point $x_i \in \{\pm 1\}^n$
  • We have an edge between $x_i$ and $x_j$ in $G$ if $\left|\cos^{-1}\left(\frac{x_i \cdot x_j}{n}\right) - \theta_{crit}\right| < \varepsilon$ for an arbitrarily small $\varepsilon > 0$
  • Goemans-Williamson value $\approx \frac{1-\cos\theta_{crit}}{2} |E(G)|$
  • This is achieved by the coordinate cuts.
  • Goemans-Williamson rounds to a random cut, which gives value $\approx \frac{\theta_{crit}}{\pi} |E(G)|$

  21. Example 2: Sphere
  • Take a large number of random points $\{x_i\}$ on the unit sphere
  • We have an edge between $x_i$ and $x_j$ in $G$ if $\left|\cos^{-1}(x_i \cdot x_j) - \theta_{crit}\right| < \varepsilon$ for an arbitrarily small $\varepsilon > 0$
  • Goemans-Williamson value $\approx \frac{1-\cos\theta_{crit}}{2} |E(G)|$
  • A random hyperplane cut gives value $\approx \frac{\theta_{crit}}{\pi} |E(G)|$, and this is essentially optimal.
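An empirical sketch of the sphere example; the number of points, $\varepsilon$, and the number of sampled hyperplanes are my own choices, and the points themselves are used as a feasible vector solution for the program (the slide's claim is that this is near-optimal):

```python
import numpy as np

rng = np.random.default_rng(1)
theta_crit, eps, n = 2.331, 0.05, 500

X = rng.standard_normal((n, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)     # random points x_i on the sphere

ang = np.arccos(np.clip(X @ X.T, -1.0, 1.0))      # pairwise angles
i, j = np.where(np.triu(np.abs(ang - theta_crit) < eps, k=1))   # the edges

sdp = np.sum((1 - np.cos(ang[i, j])) / 2)         # value of this vector solution
cuts = []
for _ in range(200):                              # average many hyperplane cuts
    x = np.sign(X @ rng.standard_normal(3))
    cuts.append(np.sum(x[i] != x[j]))
print(np.mean(cuts) / sdp)                        # about 0.878
```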

  22. Proof Requirements
  • How can we prove the above examples behave as claimed?
  • For the hypercube, we have to upper bound the value of the Goemans-Williamson program.
  • This can be done by determining the eigenvalues of the hypercube graph and using them to analyze the dual (see problem set).
  • For the sphere, we have to prove that no cut does better than a random hyperplane cut (this is hard; see Feige-Schechtman [FS02]).

  23. Part IV: Impressiveness of Goemans-Williamson and Open Problems

  24. Failure of Linear Programming
  • Trivial algorithm: randomly guess which side of the cut each vertex is on.
  • Gives approximation factor $\frac{1}{2}$
  • Linear programming doesn't do any better, not even polynomial-sized linear programming extensions [CLRS13]!
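A sketch of the trivial randomized algorithm on a hypothetical 5-cycle; each edge is cut with probability 1/2, so the expected cut is $|E|/2$, at least half the maximum cut:

```python
import numpy as np

rng = np.random.default_rng()
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # hypothetical 5-cycle
x = rng.choice([-1, 1], size=5)                    # random side for each vertex
print(sum(x[i] != x[j] for i, j in edges))         # expected value |E|/2 = 2.5
```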

  25. Hardness of Beating GW
  • We only know NP-hardness for a $\frac{16}{17}$ approximation [Hås01], [TSSW00]
  • It is Unique-Games hard to beat Goemans-Williamson on MAX-CUT [KKMO07]

  26. Open Problems
  • Can we find a subexponential-time algorithm beating Goemans-Williamson on MAX-CUT?
  • Can we prove constant-degree SOS lower bounds for obtaining a better approximation than Goemans-Williamson?

  27. References
  • [CLRS13] S. Chan, J. Lee, P. Raghavendra, and D. Steurer. Approximate constraint satisfaction requires large LP relaxations. FOCS 2013.
  • [FS02] U. Feige and G. Schechtman. On the optimality of the random hyperplane rounding technique for MAX CUT. Random Structures & Algorithms, 20(3), p. 403-440, 2002.
  • [GW95] M. X. Goemans and D. P. Williamson. Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming. J. ACM, 42(6), p. 1115-1145, 1995.
  • [Hås01] J. Håstad. Some optimal inapproximability results. J. ACM, 48, p. 798-869, 2001.
  • [KKMO07] S. Khot, G. Kindler, E. Mossel, and R. O'Donnell. Optimal inapproximability results for MAX-CUT and other 2-variable CSPs? SIAM Journal on Computing, 37(1), p. 319-357, 2007.
  • [TSSW00] L. Trevisan, G. Sorkin, M. Sudan, and D. Williamson. Gadgets, approximation, and linear programming. SIAM Journal on Computing, 29(6), p. 2074-2097, 2000.
