  1. Graph partitioning Prof. Richard Vuduc Georgia Institute of Technology CSE/CS 8803 PNA: Parallel Numerical Algorithms [L.27] Tuesday, April 22, 2008 1

  2. Today’s sources: CS 194/267 at UCB (Yelick/Demmel); “Intro to parallel computing” by Grama, Gupta, Karypis, & Kumar. 2

  3. Review: Dynamic load balancing 3

  4. Parallel efficiency: 4 scenarios. Consider load balance, concurrency, and overhead. 4

  5. Summary: Unpredictable loads → online algorithms. Fixed set of tasks with unknown costs → self-scheduling. Dynamically unfolding set of tasks → work stealing. Theory ⇒ randomized approaches should work well. Other scenarios: What if… locality is of paramount importance? ⇒ Diffusion-based models? …processors are heterogeneous? ⇒ Weighted factoring? …the task graph is known in advance? ⇒ Static case: graph partitioning (today). 5

  6. Graph partitioning 6

  7. Problem definition: Weighted graph G = (V, E, W_V, W_E). Find a partitioning of the nodes, V = V_1 ∪ V_2 ∪ ··· ∪ V_p, such that (1) the sum of node weights in each part is roughly even, and (2) the sum of inter-partition edge weights is minimized. [Figure: example graph with node weights a:2, b:2, c:1, d:3, e:1, f:2, g:3, h:1 and weighted edges.] 7
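
As a concrete reading of this objective, here is a minimal sketch (my own naming, not from the slides) that scores a candidate partition, assuming an edge-weight map keyed by node pairs and a node-weight map:

```python
def cut_weight(edge_w, parts):
    """Sum of weights of edges whose endpoints fall in different parts."""
    side = {v: i for i, part in enumerate(parts) for v in part}
    return sum(w for (u, v), w in edge_w.items() if side[u] != side[v])

def part_weights(node_w, parts):
    """Per-part sums of node weights; a good partition keeps these nearly equal."""
    return [sum(node_w[v] for v in part) for part in parts]
```

A partitioner then tries to minimize cut_weight while keeping part_weights balanced.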

  8. [Figure: the example weighted graph from slide 7.] 8

  9. [Figure: the example weighted graph from slide 7.] 9

  10. [Figure-only slide.] 10

  11. Cost of graph partitioning: there are many possible partitions. Consider V = V_1 ∪ V_2 with |V_1| = |V_2| = n/2; the number of such bisections is C(n, n/2) ≈ sqrt(2/(πn)) · 2^n. The problem is NP-complete, so we need heuristics. 11
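
As a quick sanity check of that count (a small script of my own, not from the slides), the exact binomial coefficient and the approximation above track each other closely:

```python
import math

def exact_bisections(n):
    """Exact number of ways to choose V_1 of size n/2 out of n nodes."""
    return math.comb(n, n // 2)

def approx_bisections(n):
    """The slide's approximation: C(n, n/2) ≈ sqrt(2 / (pi*n)) * 2^n."""
    return math.sqrt(2.0 / (math.pi * n)) * 2**n

for n in (10, 20, 30):
    print(n, exact_bisections(n), round(approx_bisections(n)))
```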

  12. First heuristic: repeated graph bisection. To get 2^k partitions, bisect recursively k times. 12
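
The recursion itself is simple; here is a minimal sketch (my own naming) where `bisect` stands for any 2-way heuristic from the rest of the lecture:

```python
def recursive_bisection(nodes, bisect, k):
    """Split `nodes` into 2**k parts by recursively applying a 2-way partitioner."""
    if k == 0:
        return [list(nodes)]
    left, right = bisect(nodes)          # any bisection heuristic (inertial, BFS, ...)
    return (recursive_bisection(left, bisect, k - 1) +
            recursive_bisection(right, bisect, k - 1))
```

Each level bisects its piece independently, which is why k levels of recursion give 2^k parts.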

  13. Edge vs. vertex separators. Edge separator: E_s ⊂ E such that removing E_s creates two disconnected components. Vertex separator: V_s ⊂ V such that removing V_s and its incident edges creates two disconnected components. Conversions: E_s → V_s with |V_s| ≤ |E_s| (keep one endpoint of each cut edge); V_s → E_s with |E_s| ≤ d · |V_s|, where d = max degree. 13
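
The E_s → V_s bound follows from keeping one endpoint of each cut edge; a one-line sketch (assuming each cut edge is given with its A-side endpoint first):

```python
def edge_to_vertex_separator(cut_edges):
    """One endpoint per cut edge, so |V_s| <= |E_s| as on the slide.

    cut_edges: iterable of (u, v) pairs with u on the A side; removing all the
    A-side endpoints leaves no edge between the remaining A nodes and B.
    """
    return {u for (u, v) in cut_edges}
```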

  14. Overview of bisection heuristics: with nodal coordinates (spatial partitioning); without nodal coordinates; multilevel acceleration (use coarse graphs). 14

  15. Partitioning with nodal coordinates 15

  16. Intuition: planar graph theory. Planar graph: G can be drawn in the plane without edge crossings. Theorem (Lipton & Tarjan ’79): if G is planar, then there exists a vertex separator V_s such that (1) V = V_1 ∪ V_s ∪ V_2, (2) |V_1|, |V_2| ≤ (2/3)·|V|, and (3) |V_s| ≤ sqrt(8·|V|). 16

  17. Inertial partitioning 17

  18. Inertial partitioning: choose a line L. 18



  21. Inertial partitioning: choose a line L, specified by a point (x̄, ȳ) and a unit vector (a, b). 21

  22. Inertial partitioning: choose a line L: a·(x − x̄) + b·(y − ȳ) = 0, with a² + b² = 1. 22

  23. Inertial partitioning: choose a line L: a·(x − x̄) + b·(y − ȳ) = 0, with a² + b² = 1. Project the points onto L. 23

  24. Inertial partitioning: choose a line L: a·(x − x̄) + b·(y − ȳ) = 0, with a² + b² = 1. Project each point (x_k, y_k) onto L, giving a coordinate s_k along L. 24

  25. Inertial partitioning: choose a line L: a·(x − x̄) + b·(y − ȳ) = 0, with a² + b² = 1. Project each point (x_k, y_k) onto L: s_k = −b·(x_k − x̄) + a·(y_k − ȳ). 25

  26. Inertial partitioning: choose a line L: a·(x − x̄) + b·(y − ȳ) = 0, with a² + b² = 1. Project each point (x_k, y_k) onto L: s_k = −b·(x_k − x̄) + a·(y_k − ȳ). Compute the median s̄ = median(s_1, …, s_n) and separate the points at s̄. 26

  27. How to choose L? [Figure: two different choices of the line L for the same point clusters N1 and N2.] 27

  28. How to choose L? Least-squares fit: minimize the sum of squared distances of the points to L:
      Σ_k (d_k)² = Σ_k [ (x_k − x̄)² + (y_k − ȳ)² − (s_k)² ]
                 = Σ_k [ (x_k − x̄)² + (y_k − ȳ)² − (−b·(x_k − x̄) + a·(y_k − ȳ))² ]
                 = a²·Σ_k (x_k − x̄)² + 2ab·Σ_k (x_k − x̄)(y_k − ȳ) + b²·Σ_k (y_k − ȳ)²
                 = a²·α_1 + 2ab·α_2 + b²·α_3
                 = (a b) · [ α_1 α_2 ; α_2 α_3 ] · (a b)^T  28

  29. How to choose L? Least-squares fit: minimize the sum of squared distances. Interpretation: equivalent to choosing L as the axis of rotation that minimizes the moment of inertia of the points. Minimize Σ_k (d_k)² = (a b) · A(x̄, ȳ) · (a b)^T, where x̄ = (1/n)·Σ_k x_k, ȳ = (1/n)·Σ_k y_k, and (a, b) = the eigenvector of the smallest eigenvalue of A. 29
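
Putting slides 21-29 together, here is a compact sketch of inertial bisection (using NumPy; the function and variable names are mine, not from the lecture):

```python
import numpy as np

def inertial_bisection(points):
    """Inertial partitioning of 2D points: split at the median projection onto L.

    points: array of shape (n, 2). Returns a boolean mask selecting one half.
    """
    pts = np.asarray(points, dtype=float)
    d = pts - pts.mean(axis=0)                 # coordinates relative to (x̄, ȳ)
    A = d.T @ d                                # [[α1, α2], [α2, α3]]
    _, eigvecs = np.linalg.eigh(A)             # eigenvalues in ascending order
    a, b = eigvecs[:, 0]                       # eigenvector of the smallest eigenvalue
    s = -b * d[:, 0] + a * d[:, 1]             # s_k = -b(x_k - x̄) + a(y_k - ȳ)
    return s <= np.median(s)                   # separate at the median of the s_k
```

Usage: `mask = inertial_bisection(pts); V1, V2 = pts[mask], pts[~mask]` (ties at the median can make the halves slightly uneven).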

  30. What about 3D (or higher dimensions)? Intuition: for a regular n × n × n mesh with edges to the 6 nearest neighbors, partition using planes: |V| = n³ and |V_s| = n² = O(|V|^(2/3)) = O(|E|^(2/3)). For general graphs we need a notion of “well-shaped,” like a mesh. 30

  31. Random spheres. “Separators for sphere packings and nearest neighbor graphs.” Miller, Teng, Thurston, Vavasis (1997), J. ACM. Definition: a k-ply neighborhood system in d dimensions = a set {D_1, …, D_n} of closed disks in R^d such that no point in R^d is strictly interior to more than k disks. Example: a 3-ply system. 31

  32. Random spheres. Definition: a k-ply neighborhood system in d dimensions = a set {D_1, …, D_n} of closed disks in R^d such that no point in R^d is strictly interior to more than k disks. Definition: an (α, k) overlap graph, for α ≥ 1 and a k-ply neighborhood system: one node per disk D_j; an edge (i, j) if expanding the radius of the smaller disk by the factor α causes the two disks to overlap. Example: the (1, 1) overlap graph of a 2D mesh. 32
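
The edge test in that definition can be sketched directly (my own naming, assuming Euclidean disks given by center and radius):

```python
import math

def overlap_edge(c_i, r_i, c_j, r_j, alpha=1.0):
    """(α, k) overlap-graph edge test between disks i and j, per the slide:
    expand the radius of the smaller disk by the factor alpha and check overlap.
    """
    dist = math.dist(c_i, c_j)                 # distance between the two centers
    r_small, r_big = sorted((r_i, r_j))
    return dist <= alpha * r_small + r_big
```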

  33. Random spheres (cont’d). Theorem (Miller, et al.): Let G = (V, E) be an (α, k) overlap graph in d dimensions, with n = |V|. Then there is a separator V_s such that: V = V_1 ∪ V_s ∪ V_2; |V_1|, |V_2| ≤ ((d + 1)/(d + 2))·n; and |V_s| = O(α · k^(1/d) · n^((d−1)/d)). In 2D, this matches Lipton & Tarjan. 33

  34. Random spheres: an algorithm. Choose a sphere S in R^d. The edges that S “cuts” form an edge separator E_s. Build V_s from E_s. Choose S “randomly,” so that it satisfies the theorem with high probability. 34

  35. Random spheres algorithm. Partition 1: all disks inside S. Partition 2: all disks outside S. [Figure: example with the sphere S and the resulting separator.] 35
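
A sketch of this classification step (simplified so that each disk is assigned by its center; the names are mine):

```python
import math

def split_by_sphere(centers, edges, s_center, s_radius):
    """Split nodes by a separating sphere S: inside vs. outside, plus the cut edges.

    centers: dict node -> coordinate tuple; edges: iterable of (u, v) pairs.
    The edges with endpoints on opposite sides of S form the edge separator E_s.
    """
    inside = {v for v, c in centers.items() if math.dist(c, s_center) <= s_radius}
    cut = [(u, v) for (u, v) in edges if (u in inside) != (v in inside)]
    return inside, set(centers) - inside, cut
```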

  36. Choosing a random sphere: stereographic projection. Given p in the plane, project to p′ on the sphere: (1) draw a line from p to the north pole; (2) p′ = the intersection of that line with the sphere. For p = (x, y): p′ = (2x, 2y, x² + y² − 1) / (x² + y² + 1). 36
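
The 2D case of that formula as a small function (assuming the unit sphere centered at the origin, with projection from the north pole, as in the formula above):

```python
def stereographic_lift(x, y):
    """Map p = (x, y) in the plane to p' on the unit sphere in R^3:
    p' = (2x, 2y, x^2 + y^2 - 1) / (x^2 + y^2 + 1).
    """
    t = x * x + y * y
    return (2 * x / (t + 1), 2 * y / (t + 1), (t - 1) / (t + 1))

# The origin of the plane maps to the south pole:
assert stereographic_lift(0.0, 0.0) == (0.0, 0.0, -1.0)
```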

  37. Random spheres separator algorithm (Miller, et al.). Do a stereographic projection from R^d to a sphere S in R^(d+1). Find a center-point of the projected points (center-point c: any hyperplane through c divides the points ~ evenly; there is a linear-programming algorithm, plus cheaper heuristics). Conformally map the points on the sphere: rotate the points about the origin so that the center-point lies at (0, 0, …, 0, r) for some r; then dilate the points (unproject, multiply by sqrt((1 − r)/(1 + r)), project). Net effect: maps the center-point to the origin and spreads the points around S. Pick a random plane through the origin; the intersection of the plane and the sphere S is a “circle.” Unproject that circle, yielding the desired circle C in R^d. Create V_s: node j is in V_s if α·D_j intersects C. 37

  38.-43. [Figure-only slides.]

  44. Summary: nodal coordinate-based algorithms. Other variations exist. The algorithms are efficient: O(# points). They implicitly assume nearest-neighbor connectivity, i.e., they ignore the edges! This is common for graphs from physical models. They give a good “initial guess” for other algorithms, but perform poorly on non-spatial graphs. 44

  45. Partitioning without nodal coordinates 45

  46. A coordinate-free algorithm: breadth-first search. Choose a root r and run BFS, which produces: a subgraph T of G (same nodes, subset of edges); T rooted at r; the level of each node = its distance from r. Edge types: tree edges, horizontal edges, inter-level edges. [Figure: BFS levels L0, 1, 2, 3, 4, with the nodes split into two sets N1 and N2 by level.] A bisection in this spirit is sketched below. 46
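
A minimal sketch of the bisection this suggests (my own simplification, not the exact algorithm from the lecture): run BFS from r, then take the lower levels, possibly splitting one level, until roughly half of the nodes are covered.

```python
from collections import deque

def bfs_level_bisection(adj, root):
    """Partition a connected graph into two halves by BFS level from `root`.

    adj: dict mapping node -> iterable of neighbors.
    Returns (N1, N2) as sets of nodes, with N1 holding the levels closest to root.
    """
    level = {root: 0}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in level:
                level[v] = level[u] + 1
                queue.append(v)

    N1, target = set(), len(level) // 2
    for node in sorted(level, key=level.get):   # nodes in order of increasing level
        if len(N1) >= target:
            break
        N1.add(node)
    return N1, set(level) - N1
```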

  47. [Figure-only slide.] 47

  48. Kernighan/Lin (1970): iteratively refine. Given an edge-weighted graph G = (V, E, W_E) and a partitioning V = A ∪ B with |A| = |B|: E_s = { (u, v) ∈ E : u ∈ A, v ∈ B }, and T ≡ cost(A, B) ≡ Σ_{e ∈ E_s} w(e). Find equal-sized subsets X ⊆ A and Y ⊆ B such that swapping them reduces the cost. We need the ability to quickly compute the cost change for many possible X, Y. 48

  49. K-L refinement: definitions. “External” and “internal” costs of a ∈ A, and their difference (E(b), I(b), D(b) are defined similarly for b ∈ B): E(a) ≡ Σ_{(a,b) ∈ E_s} w(a, b); I(a) ≡ Σ_{(a,a′) ∈ E, a′ ∈ A} w(a, a′); D(a) ≡ E(a) − I(a). 49
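
A direct transcription of these definitions (my own data layout: an undirected edge-weight map keyed by frozenset node pairs, no self-loops):

```python
def kl_d_values(weights, A, B):
    """Compute D(v) = E(v) - I(v) for every node in A ∪ B."""
    side = {v: 'A' for v in A}
    side.update({v: 'B' for v in B})
    D = {v: 0.0 for v in side}
    for edge, w in weights.items():
        u, v = tuple(edge)
        if side[u] == side[v]:      # internal edge: adds to I(u) and I(v)
            D[u] -= w
            D[v] -= w
        else:                       # cut edge: adds to E(u) and E(v)
            D[u] += w
            D[v] += w
    return D
```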

  50. Consider swapping two nodes. Swap X = {a} and Y = {b}: A′ = (A − {a}) ∪ {b}, B′ = (B − {b}) ∪ {a}. Cost change: T′ = T − (D(a) + D(b) − 2·w(a, b)) ≡ T − gain(a, b). 50
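
Using the D values from the previous sketch, the gain formula and a naive search for the best single swap (a full K-L pass would repeat this, lock the swapped pair, and update the D values incrementally):

```python
def kl_best_swap(weights, A, B):
    """Return (gain, a, b) for the swap maximizing gain(a, b) = D(a) + D(b) - 2*w(a, b)."""
    D = kl_d_values(weights, A, B)              # helper from the previous sketch
    best = (float('-inf'), None, None)
    for a in A:
        for b in B:
            w_ab = weights.get(frozenset((a, b)), 0.0)
            gain = D[a] + D[b] - 2.0 * w_ab
            if gain > best[0]:
                best = (gain, a, b)
    return best
```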
