
Data Reduction for Maximum Weight Matchings in Real-World Graphs - PowerPoint PPT Presentation



  1. Data Reduction for Maximum Weight Matchings in Real-World Graphs. Pitta Venkat, Indian Institute of Technology Madras, cs16b017@smail.iitm.ac.in. November 1, 2019.

  2. Definitions and Notations.
     Feedback edge set: a set of edges X in a graph G is a feedback edge set if G − X is a forest.
     Feedback edge number: the minimum cardinality over all feedback edge sets of G is the feedback edge number of G.
     Notation: w(G) denotes the weight of a maximum weight matching in G; w(M) denotes the weight of a matching M.
     Maximum Weight Matching problem. Input: a weighted undirected graph G and a target S. Output: yes iff there exists a matching M in G with w(M) ≥ S.
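
To make the feedback edge number concrete: it equals m − n + c for a graph with n vertices, m edges, and c connected components, since the edges outside any spanning forest form a feedback edge set and no smaller set suffices. Below is a minimal sketch that computes it with union-find; the function name and input format are illustrative choices, not taken from the slides.

```python
def feedback_edge_number(n, edges):
    """Feedback edge number of a graph with vertices 0..n-1.
    Counts the edges that close a cycle while building a spanning forest,
    which is exactly m - (n - c)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    k = 0                                   # edges closing a cycle so far
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            k += 1                          # u-v would close a cycle: a feedback edge
        else:
            parent[ru] = rv
    return k
```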

  3. Definitions and Notations (continued).
     Parameterized problem: a set of instances (I, k), where I ∈ Σ* and k ∈ N is the parameter.
     Equivalence of instances: instances (I, k) and (I1, k1) of a problem P are equivalent if (I, k) is a yes-instance of P iff (I1, k1) is a yes-instance of P.
     Kernelization and data reduction rules: a kernelization is an algorithm that, given an instance (I, k) of a parameterized problem P, computes in polynomial time an equivalent instance (I1, k1) such that |I1| + k1 ≤ f(k) for some computable function f. If f(k) ∈ k^O(1), we say P admits a polynomial kernel. Kernelization is generally achieved by applying polynomial-time executable data reduction rules. A rule R is correct if applying R to (I, k) yields an instance (I1, k1) equivalent to (I, k).

  4. Kernelization for Maximum Cardinality Matching in unweighted graphs.
     Reduction Rule 1: let v ∈ V. If deg(v) = 0, remove v from G. If deg(v) = 1, remove v and its neighbour and decrease S by 1.
     Reduction Rule 2: let v ∈ V. If deg(v) = 2 and u, w are the neighbours of v, remove v, merge u and w, and decrease S by 1.
     Theorem 1: Maximum Cardinality Matching admits a linear-time computable linear-size kernel with respect to the parameter feedback edge number.
     Consequence of Theorem 1: the total time taken to compute a maximum cardinality matching is O(m + n + k^1.5).
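
As an illustration of how the degree-0/degree-1 part of Rule 1 can be applied exhaustively, here is a minimal sketch under an assumed adjacency-set representation; the degree-2 merge rule (Rule 2) is omitted for brevity, and the function name is a made-up choice for this example.

```python
from collections import deque

def reduce_unweighted(adj, S):
    """Exhaustively apply the degree-0 / degree-1 rule (Rule 1).
    adj: dict vertex -> set of neighbours (modified in place)
    S:   target matching size; the reduced target is returned."""
    queue = deque(v for v in adj if len(adj[v]) <= 1)
    while queue:
        v = queue.popleft()
        if v not in adj:
            continue                      # v was already deleted
        if len(adj[v]) == 0:              # isolated vertex: just delete it
            del adj[v]
        elif len(adj[v]) == 1:            # degree-1 vertex: match it with its neighbour
            (u,) = adj[v]
            S -= 1                        # the edge uv goes into the matching
            for x in adj[u]:              # u is now matched, so delete u as well
                if x != v:
                    adj[x].discard(u)
                    if len(adj[x]) <= 1:
                        queue.append(x)
            del adj[v], adj[u]
    return S
```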

  5. Kernelization for Maximum Weight Matching.
     Reduction Rule 1: let v ∈ V; if deg(v) = 0, remove v. Likewise, let e ∈ E; if w(e) = 0, remove e.
     Reduction Rule 2: let v ∈ V with deg(v) = 1 and neighbour u. For every edge e ∈ E incident to u, set its weight to max{0, w(e) − w(uv)}; then remove v and decrease S by w(uv).
     Lemma 1: Rules 1 and 2 are correct.
     Lemma 2: Rules 1 and 2 can be applied exhaustively in linear time.
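
For concreteness, here is one application of the weighted degree-1 rule as stated above; the data layout (adjacency sets plus a weight map keyed by frozensets) and the function name are assumptions made for this sketch, and Rule 1 (deleting degree-0 vertices and zero-weight edges) is left out.

```python
def apply_weighted_degree_one_rule(adj, w, S, v):
    """Apply Rule 2 once to a degree-1 vertex v with neighbour u.
    adj: dict vertex -> set of neighbours; w: dict frozenset({x, y}) -> weight.
    Returns the decreased target S."""
    (u,) = adj[v]                          # v has exactly one neighbour
    wuv = w[frozenset((u, v))]
    for x in adj[u]:                       # every other edge at u loses w(uv), clamped at 0
        if x != v:
            e = frozenset((u, x))
            w[e] = max(0, w[e] - wuv)
    adj[u].discard(v)                      # remove v from the graph
    del adj[v]
    del w[frozenset((u, v))]
    return S - wuv                         # decrease the target by w(uv)
```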

  6. Proof of Lemma 1.
     Correctness of Rule 1 is immediate, since applying Rule 1 does not affect any matching.
     Correctness of Rule 2: let v be a vertex with deg(v) = 1 and uv ∈ E, and let G1 be the graph obtained by applying Rule 2. Suppose there is a matching M in G with w(M) ≥ S. Let X be the set of edges incident to u or v and let M1 = M − X; M1 is a matching in G1 of the same weight. If uv ∈ M, then w_G1(M1) ≥ S − w(uv). If some e ∈ X − {uv} is in M, then w_G1(M1 + e) = w(M) − w(e) + max{0, w(e) − w(uv)} ≥ S − w(uv). Hence there is a matching M' in G1 with w_G1(M') ≥ S − w(uv). The reverse direction follows by a similar argument. Hence Rule 2 is correct.
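
Written out, the forward direction of the case analysis above amounts to the following inequalities (my LaTeX rendering of the slide's argument, assuming nonnegative edge weights):

```latex
% Forward direction of the correctness proof of Rule 2:
% v has degree 1, u is its neighbour, M is a matching in G with w(M) >= S.
\begin{align*}
uv \in M:\quad
  & w_{G_1}(M \setminus \{uv\}) = w(M) - w(uv) \ge S - w(uv),\\
e \in M,\ e \text{ incident to } u,\ e \ne uv:\quad
  & w_{G_1}(M) = w(M) - w(e) + \max\{0,\, w(e) - w(uv)\} \ge w(M) - w(uv) \ge S - w(uv),\\
\text{no edge of } M \text{ touches } u \text{ or } v:\quad
  & w_{G_1}(M) = w(M) \ge S \ge S - w(uv).
\end{align*}
```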

  7. Proof of Lemma 2.
     Rule 1 can be applied exhaustively in linear time by collecting all degree-0 vertices and all zero-weight edges in one pass.
     For Rule 2, the following algorithm applies it exhaustively in linear time. Keep a counter c(v) for each vertex, initialised to 0; at any point the current weight of an edge xy is w(xy) − c(x) − c(y). Each time Rule 2 is applied to a degree-1 vertex x with neighbour y, decrease S by max{0, w(xy) − c(x) − c(y)} and set c(y) = c(y) + max{0, w(xy) − c(x) − c(y)}. First collect all degree-1 vertices in O(n + m) time, then go through the collected vertices and apply Rule 2 (adding any vertex whose degree drops to 1). Afterwards, update the weight of every remaining edge to max{0, w(xy) − c(x) − c(y)}. Correctness of the algorithm is easy to verify.
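
Below is a minimal sketch of the counter-based procedure described above, under assumed data structures (adjacency sets and a weight map keyed by frozensets); the names are illustrative, not from the slides.

```python
from collections import defaultdict

def apply_rule2_exhaustively(adj, w, S):
    """Exhaustively apply the weighted degree-1 rule using lazy counters.
    adj: dict vertex -> set of neighbours; w: dict frozenset({x, y}) -> weight.
    Invariant: the current weight of edge xy is w[xy] - c[x] - c[y]."""
    c = defaultdict(int)                       # counters c(v), all initially 0
    stack = [v for v in adj if len(adj[v]) == 1]
    while stack:
        x = stack.pop()
        if x not in adj or len(adj[x]) != 1:   # degree may have changed meanwhile
            continue
        (y,) = adj[x]
        e = frozenset((x, y))
        d = max(0, w[e] - c[x] - c[y])         # current weight of the edge xy
        S -= d                                 # decrease the target by w(xy)
        c[y] += d                              # record the subtraction at y lazily
        adj[y].discard(x)                      # remove the degree-1 vertex x
        del adj[x]
        del w[e]
        if len(adj[y]) == 1:                   # y may now be a degree-1 vertex itself
            stack.append(y)
    for e in list(w):                          # materialise the lazy subtractions
        x, y = tuple(e)
        w[e] = max(0, w[e] - c[x] - c[y])
    return S
```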

  8. Definitions and reduction rules for degree-2 vertices.
     Maximal path: let G be a graph. A path P = v0 v1 ... vl is a maximal path in G if l ≥ 3, deg(v1) = deg(v2) = ... = deg(v(l−1)) = 2, deg(v0) ≠ 2, and deg(vl) ≠ 2.
     Pending cycle: let G be a graph. A cycle C = c0 c1 ... cl is a pending cycle if at most one vertex of the cycle has degree greater than 2.
     Reduction Rule 3: let G be a graph and C a pending cycle in G with u ∈ C of deg(u) ≥ 3. Replace the cycle by an edge uz to a new vertex z with w(uz) = w(C) − w(C − u), and decrease S by w(C − u).
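
Rules 3 and 4 need quantities such as w(C − u) and w(P − u − v), i.e. maximum-weight matchings on paths (C − u is a path). These can be computed in linear time by a simple dynamic program; a minimal sketch, with an illustrative function name:

```python
def max_weight_path_matching(weights):
    """Maximum weight of a matching on a path whose consecutive edges
    have the given list of weights.
    Recurrence: best[i] = max(best[i-1], best[i-2] + weights[i])."""
    prev2, prev1 = 0, 0        # best over the first i-2 and i-1 edges
    for we in weights:
        prev2, prev1 = prev1, max(prev1, prev2 + we)
    return prev1
```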

  9. Reduction rules and lemmas.
     Reduction Rule 4: let G be a graph and P a maximal path with endpoints u, v. Remove all vertices of P except u and v, add a new vertex z, and set w(uz) = w(P − v) − w(P − u − v), w(vz) = w(P − u) − w(P − u − v), and w(uv) = max{w(uv), w(P) − w(P − u − v)}; decrease S by w(P − u − v).
     Lemma 3: the number of maximal paths in a graph is at most the feedback edge number.
     Lemma 4: Reduction Rule 3 does not increase the feedback edge number, and Reduction Rule 4 can at most double it.
     Lemma 5: Rules 3 and 4 are correct and can be applied exhaustively in linear time.
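
Building on the path DP above, the replacement weights of Rule 4 can be computed as follows; the argument names and the handling of a pre-existing uv edge are assumptions made for this sketch.

```python
def rule4_replacement_weights(path_weights, w_uv_existing=None):
    """Replacement weights for Rule 4 on a maximal path P = u, p1, ..., pt, v,
    given the list of edge weights along P (from u to v)."""
    wP = max_weight_path_matching(path_weights)                  # w(P)
    wP_minus_u = max_weight_path_matching(path_weights[1:])      # w(P - u)
    wP_minus_v = max_weight_path_matching(path_weights[:-1])     # w(P - v)
    wP_minus_uv = max_weight_path_matching(path_weights[1:-1])   # w(P - u - v)

    w_uz = wP_minus_v - wP_minus_uv
    w_vz = wP_minus_u - wP_minus_uv
    w_uv = wP - wP_minus_uv
    if w_uv_existing is not None:            # keep an existing uv edge if it is heavier
        w_uv = max(w_uv_existing, w_uv)
    return w_uz, w_vz, w_uv, wP_minus_uv     # decrease S by the last value
```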

  10. Kernel size for Maximum Weight Matching.
     The kernelization for Maximum Weight Matching consists of exhaustively applying Rule 1, Rule 3, Rule 4, and Rule 2, in that order. Let G1 be the final graph and k1 its feedback edge number; then k1 ≤ 2k, because by Lemma 4 the rules at most double the feedback edge number. Let X be a minimum-size feedback edge set of G1 and consider the forest G1 − X. Partition its vertices into V1, V2, V3, the vertices of degree 1, degree 2, and degree ≥ 3 in G1 − X, respectively. Then |V3| < |V1|, since |V1| + 2|V2| + 3|V3| ≤ 2(|V1| + |V2| + |V3|) − 2. G1 has no degree-1 vertices, so every degree-1 vertex of G1 − X was created by removing X; hence |V1| ≤ 4k. Every degree-2 vertex of G1 − X is incident to X or adjacent to a vertex of V3 or to one of the at most 4k endpoints of X, so |V2| < 12k. In total, G1 has fewer than 4k + 4k + 12k = 20k vertices and fewer than 20k + 2k edges. So the kernel has at most 20k vertices and at most 22k edges.
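
The degree-counting step behind |V3| < |V1| spelled out (my LaTeX rendering; it only uses that G1 − X is a forest):

```latex
% G_1 - X is a forest, so it has at most |V_1| + |V_2| + |V_3| - 1 edges:
\begin{align*}
|V_1| + 2|V_2| + 3|V_3|
  \;\le\; \sum_{v} \deg_{G_1 - X}(v)
  \;=\; 2\,|E(G_1 - X)|
  \;\le\; 2\bigl(|V_1| + |V_2| + |V_3|\bigr) - 2,
\end{align*}
which gives $|V_3| \le |V_1| - 2 < |V_1|$.
```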

  11. Correctness of Rules 3 and 4.
     Consider the instances before and after applying the reduction rule. Starting from a matching in G, remove the edges in which the two instances differ and add suitable replacement edges; this yields the relations w(G1) ≥ w(G) − w(C − u) and w(G1) ≥ w(G) − w(P − u − v). Starting from a matching in G1, remove the differing edges and add suitable replacement edges to obtain w(G) ≥ w(G1) + w(C − u) and w(G) ≥ w(G1) + w(P − u − v). Hence the two instances are equivalent.
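
Combining the two directions for Rule 3 (the argument for Rule 4 is the same with w(P − u − v) in place of w(C − u)):

```latex
\begin{align*}
w(G_1) \ge w(G) - w(C - u)
\quad\text{and}\quad
w(G) \ge w(G_1) + w(C - u)
\;\Longrightarrow\;
w(G_1) = w(G) - w(C - u),
\end{align*}
so $(G, S)$ is a yes-instance iff $(G_1,\, S - w(C - u))$ is.
```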

  12. Conclusion.
     Comparing Kolmogorov's algorithm with and without the data reductions, the average speedup is 3800% on unweighted graphs and 30% on weighted graphs. This is because the reductions shrink weighted graphs by only about 50% on average, whereas unweighted graphs shrink by about 80%. Hence the weighted-graph kernelization is not as effective as the unweighted-graph kernelization.
