Co-clustering documents and words using Bipartite Spectral Graph Partitioning (PowerPoint PPT Presentation)

Co-clustering documents and words using Bipartite Spectral Graph Partitioning. Inderjit S. Dhillon. Presenter: Lei Tang, 16th April 2006. Outline: Introduction; Review of Spectral Graph Partitioning; Bipartite Extension; Summary.



  2. Introduction: Problem. Past work focuses on clustering along a single axis (either documents or words). Document clustering: agglomerative clustering, k-means, LSA, self-organizing maps, multidimensional scaling, etc. Word clustering: distributional clustering, information bottleneck, etc. Co-clustering: cluster words and documents simultaneously!


  4. Introduction: Bipartite Graph Model. Adjacency matrix: M_{ij} = E_{ij} if there is an edge {i, j}, and 0 otherwise. Cut between vertex sets: cut(V_1, V_2) = \sum_{i \in V_1, j \in V_2} M_{ij}. The bipartite graph is G = (D, W, E), where D: docs, W: words, and E: edges representing a word occurring in a doc. Its adjacency matrix is M = \begin{pmatrix} 0 & A \\ A^T & 0 \end{pmatrix}, with A of size |D| \times |W|: there are no links between documents and no links between words.

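The block structure of M can be built directly from the |D| x |W| word-document matrix A. A minimal numpy sketch (the 2 docs x 3 words matrix A below is made up for illustration):

```python
import numpy as np

# Hypothetical word-document matrix: A[i, j] = weight of word j in
# document i (e.g. a term frequency).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

d, w = A.shape
# Bipartite adjacency matrix: the diagonal blocks are zero because
# there are no doc-doc or word-word edges.
M = np.block([[np.zeros((d, d)), A],
              [A.T, np.zeros((w, w))]])

# Adjacency matrix of an undirected graph is symmetric.
assert np.allclose(M, M.T)
```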

  6. Bipartite Extension: Duality of word and document clustering. Disjoint document clusters D_1, D_2, ..., D_k and disjoint word clusters W_1, W_2, ..., W_k. Idea: document clusters determine word clusters; word clusters in turn determine (better) document clusters. (Seems familiar? Recall HITS: authority/hub computation.) The "best" partition is the minimum k-way cut of the bipartite graph: cut(W_1 \cup D_1, ..., W_k \cup D_k) = \min_{V_1, ..., V_k} cut(V_1, ..., V_k). Solution: spectral graph partitioning.

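The k-way cut objective above can be evaluated directly from the adjacency matrix: sum the edge weights that run between different parts. A small sketch (the 4-vertex graph and the partition are made up):

```python
import numpy as np

def cut(M, parts):
    """Sum of edge weights M[i, j] with i and j in different parts.

    `parts` is a list of index arrays forming a disjoint partition of
    the vertices; each undirected edge is counted once.
    """
    total = 0.0
    for a in range(len(parts)):
        for b in range(a + 1, len(parts)):
            total += M[np.ix_(parts[a], parts[b])].sum()
    return total

# Toy symmetric adjacency matrix on 4 vertices.
M = np.array([[0, 1, 0, 2],
              [1, 0, 3, 0],
              [0, 3, 0, 1],
              [2, 0, 1, 0]], dtype=float)

# cut({0,1}, {2,3}) = M[0,2] + M[0,3] + M[1,2] + M[1,3] = 0 + 2 + 3 + 0
print(cut(M, [np.array([0, 1]), np.array([2, 3])]))  # -> 5.0
```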

  8. Review: Minimum Cut. 2-partition problem: partition a graph (not necessarily bipartite) into two parts with minimum between-cluster weight. This amounts to finding a minimum cut separating the graph into two parts. Drawback: it tends to find unbalanced cuts, because the weight of a cut is directly proportional to the number of edges in the cut.


  10. Review: Weighted Cut. An effective heuristic: WeightedCut(A, B) = cut(A, B)/weight(A) + cut(A, B)/weight(B). If weight(A) = |A|, this is the Ratio-cut; if weight(A) = cut(A, B) + within(A), this is the Normalized-cut. Example (for the graph pictured on the original slide): cut(A, B) = w(3,4) + w(2,4) + w(2,5); weight(A) = w(1,3) + w(1,2) + w(2,3) + w(3,4) + w(2,4) + w(2,5); weight(B) = w(4,5) + w(3,4) + w(2,4) + w(2,5).

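The two weighting choices can be compared numerically. A sketch on an assumed toy graph, taking degree weights for the normalized cut (so weight(A) sums the degrees in A, which counts each internal edge from both endpoints plus the cut edges):

```python
import numpy as np

def weighted_cut(M, A, B, weights):
    """cut(A,B)/weight(A) + cut(A,B)/weight(B) for a 2-way partition."""
    c = M[np.ix_(A, B)].sum()
    return c / weights[A].sum() + c / weights[B].sum()

# Toy symmetric adjacency matrix on 4 vertices.
M = np.array([[0, 2, 1, 0],
              [2, 0, 0, 1],
              [1, 0, 0, 3],
              [0, 1, 3, 0]], dtype=float)
A, B = np.array([0, 1]), np.array([2, 3])

# Ratio cut: unit vertex weights, so weight(A) = |A|.
ratio = weighted_cut(M, A, B, np.ones(4))
# Normalized cut: vertex weight = degree.
norm = weighted_cut(M, A, B, M.sum(axis=1))
```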

  12. Review: Eigenvectors. Solution: finding the minimum weighted cut boils down to solving a generalized eigenvalue problem Lz = \lambda W z, where L is the Laplacian matrix, W is a diagonal weight matrix, and z encodes the cut.
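The generalized problem Lz = \lambda W z can be reduced to a standard symmetric eigenproblem by conjugating with W^{-1/2}. A numpy sketch under the common choice W = diag(degrees) (the graph is made up):

```python
import numpy as np

# Toy symmetric adjacency matrix on 4 vertices.
M = np.array([[0, 2, 1, 0],
              [2, 0, 0, 1],
              [1, 0, 0, 3],
              [0, 1, 3, 0]], dtype=float)
deg = M.sum(axis=1)
L = np.diag(deg) - M            # Laplacian
W = np.diag(deg)                # diagonal weight matrix (here: degrees)

# Solve L z = lambda W z by symmetrizing:
# (W^-1/2 L W^-1/2) y = lambda y, with z = W^-1/2 y.
Wihalf = np.diag(1.0 / np.sqrt(deg))
vals, vecs = np.linalg.eigh(Wihalf @ L @ Wihalf)  # ascending eigenvalues
z = Wihalf @ vecs[:, 1]         # second-smallest eigenvector encodes the cut

partition = z >= 0              # sign pattern bipartitions the vertices
```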

  13. Review: Laplacian matrix. The Laplacian of G(V, E): L_{ij} = \sum_k E_{ik} if i = j; -E_{ij} if i \ne j and there is an edge {i, j}; 0 otherwise. Properties:
  - L = D - M, where M is the adjacency matrix and D is the diagonal "degree" matrix with D_{ii} = \sum_k E_{ik}.
  - L = I_G I_G^T, where I_G is the |V| \times |E| incidence matrix: the column for edge (i, j) is 0 except for the i-th and j-th entries, which are \sqrt{E_{ij}} and -\sqrt{E_{ij}} respectively.
  - L \mathbf{1} = 0.
  - x^T L x = \sum_{(i,j) \in E} E_{ij} (x_i - x_j)^2.
  - (\alpha x + \beta \mathbf{1})^T L (\alpha x + \beta \mathbf{1}) = \alpha^2 x^T L x.
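These properties are easy to verify numerically. A sketch on an assumed random weighted graph:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random symmetric weighted adjacency matrix with zero diagonal.
E = np.triu(rng.random((5, 5)), k=1)
M = E + E.T

D = np.diag(M.sum(axis=1))      # diagonal "degree" matrix
L = D - M                       # Laplacian

one = np.ones(5)
assert np.allclose(L @ one, 0)  # L 1 = 0: every row sums to zero

# x^T L x = sum over edges of E_ij (x_i - x_j)^2
x = rng.random(5)
quad = sum(M[i, j] * (x[i] - x[j]) ** 2
           for i in range(5) for j in range(i + 1, 5))
assert np.isclose(x @ L @ x, quad)

# Shift/scale invariance: (a x + b 1)^T L (a x + b 1) = a^2 x^T L x
a, b = 2.0, -3.0
assert np.isclose((a * x + b) @ L @ (a * x + b), a**2 * (x @ L @ x))
```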

  14. Review: Eigenvectors. Let p be a vector denoting a cut: p_i = +1 if i \in A, and -1 if i \in B. Then p^T L p = \sum_{(i,j) \in E} E_{ij} (p_i - p_j)^2 = 4 \, cut(A, B). Introduce another vector q such that q_i = +\sqrt{w_B / w_A} if i \in A, and -\sqrt{w_A / w_B} if i \in B, where w_A = weight(A) and w_B = weight(B). Then q = \frac{w_A + w_B}{2\sqrt{w_A w_B}} p + \frac{w_B - w_A}{2\sqrt{w_A w_B}} \mathbf{1}, so (since L \mathbf{1} = 0) q^T L q = \frac{(w_A + w_B)^2}{4 w_A w_B} p^T L p = \frac{(w_A + w_B)^2}{w_A w_B} cut(A, B).

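Both identities on this slide can be checked numerically. A sketch assuming unit vertex weights (so w_A = |A| and w_B = |B|) on a made-up 4-vertex graph:

```python
import numpy as np

# Toy symmetric adjacency matrix on 4 vertices.
M = np.array([[0, 2, 1, 0],
              [2, 0, 0, 1],
              [1, 0, 0, 3],
              [0, 1, 3, 0]], dtype=float)
L = np.diag(M.sum(axis=1)) - M

A, B = [0], [1, 2, 3]
cut = M[np.ix_(A, B)].sum()     # cut(A, B)

# p_i = +1 on A, -1 on B.
p = np.array([+1.0, -1.0, -1.0, -1.0])
assert np.isclose(p @ L @ p, 4 * cut)   # p^T L p = 4 cut(A, B)

# Unit vertex weights: w_A = |A|, w_B = |B|.
wA, wB = 1.0, 3.0
q = np.where(p > 0, np.sqrt(wB / wA), -np.sqrt(wA / wB))
assert np.isclose(q @ L @ q, (wA + wB) ** 2 / (wA * wB) * cut)
```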

  16. Review: Eigenvectors (cont.). Properties of q: q^T W \mathbf{1} = 0, and q^T W q = weight(V) = w_A + w_B. Then \frac{q^T L q}{q^T W q} = \frac{(w_A + w_B)^2}{w_A w_B} cut(A, B) \cdot \frac{1}{w_A + w_B} = \frac{w_A + w_B}{w_A w_B} cut(A, B) = \frac{cut(A, B)}{weight(A)} + \frac{cut(A, B)}{weight(B)} = WeightedCut(A, B).
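The chain of equalities above can be confirmed numerically. A sketch with unit vertex weights (so W = I and weight(A) = |A|) on an assumed toy graph:

```python
import numpy as np

# Toy symmetric adjacency matrix on 4 vertices.
M = np.array([[0, 2, 1, 0],
              [2, 0, 0, 1],
              [1, 0, 0, 3],
              [0, 1, 3, 0]], dtype=float)
L = np.diag(M.sum(axis=1)) - M
W = np.eye(4)                   # unit vertex weights: weight(A) = |A|

A, B = [0], [1, 2, 3]
cut = M[np.ix_(A, B)].sum()
wA, wB = float(len(A)), float(len(B))

# q_i = +sqrt(wB/wA) on A, -sqrt(wA/wB) on B.
q = np.array([np.sqrt(wB / wA) if i in A else -np.sqrt(wA / wB)
              for i in range(4)])

assert np.isclose(q @ W @ np.ones(4), 0)          # q^T W 1 = 0
assert np.isclose(q @ W @ q, wA + wB)             # q^T W q = weight(V)
rayleigh = (q @ L @ q) / (q @ W @ q)
assert np.isclose(rayleigh, cut / wA + cut / wB)  # = WeightedCut(A, B)
```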

