
Empirical Comparison of Approximate Inference Algorithms for Networked Data
Prithviraj Sen, Lise Getoor
Department of Computer Science, University of Maryland, College Park
Workshop on Open Problems in Statistical Relational Learning, 2006


  1. Empirical Comparison of Approximate Inference Algorithms for Networked Data. Prithviraj Sen, Lise Getoor. Department of Computer Science, University of Maryland, College Park. Workshop on Open Problems in Statistical Relational Learning, 2006.

  2. Introduction. There has been recent, widespread interest in structured classification, and numerous approximate inference algorithms for networked data exist. We empirically compare three of the most popular: the Iterative Classification Algorithm (ICA), Mean Field Relaxation Labeling (MF), and Loopy Belief Propagation (LBP).

  3. Parameters of Interest. Performance on random graph data. Effects of noise in attribute values. Effects of noise in correlations across links. Effects of varying link density. Effects of different link patterns.

  4. Iterative Classification Algorithm (ICA). A simple, greedy, iterative algorithm, introduced by Besag [Besag, 1986]:

  $b_i(y) \leftarrow \alpha \, \phi_i(y) \exp\big\{ \textstyle\sum_{Y_j \in \mathcal{N}(Y_i)} w_{y, y_j} \big\}$

  In each iteration, for each node, ICA looks at the class labels in its neighbourhood.
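The ICA update above can be sketched in NumPy. This is a hypothetical illustration, not the authors' implementation: `phi`, `neighbors`, and `w` are assumed data structures, and since each node takes a hard argmax label, the normalizer α drops out.

```python
import numpy as np

def ica(phi, neighbors, w, n_iters=10):
    """Iterative Classification Algorithm (illustrative sketch).

    phi       -- (n_nodes, n_labels) local evidence phi_i(y)
    neighbors -- list of neighbor-index lists, N(Y_i)
    w         -- (n_labels, n_labels) edge weights w[y, y']
    """
    n, _ = phi.shape
    labels = phi.argmax(axis=1)              # initialize from local evidence alone
    for _ in range(n_iters):
        for i in range(n):
            # log-score each candidate label y against current neighbor labels y_j
            score = np.log(phi[i]) + sum(w[:, labels[j]] for j in neighbors[i])
            labels[i] = score.argmax()       # greedy hard assignment
    return labels
```

With a homophilous weight matrix, a node with weak local evidence gets pulled toward its neighbour's label.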

  5. Mean Field Relaxation Labeling (MF). A soft version of ICA; many other versions exist. Discovered by the vision community [Hummel &amp; Zucker, 1983]:

  $b_i(y) \leftarrow \alpha \, \phi_i(y) \exp\big\{ \textstyle\sum_{Y_j \in \mathcal{N}(Y_i),\, y'} w_{y, y'}\, b_j(y') \big\}$

  In each iteration, for each node, MF looks at each neighbour's label distribution.
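The MF update can likewise be sketched as follows (hypothetical names, same assumed structures as before; here α is just the per-node normalizer):

```python
import numpy as np

def mean_field(phi, neighbors, w, n_iters=20):
    """Mean Field Relaxation Labeling (illustrative sketch)."""
    b = phi / phi.sum(axis=1, keepdims=True)   # initialize beliefs from evidence
    for _ in range(n_iters):
        for i in range(len(b)):
            # expected edge compatibility under neighbors' current beliefs:
            # sum over j, y' of w[y, y'] * b_j(y')
            s = sum(w @ b[j] for j in neighbors[i])
            unnorm = phi[i] * np.exp(s)
            b[i] = unnorm / unnorm.sum()       # alpha = normalization
    return b
```

The difference from ICA is visible in the inner sum: neighbours contribute their full label distribution `b[j]` rather than a single hard label.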

  6. Loopy Belief Propagation (LBP). A message-passing algorithm that attempts to stop sending messages around loops. Discovered by the iterative decoding community [Kschischang &amp; Frey, 1998; McEliece et al., 1998; Kschischang et al., 2001]:

  $m_{i \to j}(y) \leftarrow \alpha \textstyle\sum_{y'} \phi_i(y')\, e^{w_{y, y'}} \prod_{Y_k \in \mathcal{N}(Y_i) \setminus Y_j} m_{k \to i}(y')$

  $b_i(y) \leftarrow \alpha \, \phi_i(y) \prod_{Y_j \in \mathcal{N}(Y_i)} m_{j \to i}(y)$

  Messages are computed without considering the destination node's own message.
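The two LBP equations can be sketched as follows (a minimal synchronous-update version with assumed data structures; schedules and damping are omitted):

```python
import numpy as np

def loopy_bp(phi, edges, w, n_iters=20):
    """Loopy Belief Propagation (illustrative sketch).

    phi   -- (n_nodes, n_labels) local evidence, float array
    edges -- list of undirected (i, j) pairs
    w     -- (n_labels, n_labels) edge weights
    """
    n, k = phi.shape
    nbrs = [[] for _ in range(n)]
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    # one message per direction, initialized uniform
    msg = {}
    for i, j in edges:
        msg[(i, j)] = np.ones(k) / k
        msg[(j, i)] = np.ones(k) / k
    for _ in range(n_iters):
        new = {}
        for (i, j) in msg:
            # product of incoming messages to i, excluding the one from j
            prod = phi[i].copy()
            for t in nbrs[i]:
                if t != j:
                    prod = prod * msg[(t, i)]
            m = np.exp(w) @ prod          # sum over y' of e^{w[y,y']} * prod[y']
            new[(i, j)] = m / m.sum()     # alpha = normalization
        msg = new
    b = phi.copy()
    for (i, j) in msg:
        b[j] = b[j] * msg[(i, j)]         # b_j(y) = alpha * phi_j(y) * prod of m_{i->j}(y)
    return b / b.sum(axis=1, keepdims=True)
```

Note how the message `m[(i, j)]` excludes `msg[(j, i)]` from its product, which is exactly the "without considering the destination node's message" clause above.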

  7. Synthetic Graph Generation Algorithm. Based on the power-law graph generation algorithm of [Bollobas et al., 2003]:
  1: Begin with a single-node graph G.
  2: repeat
  3:   With probability α, introduce an edge in G.
  4:   With probability 1 − α, introduce a new node with a randomly sampled label, and connect the new node to G.
  5: until size of G = n
  6: Generate attributes for all nodes.
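The loop above can be sketched in Python. This is a simplified stand-in for the paper's generator: attribute generation is omitted, and new nodes attach uniformly at random here (the label- and degree-based attachment from the next slide would replace `random.randrange`).

```python
import random

def generate_graph(n, alpha, labels=(0, 1)):
    """Grow a graph node by node, in the spirit of [Bollobas et al., 2003]."""
    node_labels = [random.choice(labels)]   # 1: begin with a single node
    edges = []
    while len(node_labels) < n:             # 2-5: repeat until |G| = n
        if random.random() < alpha and len(node_labels) > 1:
            # 3: with probability alpha, add an edge between existing nodes
            u, v = random.sample(range(len(node_labels)), 2)
            edges.append((u, v))
        else:
            # 4: with probability 1 - alpha, add a labeled node and connect it
            new = len(node_labels)
            node_labels.append(random.choice(labels))
            target = random.randrange(new)  # uniform stand-in for pref. attachment
            edges.append((new, target))
    return node_labels, edges
```

Since every new node is connected on arrival, the generated graph is connected by construction.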

  8. Preferential attachment scheme. When choosing the node to link a new node ν to: with probability ρ, choose a node with the same label; with probability 1 − ρ, choose a node with a different label. Preference is given to nodes with high degree.
  [Figure: log-log degree distribution (Frequency vs. Degree), fit by exp(12 − 3·log(x)).]
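The attachment choice could be sketched as below. The function name, the degree+1 smoothing, and the fallback for an empty pool are my assumptions; the slide only specifies the label split (ρ vs. 1 − ρ) and the preference for high-degree nodes.

```python
import random

def choose_target(new_label, node_labels, degree, rho):
    """Pick the existing node to link a new node to (illustrative sketch)."""
    want_same = random.random() < rho       # prob. rho: same label, else different
    pool = [i for i, l in enumerate(node_labels) if (l == new_label) == want_same]
    if not pool:                            # assumed fallback if the pool is empty
        pool = list(range(len(node_labels)))
    # preferential attachment: sample proportionally to degree
    # (+1 so isolated nodes remain reachable -- an assumption)
    weights = [degree[i] + 1 for i in pool]
    return random.choices(pool, weights=weights, k=1)[0]
```

Degree-proportional sampling is what produces the power-law degree distribution shown in the figure.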

  9. Experimental setup. We performed 3-fold cross-validation; the metric is average classification accuracy. We compared three models — ICA, MF, LBP — on binary-class data. Parameters of interest: α controls the number of edges; ρ controls the degree of correlation across edges; ω controls the noise in attribute values.

  10. Varying Correlations across Links. [Figure: three panels, "Varying link noise with α = 0.1 / 0.3 / 0.5"; avg. accuracy (%) vs. ρ ∈ [0.5, 1] for LBP, MF, ICA.]

  11. Varying Correlations across Links – contd. [Figure: three panels, "Varying link noise with α = 0.1 / 0.3 / 0.5"; avg. accuracy (%) vs. ρ ∈ [0, 0.5] for LBP, MF, ICA.]

  12. Varying Attribute Noise. [Figure: three panels, each titled "Varying attr. noise with α = 0.1"; avg. accuracy (%) vs. ω ∈ [0, 1] for LBP, MF, ICA.]

  13. Varying Attribute Noise – contd. [Figure: three panels, "Varying attr. noise with α = 0.1 / 0.3 / 0.5"; avg. accuracy (%) vs. ω ∈ [0, 1] for LBP, MF, ICA.]

  14. Effect of different link patterns. In the case of homophily, or perfect assortative mixing (figure on left), the generated graphs form densely connected clusters, introducing closed loops that hamper LBP and MF.

  15. Conclusion. We empirically compared three of the most popular approximate inference techniques for networked data. MF tends to get stuck in local minima in a variety of cases, e.g., high link correlation and high link density. LBP faces issues in the presence of high link density and of a specific link pattern known as homophily, or perfect assortative mixing, but otherwise performs well; moreover, we found that LBP's convergence does not necessarily indicate good results. ICA is the most consistent of the three approaches, returning reasonable results under a wide variety of conditions.

  16. References
  J. Besag, "On the statistical analysis of dirty pictures," Journal of the Royal Statistical Society, 1986.
  R. Hummel and S. Zucker, "On the foundations of relaxation labeling processes," IEEE Trans. on Pattern Analysis and Machine Intelligence, 1983.
  F. R. Kschischang and B. J. Frey, "Iterative decoding of compound codes by probability propagation in graphical models," IEEE Journal on Selected Areas in Communications, 1998.
  F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, "Factor graphs and the sum-product algorithm," IEEE Trans. on Information Theory, 2001.
  R. J. McEliece, D. J. C. MacKay, and J.-F. Cheng, "Turbo decoding as an instance of Pearl's belief propagation algorithm," IEEE Journal on Selected Areas in Communications, 1998.
  B. Bollobás, C. Borgs, J. T. Chayes, and O. Riordan, "Directed scale-free graphs," in Proceedings of the ACM-SIAM Symposium on Discrete Algorithms, 2003.
