Graph Convolutional Networks


  1. Thanks for joining me for a presentation on ... Graph Convolutional Networks

  2. Your Presenter: Christian McDaniel, Data Scientist & Software Engineer
     Graph Convolutional Networks: Background
     Graph Convolutional Networks are both simple and complex: they borrow from multiple domains to arrive at an elegant analysis algorithm.
     - Deep Learning - Convolutional Neural Networks
     - Spectral Graph Clustering - the Graph Laplacian
     - Signal Processing - the Fourier Transform

  3. Graph Convolutional Networks: Background
     Imagine a dataset:
     - made up of many points,
     - where each point can be described by p features and falls into one of c classes,
     - and these points are interconnected.
     (A toy example of such a dataset is sketched below.)
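To make the setup concrete, here is a minimal sketch of such a dataset in NumPy; the graph, feature values, and labels are all made up for illustration and are not from the slides:

```python
import numpy as np

# n = 4 interconnected points, p = 3 features each, c = 2 classes.
A = np.array([[0, 1, 1, 0],        # adjacency matrix: A[i, j] = 1 if
              [1, 0, 1, 0],        # points i and j are connected
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

X = np.random.rand(4, 3)           # one p-dimensional feature vector per point
y = np.array([0, 0, 1, 1])         # each point falls into one of c classes
```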

  4. Graph Convolutional Networks: Background - Convolutional Neural Networks
     + Great success with computer vision-based applications
     + Some advantages of CNNs:
       + Computational efficiency (~O(V + E))
       + Fixed number of parameters (independent of input size)
       + Localisation: acts on a local neighborhood
       + Learns the importance of different neighbors
     + Images have a highly regular connectivity pattern:
       + each pixel is "connected" to its eight neighboring pixels
       + convolving a kernel matrix across the "nodes" is trivial (see the sketch below)
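As a reminder of how trivial the grid case is, here is a minimal sketch of convolving a kernel across an image with SciPy; the image and the averaging kernel are arbitrary placeholders:

```python
import numpy as np
from scipy.signal import convolve2d

# On an image, every interior pixel has the same 8-neighbor connectivity,
# so "convolving across the nodes" is just sliding a kernel over the grid.
image = np.random.rand(8, 8)              # toy grayscale image
kernel = np.ones((3, 3)) / 9.0            # simple 3x3 averaging filter

smoothed = convolve2d(image, kernel, mode="same")  # same spatial size out
```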

  5. Images as Well-Behaved Graphs
     [figure slide]

  6. Images as Well-Behaved Graphs (cont.)
     Graph Convolutional Networks:
     - Generalizing the convolution operation to arbitrary graph structures is much trickier.
     - We will use some very convenient (and awesome) rules from Signal Processing and Spectral Graph Theory.
     - Next we'll discuss recent advances improving performance and computational efficiency (Semi-Supervised Classification with Graph Convolutional Networks, Kipf & Welling 2016).

  7. Graph Convolutional Networks
     Before we learn the intuitions, let's first look at what a simple GCN may look like.
     The graph-based convolution:
     $Z = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} X \Theta$
     Add in a Nonlinear Activation
     Graph structure is encoded directly into the neural network model by incorporating the adjacency matrix: $f(X, A)$. We can do so by wrapping the above equation in a nonlinear activation function:
     $H^{(l+1)} = \sigma\left(\tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} H^{(l)} W^{(l)}\right)$
     where $W^{(l)}$ is a layer-specific trainable weight matrix, $\sigma(\cdot)$ is an activation function, $H^{(l)} \in \mathbb{R}^{N \times F^{(l)}}$ is the matrix of activations from the $l$-th layer, and $H^{(0)} = X$.
     A two-layer Graph Convolutional Network may look something like
     $Z = f(X, A) = \mathrm{softmax}\left(\hat{A}\, \mathrm{ReLU}(\hat{A} X W^{(0)})\, W^{(1)}\right)$
     where $\hat{A} = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}$ is precalculated, with $\tilde{A} = A + I$ the adjacency matrix with self-loops and $\tilde{D}$ its degree matrix.
     So... where did this implementation come from?? (A minimal sketch of the forward pass follows below.)
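A minimal NumPy sketch of that two-layer forward pass, assuming made-up shapes and untrained random weights (no training loop, no sparse matrices):

```python
import numpy as np

def gcn_forward(A, X, W0, W1):
    """Two-layer GCN forward pass: softmax(Â ReLU(Â X W0) W1)."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)                       # Ã = A + I (self-loops)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_tilde.sum(axis=1)))
    A_hat = d_inv_sqrt @ A_tilde @ d_inv_sqrt     # Â, precalculated once

    H1 = np.maximum(A_hat @ X @ W0, 0.0)          # ReLU(Â X W0)
    logits = A_hat @ H1 @ W1                      # Â H1 W1
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)       # row-wise softmax

# Toy shapes: 4 nodes, 3 input features, 8 hidden units, 2 classes.
A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]], dtype=float)
X = np.random.rand(4, 3)
W0, W1 = np.random.randn(3, 8), np.random.randn(8, 2)
Z = gcn_forward(A, X, W0, W1)   # (4, 2): a class distribution per node
```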

  8. 1. Generalizing the Convolution
     A convolution operation combines one function $f$ with another function $g$ such that the first function is transformed by the other.
     In Convolutional Neural Networks, the second function $g$ is learned for a given data set so that the transformations on $f$ are meaningful w.r.t. some class values.
     E.g., a set of pixels showing a dog, $f_d$, may be transformed to values near 0, while a set of pixels showing a cat, $f_c$, may be transformed to values near 1.
     Let's see how we might do this for our graph-based data...
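In one dimension this is just the familiar discrete convolution; a small sketch, where the signal and the hand-picked filter are illustrative stand-ins for what a CNN would learn:

```python
import numpy as np

f = np.array([0., 0., 1., 1., 1., 0., 0.])    # a toy signal
g = np.array([1., -1.])                        # a hand-picked filter

transformed = np.convolve(f, g, mode="same")   # f transformed by g:
# nonzero responses mark where f changes, i.e. what this g "detects"
```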

  9. Graph Convolutional Networks: Signal Processing
     With the nodes of the graph representing individual examples from the dataset, and some data at each node, e.g.:
     - scalar intensity values at each pixel of an image, or
     - feature vectors for higher-dimensional data,
     we can consider the data values as signals on each node.

  10. Graph Convolutional Networks: Signal Processing (cont.)
     - The changing of the signals across the edges of the graph resembles the fluctuation of a signal over time.
     - Borrowing from signal processing, these oscillations can be characterized by their component frequencies, via the Fourier transform (a one-dimensional example follows below).
     - It just so happens that graphs have their own version of the FT...
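For reference, the classical one-dimensional case the slide is alluding to; the signal is a made-up mixture of two sinusoids:

```python
import numpy as np

# A time-domain signal decomposed into its component frequencies,
# the classical analogue of what we are about to do on a graph.
t = np.linspace(0.0, 1.0, 128, endpoint=False)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)

coeffs = np.fft.rfft(signal)                 # Fourier coefficients
freqs = np.fft.rfftfreq(128, d=1.0 / 128)    # their frequencies in Hz
# |coeffs| peaks at 3 Hz and 10 Hz, the two components mixed in above
```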

  11. Graph Convolutional Networks: From Spectral Graph Theory...
     The Normalized Graph Laplacian:
     $L = I - D^{-1/2} A D^{-1/2}$
     where $I$ is the identity matrix, $D$ is the diagonal degree matrix, and $A$ is the weighted adjacency matrix.
     The Normalized Graph Laplacian in the Spectral Domain:
     - $L$ is a real symmetric positive semidefinite matrix: $z^T L z \ge 0$ for all vectors $z$.
     - It has a complete set of orthonormal eigenvectors $\{u_l\}_{l=0}^{n-1} \in \mathbb{R}^n$, a.k.a. the graph Fourier modes,
     - and the associated ordered real nonnegative eigenvalues $\{\lambda_l\}_{l=0}^{n-1}$, the frequencies of the graph.
     - $L$ is diagonalized in the Fourier basis $U = [u_0, u_1, \ldots, u_{n-1}] \in \mathbb{R}^{n \times n}$ such that $L = U \Lambda U^T$, where $\Lambda = \mathrm{diag}([\lambda_0, \lambda_1, \ldots, \lambda_{n-1}]) \in \mathbb{R}^{n \times n}$.
     - I.e., $L = I - D^{-1/2} A D^{-1/2} = U \Lambda U^T$.
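A quick numerical check of these properties on a toy graph (the graph is made up; `np.linalg.eigh` is used because $L$ is symmetric):

```python
import numpy as np

A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]], dtype=float)
d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(4) - d_inv_sqrt @ A @ d_inv_sqrt   # L = I - D^{-1/2} A D^{-1/2}

lam, U = np.linalg.eigh(L)        # real eigenvalues, ascending order
assert np.all(lam >= -1e-10)      # positive semidefinite: all λ_l ≥ 0
assert np.allclose(U @ np.diag(lam) @ U.T, L)  # L = U Λ Uᵀ
# the columns of U are the orthonormal graph Fourier modes
```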

  12. The Normalized Graph Laplacian in the Spectral Domain (recap):
     $L = I - D^{-1/2} A D^{-1/2} = U \Lambda U^T$
     The Graph Fourier Transform
     The Fourier transform in the graph domain, where the eigenvectors denote the Fourier modes and the eigenvalues denote the frequencies of the graph, is defined for $x \in \mathbb{R}^n$ as
     $\hat{x} = U^T x$
     As on Euclidean spaces, this transform enables the formulation of fundamental operations. E.g., the convolution operation becomes multiplication, and we can define a convolution of a data vector $x$ with a filter $g_\Theta$ as
     $g_\Theta * x = U\left((U^T g_\Theta) \odot (U^T x)\right) = g_\Theta(U \Lambda U^T)\, x$
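Continuing the toy example, a spectral convolution is just: transform into the Fourier basis, multiply, transform back. The low-pass filter `g_theta` here is an arbitrary choice for illustration:

```python
import numpy as np

A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]], dtype=float)
d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(4) - d_inv_sqrt @ A @ d_inv_sqrt

lam, U = np.linalg.eigh(L)          # Fourier modes U, frequencies lam
x = np.random.rand(4)               # a signal: one value per node

x_hat = U.T @ x                     # graph Fourier transform: x̂ = Uᵀ x
g_theta = np.exp(-2.0 * lam)        # an example low-pass filter g_Θ(Λ)
filtered = U @ (g_theta * x_hat)    # U g_Θ(Λ) Uᵀ x: back to node domain
```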

  13. The Graph Fourier Transform (recap):
     $g_\Theta * x = U\left((U^T g_\Theta) \odot (U^T x)\right) = g_\Theta(U \Lambda U^T)\, x$
     Signal Processing
     $g_\Theta * x = U\, g_\Theta(\Lambda)\, U^T x = U \hat{G} U^T x$, where $\hat{G} = g_\Theta(\Lambda)$ denotes the filter in the spectral domain.
     As computing the eigenspectrum of a matrix can be computationally expensive, we can approximate the Fourier coefficients using a $K$-th order Chebyshev polynomial:
     $g_{\Theta'}(\Lambda) \approx \sum_{k=0}^{K} \Theta'_k\, T_k(\tilde{\Lambda})$
     where $\tilde{\Lambda} = \frac{2}{\lambda_{max}} \Lambda - I$ is the rescaled eigenvalue matrix.
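A sketch of evaluating that truncated expansion on the eigenvalues via the Chebyshev recurrence $T_k(x) = 2x\,T_{k-1}(x) - T_{k-2}(x)$; the eigenvalues and coefficients $\Theta'_k$ here are placeholders (in practice the coefficients are learned):

```python
import numpy as np

lam = np.array([0.0, 0.7, 1.3, 2.0])        # example eigenvalues of L
lam_tilde = 2.0 * lam / lam.max() - 1.0     # rescale into [-1, 1]

theta = [0.5, -0.3, 0.1]                    # K = 2: three coefficients
T_prev, T_curr = np.ones_like(lam_tilde), lam_tilde    # T_0, T_1
g = theta[0] * T_prev + theta[1] * T_curr
for k in range(2, len(theta)):
    # Chebyshev recurrence: T_k = 2 x T_{k-1} - T_{k-2}
    T_prev, T_curr = T_curr, 2 * lam_tilde * T_curr - T_prev
    g += theta[k] * T_curr                  # g_Θ'(Λ̃) ≈ Σ_k Θ'_k T_k(Λ̃)
```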

  14. This lets us define
     $g_\Theta * x \approx \sum_{k=0}^{K} \Theta'_k\, T_k(\tilde{L})\, x$
     with $\tilde{L} = \frac{2}{\lambda_{max}} L - I$.
     The convolution is now $K$-localized, operating only on the nodes a distance of at most $K$ hops away from any given node. (A sketch of this eigendecomposition-free filtering follows below.)
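A minimal sketch of that $K$-localized filtering, working directly with $\tilde{L}$ and matrix-vector products so no eigendecomposition is needed; `theta` and `lam_max` are placeholder values ($\lambda_{max} \le 2$ holds for the normalized Laplacian):

```python
import numpy as np

def cheb_conv(L, x, theta, lam_max=2.0):
    """Approximate g_Θ * x ≈ Σ_k Θ'_k T_k(L̃) x, no eigendecomposition."""
    n = L.shape[0]
    L_tilde = (2.0 / lam_max) * L - np.eye(n)   # L̃ = (2/λ_max) L - I
    Tx_prev, Tx_curr = x, L_tilde @ x           # T_0(L̃) x, T_1(L̃) x
    out = theta[0] * Tx_prev + theta[1] * Tx_curr
    for k in range(2, len(theta)):
        # recurrence: T_k(L̃) x = 2 L̃ (T_{k-1}(L̃) x) - T_{k-2}(L̃) x
        Tx_prev, Tx_curr = Tx_curr, 2 * (L_tilde @ Tx_curr) - Tx_prev
        out += theta[k] * Tx_curr
    return out   # each output entry mixes only nodes ≤ K hops away
```

Each extra multiplication by $\tilde{L}$ reaches one hop further into the graph, which is exactly where the $K$-locality comes from.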
