Geometric
going beyond Euclidean data
Deep Learning
Michael Bronstein
Imperial College London / Twitter
Perceptron (Rosenblatt 1957): the simplest neural network
$y = \mathrm{sign}\Big(\sum_{i=1}^{d} w_i x_i\Big)$
Two-layer perceptron:
$f(x) = \sum_j c_j \,\mathrm{sign}(a_j^\top x + b_j)$
Pairs of units $c\,\mathrm{sign}(a^\top x + b) - c\,\mathrm{sign}(a'^\top x + b')$ build localized bumps.
Universal approximation: Cybenko 1989; Hornik 1991
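A minimal numpy sketch of the two-layer perceptron form above, with hard `sign` units; the particular weights, thresholds, and the 1-D toy input are illustrative placeholders, not values from the talk:

```python
import numpy as np

def two_layer_perceptron(x, A, b, c):
    """f(x) = sum_j c_j * sign(a_j . x + b_j) -- the two-layer
    perceptron form behind the universal-approximation results."""
    return c @ np.sign(A @ x + b)

# Toy 1-D example: two units with thresholds at 0.25 and 0.5
# produce a small staircase function of the scalar input.
A = np.array([[1.0], [1.0]])
b = np.array([-0.5, -0.25])
c = np.array([0.5, 0.5])
y_hi = two_layer_perceptron(np.array([0.9]), A, b, c)
y_lo = two_layer_perceptron(np.array([0.1]), A, b, c)
```

In practice the discontinuous `sign` is replaced by a smooth sigmoid or ReLU so the network can be trained by gradient descent.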
Curse of dimensionality: volume of the ball inscribed in the unit hypercube,
$V_d = \frac{\pi^{d/2}}{\Gamma\!\big(\frac{d}{2}+1\big)\,2^d}$
$V_2 \approx 0.785$, $V_3 \approx 0.52$, $V_8 \approx 0.0159$: the volume vanishes exponentially with the dimension $d$.
(Figure: Vision Dummy)
Consequently, $\mathcal{O}(\epsilon^{-d})$ samples are needed to approximate a continuous function $f: \mathbb{R}^d \to \mathbb{R}$ with $\epsilon$ accuracy.
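A short stdlib check of the inscribed-ball volume formula, reproducing the numbers quoted on the slide:

```python
import math

def ball_volume(d, r=0.5):
    """Volume of a d-ball of radius r:
    pi^(d/2) / Gamma(d/2 + 1) * r^d.
    r = 1/2 gives the ball inscribed in the unit hypercube."""
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1) * r ** d

# V_2 ~ 0.785, V_3 ~ 0.52, V_8 ~ 0.0159 -- shrinking fast with d.
vols = {d: ball_volume(d) for d in (2, 3, 8)}
```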
input image → input vector
Hubel, Wiesel 1962; 1981; Fukushima 1980; LeCun et al. 1989
Multilayer perceptrons: "universal approximators"
CNNs: translation equivariance (LeCun et al. 1989)
G-CNNs: group equivariance (Cohen, Welling 2016)
Non-Euclidean data: social networks, interaction networks, functional networks, meshes, molecules
Graph neural networks: relational inductive biases, graph representation learning
Manifolds and graphs
“Differential geometry and graph theory […] are insufficiently known in the signal processing community. One of our goals is to provide an accessible overview of these models”
Images:
§ Constant number of neighbors
§ Fixed ordering of neighbors
§ Shift invariance
Graphs:
§ Different number of neighbors
§ No ordering of neighbors
§ Permutation invariance
Pooling? Convolution? Efficient computation?
Dense linear layer: $y_i = \theta_{i,1} x_1 + \cdots + \theta_{i,n} x_n$
Local (convolutional) layer: $y_i = \theta_{i,i-1} x_{i-1} + \theta_{i,i} x_i + \theta_{i,i+1} x_{i+1}$
Shift operator $S$: $(Sx)_i = x_{i-1}$; convolution commutes with the shift operator.
A convolution matrix is circulant and is diagonalized by the discrete Fourier basis:
$C(w) = \Phi\,\mathrm{diag}(\hat{w}_0, \ldots, \hat{w}_{n-1})\,\Phi^*$
where the columns of $\Phi$ are $\phi_k = \frac{1}{\sqrt{n}}\big(1,\, e^{2\pi i k/n},\, \ldots,\, e^{2\pi i k(n-1)/n}\big)^\top$
$\Phi^*$ = forward DFT, $\Phi$ = inverse DFT
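A small numpy verification that a circulant matrix is diagonalized by the DFT basis, with the DFT of its kernel on the diagonal; the kernel is random and the size `n = 8` is arbitrary:

```python
import numpy as np

n = 8
c = np.random.default_rng(0).normal(size=n)
# Circulant matrix: row i is the kernel c cyclically shifted by i.
C = np.array([np.roll(c, i) for i in range(n)])
# Columns of Phi are phi_k = n^{-1/2} (1, e^{2 pi i k/n}, ...):
# the inverse-DFT basis, rescaled to be unitary.
Phi = np.fft.ifft(np.eye(n), axis=0) * np.sqrt(n)
# Phi* C Phi should come out diagonal.
D = np.conj(Phi).T @ C @ Phi
off_diag = D - np.diag(np.diag(D))
```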
Convolution theorem: $x \star w = \Phi\,\mathrm{diag}(\hat{w}_0, \ldots, \hat{w}_{n-1})\,\Phi^* x$
Compute the DFT of $x$, take the element-wise product with $\hat{w}$, apply the inverse DFT: $\mathcal{O}(n \log n)$ using the FFT.
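The FFT route can be cross-checked against the $\mathcal{O}(n^2)$ definition of circular convolution; the signals here are arbitrary random vectors:

```python
import numpy as np

def circular_conv(x, w):
    """Convolution theorem: DFT both signals, multiply
    element-wise, inverse DFT -- O(n log n) with the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(w)))

# Naive O(n^2) definition: (x * w)_i = sum_j x_j w_{(i-j) mod n}.
rng = np.random.default_rng(1)
x, w = rng.normal(size=16), rng.normal(size=16)
naive = np.array([sum(x[j] * w[(i - j) % 16] for j in range(16))
                  for i in range(16)])
fast = circular_conv(x, w)
```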
Graph: vertices $\mathcal{V} = \{1, \ldots, n\}$
Edges $\mathcal{E} = \{(i,j) : i, j \in \mathcal{V}\} \subseteq \mathcal{V} \times \mathcal{V}$ (directed); undirected if $(i,j) \in \mathcal{E} \Leftrightarrow (j,i) \in \mathcal{E}$
Neighborhood $\mathcal{N}(i) = \{j : (i,j) \in \mathcal{E}\}$, degree $d_i = |\mathcal{N}(i)|$
Node features $x : \mathcal{V} \to \mathbb{R}^d$, $X = (x_1, \ldots, x_n)^\top$
Edge features $e : \mathcal{E} \to \mathbb{R}^{d'}$, e.g. $e_{ij}$ on edge $(i,j)$
Particular case $w : \mathcal{E} \to \mathbb{R}_+$ (weighted graph)
Adjacency matrix: $a_{ij} = 1 \Leftrightarrow (i,j) \in \mathcal{E}$
"Local difference operator":
$(\Delta x)_i = x_i - \frac{1}{d_i} \sum_{j \in \mathcal{N}(i)} a_{ij} x_j$
Equivalently, $(\Delta x)_i = \frac{1}{d_i} \sum_{j \in \mathcal{N}(i)} a_{ij} (x_i - x_j)$
(normalized) Laplacian matrix $\Delta = D^{-1}(D - A) = I - D^{-1}A$, degree matrix $D = \mathrm{diag}(d_1, \ldots, d_n)$
Dirichlet energy $E(X) = \sum_i \frac{1}{d_i} \sum_{j \in \mathcal{N}(i)} a_{ij} \|x_i - x_j\|^2 = \mathrm{trace}(X^\top \Delta X)$ measures how smooth the signal is on the graph
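A minimal numpy sketch of the random-walk normalized Laplacian $\Delta = I - D^{-1}A$; the 4-node example graph is an illustrative placeholder:

```python
import numpy as np

def normalized_laplacian(A):
    """Random-walk normalized Laplacian Delta = I - D^{-1} A
    for an adjacency matrix A (assumes no isolated nodes)."""
    d = A.sum(axis=1)
    return np.eye(len(A)) - A / d[:, None]

# Triangle plus a pendant node: edges 0-1, 1-2, 2-0, 2-3.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 0), (2, 3)]:
    A[i, j] = A[j, i] = 1
L = normalized_laplacian(A)
# Constant signals are maximally smooth: Delta 1 = 0.
resid = L @ np.ones(4)
```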
Laplace-Beltrami operator (Laplace 1787; Beltrami 1902): $\Delta f = -\mathrm{div}(\nabla f)$ on a manifold $\mathcal{M}$
Cotangent Laplacian on meshes (Duffin 1959; Pinkall, Polthier 1993; Desbrun et al. 1999):
$(\Delta x)_i = \frac{1}{a_i} \sum_{j \in \mathcal{N}(i)} \frac{\cot \alpha_{ij} + \cot \beta_{ij}}{2} (x_i - x_j)$
where $a_i$ is the local area element and $\alpha_{ij}, \beta_{ij}$ are the angles opposite edge $(i,j)$.
local aggregation on adjacent nodes with shared parameters
Graph Fourier transform
Ortega et al. 2017
Laplacian eigendecomposition $\Delta = \Phi \Lambda \Phi^\top$ as the analogy of the Fourier transform: the symmetric Laplacian is always diagonalizable, but its eigenvectors differ from graph to graph. Non-symmetric shift operators may require a more elaborate analysis using the Jordan decomposition/generalized eigenvectors.
Taubin 1995; Karni, Gotsman 2000; Levy 2006; B et al. 2010-2014 Shuman et al. 2013; Sandryhaila, Moura 2013
First eigenvectors of the ring graph Laplacian = classical Fourier basis
First eigenvectors of the Minnesota road network Laplacian
Spectral convolution: $x \star w = \Phi\,\mathrm{diag}(\hat{w}_1, \ldots, \hat{w}_n)\,\Phi^\top x$
To compute $x \star w$: forward transform $\hat{x} = \Phi^\top x$, element-wise product $\hat{x} \circ \hat{w}$, inverse transform $x \star w = \Phi(\hat{x} \circ \hat{w})$.
Cost: $\mathcal{O}(n^2)$ vs $\mathcal{O}(n)$ or $\mathcal{O}(n \log n)$ on grids; $\mathcal{O}(n)$ filter parameters vs $\mathcal{O}(1)$.
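A small numpy sketch of spectral graph convolution via the Laplacian eigenbasis; the 5-node path graph and the identity filter $\hat{w} \equiv 1$ are toy choices for illustration:

```python
import numpy as np

def spectral_conv(x, w_hat, Phi):
    """x * w = Phi diag(w_hat) Phi^T x: forward graph Fourier
    transform, element-wise product, inverse transform."""
    return Phi @ (w_hat * (Phi.T @ x))

# Orthonormal eigenbasis of the (symmetric, unnormalized)
# Laplacian of a 5-node path graph.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(A.sum(1)) - A
lam, Phi = np.linalg.eigh(L)
x = np.random.default_rng(2).normal(size=5)
# All-pass filter (w_hat = 1 at every frequency) returns x.
y = spectral_conv(x, np.ones(5), Phi)
```

Any other transfer function, e.g. a low-pass `w_hat = (lam < 1).astype(float)`, drops into the same routine.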
Original signal $x$; filter output $x \star w = \Phi\,\mathrm{diag}(\hat{w}_1, \ldots, \hat{w}_n)\,\Phi^\top x$
(Figures: outputs of the same spectral filter on a grid, a graph, and a mesh)
Anisotropic spectral filters on meshes
Boscaini et B 2016
Spectral transfer function $\tau$: $\tau(\Delta)\,x = \Phi\,\mathrm{diag}\big(\tau(\lambda_1), \ldots, \tau(\lambda_n)\big)\,\Phi^\top x$
$\tau(\Delta)$ can be applied directly to $\Delta$, avoiding explicit computation of $\Phi$.
Levie et B 2019; Levie, Monti et B 2018
Polynomial spectral filters (Defferrard et al. 2016):
$\tau(\lambda) = \theta_0 + \theta_1 \lambda + \cdots + \theta_p \lambda^p$
$\tau(\Delta) = \theta_0 I + \theta_1 \Delta + \cdots + \theta_p \Delta^p$, where $\Delta$ has $\mathcal{O}(|\mathcal{E}|)$ non-zeros
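A numpy sketch of applying a polynomial filter with repeated mat-vecs only, so a sparse $\Delta$ costs $\mathcal{O}(p\,|\mathcal{E}|)$ and no eigendecomposition is needed; the 4-cycle Laplacian and the coefficients are toy placeholders:

```python
import numpy as np

def poly_filter(theta, L, x):
    """tau(L) x = theta_0 x + theta_1 L x + ... + theta_p L^p x,
    accumulated with one mat-vec per degree. With a sparse L this
    is O(p |E|); no eigendecomposition of L is required."""
    y, Lx = theta[0] * x, x
    for t in theta[1:]:
        Lx = L @ Lx            # next power of L applied to x
        y = y + t * Lx
    return y

# Unnormalized Laplacian of a 4-cycle: L = 2I - A.
A = np.roll(np.eye(4), 1, axis=1) + np.roll(np.eye(4), -1, axis=1)
L = np.diag(A.sum(1)) - A
x = np.array([1.0, 0.0, 0.0, 0.0])
y = poly_filter([0.5, 0.3, 0.2], L, x)
```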
Applying the filter amounts to repeated diffusion: $y = \theta_0 x + \theta_1 \Delta x + \cdots + \theta_p \Delta^p x$
Sandryhaila, Moura 2013
Graph convolutional networks: node-wise features → graph diffusion → node-wise transformation
Kipf, Welling 2016
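A minimal numpy sketch of one GCN-style layer in the spirit of Kipf & Welling 2016 (symmetric-normalized diffusion with self-loops, then a shared node-wise linear map and ReLU); the toy graph, features, and weights are illustrative placeholders:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN propagation step:
    A_hat = D^{-1/2} (A + I) D^{-1/2}, then H = ReLU(A_hat X W)."""
    A_tilde = A + np.eye(len(A))            # add self-loops
    d = A_tilde.sum(axis=1)
    A_hat = A_tilde / np.sqrt(d[:, None] * d[None, :])
    return np.maximum(A_hat @ X @ W, 0.0)   # diffuse, transform, ReLU

# Toy graph: 3 nodes in a path, 2-d features, identity weights.
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
X = np.array([[1., 0.], [0., 1.], [1., 1.]])
W = np.eye(2)
H = gcn_layer(A, X, W)
```

Because every node shares the same `W` and the diffusion respects the graph, relabeling the nodes just relabels the output: the layer is permutation equivariant.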
Graph attention (Monti et B 2017; Veličković et al. 2018):
$y_i = \sum_{j \in \mathcal{N}(i)} \alpha_{ij}\, x_j$, attention score $\alpha_{ij} = \dfrac{\exp \xi(x_i, x_j)}{\sum_{k \in \mathcal{N}(i)} \exp \xi(x_i, x_k)}$
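A toy numpy sketch of attention aggregation $y_i = \sum_{j} \alpha_{ij} x_j$ with softmax scores over each neighborhood; the pairwise score $\xi$ is passed in as a callable (here a plain dot product, whereas GAT uses a learned function), and the 3-node graph is a placeholder:

```python
import numpy as np

def attention_aggregate(X, neighbors, xi):
    """y_i = sum_{j in N(i)} alpha_ij x_j, with
    alpha_ij = softmax over N(i) of the scores xi(x_i, x_j)."""
    Y = np.zeros_like(X)
    for i, nbrs in neighbors.items():
        scores = np.array([xi(X[i], X[j]) for j in nbrs])
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                 # softmax over N(i)
        Y[i] = sum(a * X[j] for a, j in zip(alpha, nbrs))
    return Y

# Dot-product score on a 3-node path graph.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
neighbors = {0: [1], 1: [0, 2], 2: [1]}
Y = attention_aggregate(X, neighbors, lambda a, b: a @ b)
```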
General aggregation function (message passing):
$y_i = \bigoplus_{j \in \mathcal{N}(i)} \psi(x_i, x_j, e_{ij})$
Gilmer et al. 2017 (MPNN); Wang et B 2018 (EdgeConv)
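A generic message-passing sketch in numpy: `psi` builds a message per edge and `aggregate` is any permutation-invariant reduction (sum, mean, max). Edge features are omitted for brevity, and the difference message plus mean aggregation below are one EdgeConv-flavoured choice among many:

```python
import numpy as np

def message_passing(X, edges, psi, aggregate):
    """y_i = AGG_{j in N(i)} psi(x_i, x_j): compute a message per
    incident edge, then reduce with a permutation-invariant AGG."""
    n = len(X)
    nbrs = {i: [j for (u, j) in edges if u == i] for i in range(n)}
    return np.stack([
        aggregate([psi(X[i], X[j]) for j in nbrs[i]], axis=0)
        for i in range(n)
    ])

# Directed edges of an undirected 3-node path; mean aggregation
# with a difference message psi(x_i, x_j) = x_j - x_i.
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
X = np.array([[0.0], [1.0], [3.0]])
Y = message_passing(X, edges, lambda xi, xj: xj - xi,
                    lambda msgs, axis: np.mean(msgs, axis=axis))
```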
Images:
§ Convolution
§ Local operations (window)
§ Constant number of neighbours
§ Fixed ordering of neighbours
§ Shift equivariance
§ O(n) complexity
Graphs:
§ Message passing
§ Local operations (1-hop)
§ Different number of neighbours
§ No ordering of neighbours
§ Permutation invariance
§ O(n) complexity
Graph coarsening/pooling: a hierarchy of progressively coarsened graphs $X^{(1)}, X^{(2)}, \ldots$
Ying et al. 2018 (DiffPool)
(Plot: Pau Rodríguez López; Δ between 2020 and 2019 in %)
Vosoughi et al. 2018
Did 'Muslim Migrants' Attack a Catholic Church?
A video of pro-migrant protesters being removed from the Basilica of Saint-Denis in France was shared with the inflammatory and incorrect claim that it shows Muslim immigrants attacking a church.
Monti et B 2019
Graph autoencoder: input graph → graph encoder → node embedding → graph decoder → reconstructed graph
Jet image: LHC
Abdughani et al. 2018
GNN architecture for event graph classification
Ju et al. 2019
Choma et B 2018
ROC curve comparing different methods for neutrino detection Light deposition for a high-energy single muon in IceCube detector
Jin, Chen, He 2019
ROC of classifying light component from the background Graph-structured LHAASO-KM2A detectors activated by a 500-TeV EAS event (red=EM & blue=muon detectors)
Beck et al. 2019
Predicted galaxy redshift from photometric observations using a MoNet-style GNN vs ground-truth spectroscopic measurement
(Figure: "computational funnel" for molecule screening: #candidates narrows from ~10^60 down to ~10^3 synthesizable molecules as the computational cost per candidate grows from ~10^-2 to ~10^5; graph NN screening feeds into DFT/Schrödinger calculations and finally experiment)
Duvenaud et al. 2015; Gilmer et al. 2017; Jin et al. 2020; Stokes et al. 2020
Veselkov et B 2019
Zitnik et al. 2018
Drug-Drug Interaction Protein-Protein Interaction Drug-Protein Interaction
PD-1 PD-L1
Gainza et B 2020
Pipeline: cancer target → predicted interface (interface score) → search of protein database → top match based on interface similarity → predicted complex
Video: FaceShift 2015
Analysis Synthesis
Images: Faceshift
Masci et B 2015
Classical (extrinsic) convolution Geometric (intrinsic) convolution
Masci et B 2015; Monti et B 2017; Litany et B 2017; Verma et al. 2017
Correspondence between 3D shapes
Litany et B 2018; Ranjan et al. 2018; Bouritsas et B 2019; Gong et B 2019; Gong, Basri et B 2020
3D shape calculus in latent space
Kulon et B 2019; Kulon et B 2020
Mixed 2D/3D encoder/decoder model
Kulon et B 2019; Kulon et B 2020
Examples of reconstructed 3D hands in the wild
Video: Ariel AI 2020
(“relational inductive bias”)