Deep Learning on Graphs and Manifolds


  1. Deep Learning on Graphs and Manifolds. Yuan YAO, HKUST. Based on slides by Xavier Bresson et al.

  2. Acknowledgement. A follow-up course at HKUST: https://deeplearning-math.github.io/

  3. Non-Euclidean Data? = Graphs/Networks. Also chemistry, physics, social science, communication networks, etc.

  4. Graphs and Manifolds. [Figure: examples of graphs; examples of manifolds]

  5. Social Networks as Graphs, with Features on Edges and Vertices. Edge features: friendship, frequency, ...; vertex features: gender, age, .... The graph gives the domain structure; the features are data on that domain.

  6. Graphs and Manifolds may vary: community detection, 3D shapes (different manifolds), molecule graphs.

  7. Challenges: What geometric structure in images, speech, video, and text is exploited by CNNs? How can such structure be leveraged on non-Euclidean domains?

  8. Convolutional Networks on a Euclidean Domain (e.g. LeNet for Images). An architecture for high-dimensional learning. Curse of dimensionality: dim(image) = 1024 x 1024 ≈ 10^6; for N = 10 samples/dim, covering the input space would require on the order of 10^1,000,000 points. ConvNets are powerful at solving high-dimensional learning problems.
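
To make the curse-of-dimensionality figure concrete, here is a back-of-the-envelope sketch (my own illustration, not part of the slides); the 1024 x 1024 image and 10 samples per dimension follow the slide, and the script only reports base-10 exponents because the actual count is unwritable:

```python
import math

d = 1024 * 1024          # input dimension of a 1024 x 1024 grayscale image, ~10^6
samples_per_dim = 10     # samples per dimension, as on the slide

# Covering the input space at this resolution needs samples_per_dim ** d points.
# The exact exponent is 1,048,576, i.e. roughly the 10^1,000,000 quoted on the slide.
exponent = d * math.log10(samples_per_dim)
print(f"dim(image) = 10^{math.log10(d):.1f}")
print(f"grid points = 10^{exponent:,.0f}")
```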

  9. ConvNets on Euclidean Domains. Main assumption: data (image, video, sound) is compositional; it is formed of patterns that are local, stationary, and multi-scale (hierarchical). ConvNets leverage this compositional structure: they extract compositional features and feed them to a classifier, recommender, etc. (end-to-end). Applications: computer vision, NLP, drug discovery, games.

  10. Key Property: Locality. A property inspired by the human visual cortex: local receptive fields (Hubel, Wiesel 1962) activate in the presence of local features. See also the Neocognitron (Fukushima 1980).

  11. Key Property: Stationarity (Invariance). Stationarity ⇔ translation invariance (global invariance). Local stationarity ⇔ similar patches are shared across the data domain; local invariance is essential for intra-class variations.
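
To spell out what stationarity buys, the snippet below checks that convolution commutes with translation, i.e. the same filter response simply shifts when the input shifts. This is a minimal sketch of my own, not from the slides; circular convolution (via the FFT) is used so the identity holds exactly at the boundaries:

```python
import numpy as np

def circ_conv(x, w):
    """Circular convolution of a 1-D signal x with a filter w (illustrative helper)."""
    return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(w, len(x)), len(x))

x = np.random.randn(32)   # toy 1-D "image"
w = np.random.randn(5)    # compact filter
s = 3                     # translation by s samples

# Translation equivariance: filtering then shifting == shifting then filtering.
assert np.allclose(np.roll(circ_conv(x, w), s), circ_conv(np.roll(x, s), w))
```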

  12. Key Property: Multiscale Representation. Multi-scale: simple structures combine to compose slightly more abstract structures, and so on, in a hierarchical way. Inspired by the primary visual cortex (V1 and V2 neurons). Features learned by ConvNets become increasingly complex at deeper layers (Zeiler, Fergus 2013).

  13. How to avoid the curse of dimensionality? Locality: compact-support kernels ⇒ O(1) parameters per filter. Stationarity: convolutional operators ⇒ O(n log n) in general (FFT) and O(n) for compact kernels. Multi-scale: downsampling + pooling ⇒ O(n).
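
The complexity claims can be checked with a small sketch (my own illustration, with arbitrary sizes): a compact kernel of fixed size k gives an O(n·k) = O(n) direct convolution, while a general filter can be applied in O(n log n) with the FFT. Both routes give the same result:

```python
import numpy as np

n, k = 2**14, 5              # signal length and compact kernel size (illustrative)
x = np.random.randn(n)
w = np.random.randn(k)

# O(n * k), i.e. O(n) for fixed k: direct convolution with a compact-support kernel.
y_direct = np.convolve(x, w, mode="same")

# O(n log n): the same convolution computed via the FFT (useful for non-compact filters).
m = n + k - 1
y_fft = np.fft.irfft(np.fft.rfft(x, m) * np.fft.rfft(w, m), m)
y_fft = y_fft[(k - 1) // 2 : (k - 1) // 2 + n]   # crop to 'same' alignment

assert np.allclose(y_direct, y_fft)
```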

  14. Implementation: Compositional Maps. Notation: $f_l$ = l-th input image feature (R, G, B channels), dim($f_l$) = $n \times 1$; $g_l^{(k)}$ = l-th feature map at layer k, dim($g_l^{(k)}$) = $n^{(k)} \times 1$. Compositional features consist of multiple convolutional + pooling layers. Convolutional layer: $g_l^{(k)} = \xi\Big( \sum_{l'=1}^{q_{k-1}} W_{l,l'}^{(k)} \star g_{l'}^{(k-1)} \Big)$, applied recursively down to the input features (with $g_{l'}^{(0)} = f_{l'}$); activation, e.g. $\xi(x) = \max\{x, 0\}$ (rectified linear unit, ReLU). Pooling: $g_l^{(k)}(x) = \big\| \{\, g_l^{(k-1)}(x') : x' \in \mathcal{N}(x) \,\} \big\|_p$ with $p = 1, 2,$ or $\infty$.
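
As a concrete (and deliberately naive) reading of the formulas above, the sketch below implements one convolutional layer followed by pooling in 1-D NumPy. The sizes and names (conv_layer, max_pool, the 3-channel toy input) are hypothetical choices for illustration, not the slide's own implementation:

```python
import numpy as np

def relu(x):
    """Activation xi(x) = max{x, 0}."""
    return np.maximum(x, 0.0)

def conv_layer(g_prev, W):
    """One layer g_l = relu( sum_{l'} W[l, l'] * g_prev[l'] ), '*' = 1-D convolution.
    g_prev: (q_prev, n) feature maps; W: (q, q_prev, kernel) compact filters."""
    q, q_prev, _ = W.shape
    g = np.zeros((q, g_prev.shape[1]))
    for l in range(q):
        for lp in range(q_prev):
            g[l] += np.convolve(g_prev[lp], W[l, lp], mode="same")
    return relu(g)

def max_pool(g, size=2):
    """Pooling over non-overlapping neighborhoods N(x) (the p = infinity case)."""
    n = (g.shape[1] // size) * size
    return g[:, :n].reshape(g.shape[0], -1, size).max(axis=2)

# Toy forward pass: 3 input channels (e.g. R, G, B) of length 16, 8 filters of support 5.
f = np.random.randn(3, 16)
W1 = 0.1 * np.random.randn(8, 3, 5)
g1 = max_pool(conv_layer(f, W1))
print(g1.shape)   # (8, 8)
```

A real implementation would of course use an optimized framework rather than explicit Python loops; the point here is only to mirror the layer formula line by line.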

  15. Summary of ConvNets. Filters localized in space (locality). Convolutional filters (stationarity). Multiple layers (multi-scale). O(1) parameters per filter (independent of the input image size n). O(n) complexity per layer (filtering done in the spatial domain).

  16. Generalization to ConvNets on Graphs? How to extend ConvNets to graph-structured data? Assumption: non-Euclidean data is locally stationary and manifests hierarchical structures. How to define compositionality on graphs (convolution and pooling on graphs)? How to make them fast (linear complexity)?
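
Before the next talks, here is one minimal, hedged illustration of what "convolution on graphs" can mean. The sketch uses the simple GCN-style propagation rule H' = relu(D^{-1/2}(A + I)D^{-1/2} H W) (Kipf and Welling), which is only one of several constructions; spectral filters of the graph Laplacian (e.g. ChebNet, covered in Bresson's talk) are another. The graph, sizes, and variable names are made up for the example:

```python
import numpy as np

def graph_conv(A, H, W):
    """GCN-style graph convolution: H' = relu( D^{-1/2} (A + I) D^{-1/2} H W ).
    A: (n, n) adjacency matrix; H: (n, c_in) node features; W: (c_in, c_out) weights."""
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))   # D^{-1/2} as a vector
    A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy graph: 4 nodes on a path, 3-dim input features, 2 output channels.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 3)
W = np.random.randn(3, 2)
print(graph_conv(A, H, W).shape)   # (4, 2)
```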

  17. Next: Prof. Xavier Bresson (NTU), IPAM talk on Convolutional Neural Networks on Graphs, https://www.youtube.com/watch?v=v3jZRkvIOIM; Prof. Zhizhen ZHAO (UIUC), seminar on Multi-Scale and Multi-Representation Learning on Graphs and Manifolds.

  18. Thank you!
